Honolulu Star-Advertiser

Friday, May 10, 2024


Top News

For driverless cars, a moral dilemma: Who lives or dies?


ASSOCIATED PRESS

An autonomous vehicle was driven by an engineer on a street through an industrial park last week in Boston.

BOSTON >> Imagine you’re behind the wheel when your brakes fail. As you speed toward a crowded crosswalk, you’re confronted with an impossible choice: veer right and mow down a large group of elderly people, or veer left into a woman pushing a stroller.

Now imagine you’re riding in the back of a self-driving car. How would it decide?

Researchers at the Massachusetts Institute of Technology are asking people worldwide how they think a robot car should handle such life-or-death decisions. Their goal is not just for better algorithms and ethical tenets to guide autonomous vehicles, but to understand what it will take for society to accept the vehicles and use them.

Their findings present a dilemma for car makers and governments eager to introduce self-driving vehicles on the promise that they’ll be safer than human-controlled cars. People prefer a self-driving car to act in the greater good, sacrificing its passenger if it can save a crowd of pedestrians. They just don’t want to get into that car.

“There is a real risk that if we don’t understand those psychological barriers and address them through regulation and public outreach, we may undermine the entire enterprise,” said Iyad Rahwan, an associate professor at the MIT Media Lab. “People will say they’re not comfortable with this. It would stifle what I think will be a very good thing for humanity.”

After publishing research last year surveying U.S. residents, Rahwan and colleagues at the University of Toulouse in France and the University of California, Irvine, are now expanding their surveys and looking at how responses vary in different countries.

They are also using a website created by MIT researchers called the Moral Machine, which allows people to play the role of judging who lives or dies. A jaywalking person or several dogs riding in the driverless car? A pregnant woman or a homeless man?

Preliminary, unpublished research based on millions of responses from more than 160 countries shows broad differences between East and West. Judgments reflecting the utilitarian principle of minimizing total harm above all else are more prominent in the United States and Europe, Rahwan said.

The thought experiment is familiar to ethicists, who have grappled with such dilemmas since British philosopher Philippa Foot in the 1960s presented a similar question about a runaway trolley. But it’s too unrealistic for some researchers focused on how the vehicles act in ordinary situations.

Just 5 miles from MIT’s Media Lab in Cambridge, the first self-driving car to roll out on Massachusetts public roads began testing this month in Boston’s Seaport District.

“We approach the problem from a bit more of a practical, engineering perspective,” said NuTonomy CEO Karl Iagnemma, whose Cambridge-based company has also piloted self-driving taxis in Singapore.

Iagnemma said the study’s moral dilemmas are “vanishingly rare.” Designing a safe vehicle, not a “sophisticated ethical creature,” is the focus of his engineering team as they tweak the software that guides their electric Renault Zoe past Boston snowbanks.

“When a driverless car looks out on the world, it’s not able to distinguish the age of a pedestrian or the number of occupants in a car,” Iagnemma said. “Even if we wanted to imbue an autonomous vehicle with an ethical engine, we don’t have the technical capability today to do so.”

Focusing too much on the stark “trolley problem” risks marginalizing the study of how best to address self-driving ethics, said Noah Goodall, a scientist at the Virginia Transportation Research Council. Engineers already program cars to make moral choices, such as when they slow down and leave space after detecting a bicyclist.

“All these cars do risk management. It just doesn’t look like a trolley problem,” Goodall said.
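Goodall’s point can be sketched in a few lines of code. The model below is purely illustrative and not any manufacturer’s actual software; the collision-probability and severity functions are made-up assumptions chosen only to show the shape of the trade-off: every speed the car picks implicitly weighs the chance of hitting a detected hazard (such as a cyclist) against how bad a collision at that speed would be, plus the cost of driving slowly.

```python
def expected_harm(speed_mps, hazard_distance_m):
    """Toy model: collision probability falls with distance and rises with
    speed; severity scales roughly with kinetic energy (speed squared).
    These functions are illustrative assumptions, not real vehicle models."""
    p_collision = min(1.0, speed_mps / (hazard_distance_m + 1.0))
    severity = speed_mps ** 2
    return p_collision * severity


def choose_speed(hazard_distances_m, candidate_speeds_mps, delay_weight=1.0):
    """Pick the candidate speed that minimizes total expected harm across
    all detected hazards, balanced against the cost of going slowly."""
    v_max = max(candidate_speeds_mps)

    def total_cost(v):
        harm = sum(expected_harm(v, d) for d in hazard_distances_m)
        return harm + delay_weight * (v_max - v)  # slower driving costs time

    return min(candidate_speeds_mps, key=total_cost)
```

With no hazards detected, the cheapest choice is the fastest speed; a cyclist a few meters ahead pushes the chosen speed down. There is no trolley-style either/or anywhere in the logic, yet the function is still trading off risk to others continuously, which is Goodall’s point.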

Rahwan agrees with self-driving enthusiasts that freeing vehicles from human error could save many lives. But he worries that progress could be stalled without a new social compact that addresses moral trade-offs.

Current traffic laws and human behavioral norms have created “trust that this entire system functions in a way that works in our interests, which is why we’re willing to fit into large pieces of metal moving at high speeds,” Rahwan said.

“The problem with the new system is that it has a very distinctive feature: algorithms are making decisions that have very important consequences on human life,” he said.

4 responses to “For driverless cars, a moral dilemma: Who lives or dies?”

  1. noheawilli says:

    It’s simple, create a better brake. How many of any live drivers face such dilemmas? Just make a better brake.

  2. livinginhawaii says:

    Here’s one – so what if you decide to mow down the elderly people then find out the woman with the stroller is Johnny Knoxville in drag filming an episode? Put that one through the moral machine.

  3. tigerwarrior says:

    Here’s another scenario: Say you’re in a driverless car entering an intersection while the light is green. In the meantime, there is a driver entering the same intersection perpendicular to your car driving through a red light at 50 miles per hour. Will the driverless car have the “presence of mind” to stop on a dime, stopping long before that vehicle approaches despite having the right of way, or will it only stop when the vehicle running the red light comes a bit closer, which may be too late. I’ve been in a situation where the light was green when I noticed from the side of my eye as well as my ears, a driver of a car a city block away, gunning his engine, going twice the speed limit, then barreling through a pure red light at 50 mph. I refused to enter the intersection even though I clearly had the right of way. Would a driverless car be inclined to do the same?

    • Waterman2 says:

      Who is responsible when one sensor fails and the car goes through a crowded crosswalk? In a normal car it would be the driver. So in the driverless car, is it the manufacturer? The owner? The contracted company supplying the sensors to the manufacturer?
