Self-driving cars raise safety, ethical concerns

Editor’s note: this story pairs with “Utah: the perfect place for self-driving cars”

A self-driving Uber car struck and killed a woman March 19 in Arizona, raising questions about the safety and ethics of autonomous vehicles. The crash is believed to be the first known pedestrian fatality involving a self-driving vehicle.

Autonomous vehicles are predicted to be available in the next decade and the standard for most new vehicles by 2060, according to a Governors Highway Safety Association report. A 2016 Utah Department of Transportation report called these changes in transportation “the most disruptive changes since the invention of the automobile.”

Companies such as Google, Uber, Tesla, Toyota and General Motors are racing to create the first autonomous vehicle produced on a mass scale.

But to demonstrate their reliability, autonomous vehicles would need to drive more than 100 million miles, which could take over 100 years, according to a ScienceDirect journal article.
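How long that takes depends heavily on how many vehicles are testing. A back-of-envelope calculation shows the scale; the fleet sizes and average speed below are illustrative assumptions, not figures from the article:

```python
# Rough estimate of how long fleets of different sizes would need to
# accumulate 100 million miles. All inputs are illustrative guesses.
TARGET_MILES = 100_000_000
AVG_SPEED_MPH = 25  # assumed average speed, including stops

for fleet_size in (1, 10, 100):
    miles_per_year = fleet_size * AVG_SPEED_MPH * 24 * 365
    years = TARGET_MILES / miles_per_year
    print(f"{fleet_size:>3} car(s): {miles_per_year:>12,.0f} mi/yr -> {years:8,.1f} years")
```

Under these assumptions, a single car driving nonstop would need more than 450 years; showing that a fleet is statistically safer than human drivers, whose fatal crashes are rare per mile driven, would require even more miles.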

However, the National Highway Traffic Safety Administration website says autonomous cars have the “potential to save lives and reduce injuries” because “94 percent of serious crashes are due to human error.”  

“Fully automated vehicles that can see more and act faster than human drivers could greatly reduce errors, the resulting crashes and their toll,” the website says.

Utah Department of Transportation technology and innovation engineer Blaine Leonard has been involved with autonomous vehicle technology for more than three years. When asked about the safety of mixing autonomous vehicles and manually driven cars on the same road, he said the mix is already happening.

Autonomous vehicles are categorized on a scale from level zero to level five, with level zero being no automation and level five being full self-driving that requires no human driver and can even operate without occupants. Leonard said a Tesla, with its on-board sensors and lane positioning, would be considered a level two.
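The scale Leonard describes is the SAE's six-level taxonomy. As a compact reference, the descriptions below are paraphrased summaries, not official SAE wording:

```python
# Paraphrased summary of the SAE J3016 driving automation levels.
SAE_LEVELS = {
    0: "No automation: the human does all the driving.",
    1: "Driver assistance: steering OR speed support, such as cruise control.",
    2: "Partial automation: steering AND speed support; human stays engaged.",
    3: "Conditional automation: car drives, human must take over when asked.",
    4: "High automation: no human needed, but only in a limited domain.",
    5: "Full automation: no human driver needed anywhere, anytime.",
}

print(SAE_LEVELS[2])  # the level Leonard assigns to a Tesla
```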

Leonard said autonomous features tend to be more cautious than human drivers, who are more willing to make risky decisions. In some situations, however, breaking the law can seem necessary, such as crossing a double yellow line to give space to a parked car. An autonomous car programmed to follow all laws would have to stop and wait for the parked car to move before continuing.

“What the autonomous vehicle manufacturers have been doing is trying to mimic human drivers as much as possible, but they generally have a tendency to react a little slower in those kinds of situations and be a little more cautious about nosing into traffic or breaking into a line of cars,” Leonard said.
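The double-yellow-line example can be made concrete with a toy decision rule. The sketch below is hypothetical logic of my own, not any manufacturer's actual planner:

```python
# Contrast between a strictly law-abiding policy and typical human
# behavior when a parked car blocks the lane and passing it requires
# briefly crossing a double yellow line. Hypothetical logic only.

def strict_legal_policy(lane_blocked: bool, must_cross_double_yellow: bool) -> str:
    """Never break a traffic law, even when stuck."""
    if lane_blocked and must_cross_double_yellow:
        return "stop_and_wait"  # the only fully legal option
    return "proceed"

def human_like_policy(lane_blocked: bool, must_cross_double_yellow: bool,
                      oncoming_lane_clear: bool) -> str:
    """Accept a small, technically illegal maneuver when it looks safe."""
    if lane_blocked and must_cross_double_yellow and oncoming_lane_clear:
        return "cross_briefly_and_pass"
    return "stop_and_wait" if lane_blocked else "proceed"

print(strict_legal_policy(True, True))      # stop_and_wait
print(human_like_policy(True, True, True))  # cross_briefly_and_pass
```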

Rep. Robert Spendlove, R-Sandy, sponsored a bill earlier this year that would have allowed autonomous vehicles on Utah roads, but it didn't pass. According to Spendlove, limited testing is currently taking place in Utah on private roads, property and tracks.

The Sustainable Electrified Transportation Center, led by Utah State University, uses a closed-loop test track to test some of its autonomous systems. Executive Director David Christensen said the center has an electric Ford Focus that students have outfitted with a visual reading system, so it can essentially drive itself by identifying the white lines on either side of the track. Christensen said this kind of visual reading is used alongside other technologies in many autonomous vehicles.

“It’s a combination of different types of cues or data sharing of the natural environment it’s operating within, and then a set of algorithms helps (the autonomous vehicle) figure out what it needs to do next,” Christensen said.
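As a rough illustration of the kind of visual reading Christensen describes, here is a minimal lane-line detector using the OpenCV library. It is a generic sketch assuming a forward-facing camera frame, not USU's actual system:

```python
import cv2
import numpy as np

def detect_lane_lines(frame: np.ndarray) -> np.ndarray:
    """Return straight segments likely to be painted lane markings."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)  # strong brightness transitions
    # Keep only the lower half of the image, where the road surface is.
    mask = np.zeros_like(edges)
    mask[edges.shape[0] // 2:, :] = 255
    edges = cv2.bitwise_and(edges, mask)
    # Fit straight line segments to the remaining edge pixels.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=20)
    return lines if lines is not None else np.empty((0, 1, 4), dtype=np.int32)
```

A real vehicle would fuse this camera data with radar, lidar, GPS and map information, in line with the "combination of different types of cues" Christensen describes.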

Christensen said USU is also researching how to prevent hacking into the systems of autonomous vehicles and how to deal with autonomous systems in urban canyons where GPS signals can be blocked.

A unique element of autonomous vehicle safety is that a vehicle's responses to a crash would be programmed in long before any accident actually occurred.

According to a 2016 study, most people approve of automated vehicles that would “sacrifice passengers to save others” and would like others to buy them; however, these same individuals would prefer to ride in automated vehicles that protect their passengers at all costs.

A thought experiment known as the trolley problem illustrates the conundrum of whom to protect.

The trolley problem is a hypothetical situation in which a train is headed toward five people tied to the tracks. Pulling a lever would switch the train to a different set of tracks where one person is tied up. The question is whether it would be more ethical to do nothing and let the train kill five people or to pull the lever and let it kill one.

Eindhoven University of Technology assistant professor Sven Nyholm and doctoral student Jilles Smids explained how autonomous vehicles complicate the ethics of the trolley problem even further in a 2016 article.

Nyholm and Smids gave an adapted scenario in which a truck suddenly appears in the path of an autonomous vehicle carrying five passengers and the only way to save the five is for the car to swerve onto the sidewalk and kill an elderly pedestrian.

According to the article, this situation differs from the trolley problem because in the trolley problem, a single individual must make an immediate decision with limited knowledge. In the autonomous vehicle scenario, the car would have been programmed to respond to this accident by a group of people with virtually unlimited time and resources.

Also, the trolley problem sets aside questions of moral and legal responsibility. With autonomous vehicles, these are important elements to figure out, according to Nyholm and Smids.

“We must treat it as a question of what those who sell or use self-driving cars can be held responsible for and what the society that permits them on its roads must assume responsibility for,” Nyholm and Smids wrote.

Lastly, in the trolley situation the outcomes are certain: either the five people will die or the one person will die. In the autonomous vehicle situation, the risks are estimated and the results are never guaranteed. For example, the truck driver might try to prevent the accident by steering back or braking, or the trajectory of the elderly pedestrian or of the car itself could be miscalculated. In these situations, the car would be responding to assumptions or estimates.
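To make that contrast concrete, consider a toy expected-harm calculation. The probabilities below are invented purely for illustration; a real planner would weigh far more factors:

```python
# Decision-making under estimated risk, with made-up probabilities.
# Each action maps to a list of (probability, casualties) outcomes.

def expected_casualties(actions: dict) -> dict:
    return {name: sum(p * c for p, c in outcomes)
            for name, outcomes in actions.items()}

scenario = {
    # Stay the course: the truck driver may brake or steer back in time.
    "stay_in_lane": [(0.6, 5), (0.4, 0)],
    # Swerve: the pedestrian's trajectory might be miscalculated.
    "swerve_to_sidewalk": [(0.9, 1), (0.1, 0)],
}

print(expected_casualties(scenario))
# {'stay_in_lane': 3.0, 'swerve_to_sidewalk': 0.9}
```

Unlike the trolley problem, neither option has a guaranteed outcome; the car can only act on its estimates.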
