Monday, December 4, 2017

Self-driving cars decide who dies in a crash

As self-driving cars come closer to reality, ethical concerns are mounting. These concerns are no longer merely theoretical, with estimates predicting thousands of partially autonomous cars will soon be on the road. Roughly $80 billion has already been invested in the industry, and that number is quickly increasing.

However, one incredibly important question remains: "Who dies when the car is forced into a no-win situation?"

Last year, a Daimler executive created a commotion when he was quoted as saying the company's autonomous vehicles would prioritize the lives of their passengers over anyone outside the car. The company later took a different stance, saying the executive had been misquoted, and added that it would be "illegal to make a decision in favor of one person and against another."

Sebastian Thrun, the man who founded Google's self-driving initiative, has said the cars will be designed to avoid accidents, but that "if it happens where there is a situation where a car couldn't escape, it'll go for the smaller thing." But what is the smaller thing? What if the smaller thing is a toddler? These are questions that must be answered.

In a study conducted last year at the University of California at Irvine, respondents generally agreed that a car should, in the case of an inevitable crash, "kill the fewest number of people possible regardless of whether they were passengers or people outside of the car."

The American Automobile Association says three-quarters of U.S. drivers are suspicious of self-driving vehicles. Are you afraid to trust your future car to decide how to handle crashes? Would you purchase a car that could hit a child if that was the best of the worst possible choices?

Article

2 comments:

  1. I think the idea of autonomous cars is great, but the one problem I have is the human error part. In my eyes, the only solution to self-driving cars is to make it mandatory that every car on the road is autonomous. There is no way to predict human error, and that is the main reason for most automobile accidents. Also, people jaywalk all the time, and it's hard for me to believe an autonomous car can predict that in a split second. It will be interesting to see how this plays out in the next couple of years.

  2. Going off of what Jack was saying, it is tough to decipher what impact these autonomous vehicles will have on the road, and what types of accidents they will be involved in, with human error from other vehicles continuing to be a factor. Regardless, these vehicles will be transporting human beings to and from places, and accidents are bound to happen. It is interesting and disturbing to think about what happens when things go wrong, but I am curious to see how these scenarios play out once these AI vehicles are the new normal. I am hoping that in bad situations this new technology will do the best it can to prevent the loss of life among all parties, not just the vehicle and the passengers it is transporting.
