
The Only Way is Ethics for Self-Driving Cars

Tom Phillips
03/20/2019

Deciding exactly how a self-driving car should behave in an emergency raises a host of ethical questions which remain unanswered

In his TED-Ed talk, Patrick Lin poses the unsettling thought that the deaths of both passengers and pedestrians have, in effect, already been hard-coded into the ECUs of an autonomous car.

As the highly automated vehicle is in control, its reaction is no longer really a reaction at all – an instinctive response to an emergency processed by the driver. Instead, it’s a coded, premeditated decision to take the ‘least worst’ course of action.
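To make that distinction concrete, here is a minimal, purely hypothetical sketch of what a pre-programmed ‘least worst’ choice could look like. The manoeuvres, harm scores and function names are all invented for illustration and don’t represent any manufacturer’s actual software.

```python
from dataclasses import dataclass

@dataclass
class Manoeuvre:
    """A candidate emergency response and its engineer-assigned harm estimate."""
    name: str
    expected_harm: float  # abstract cost of the predicted outcome (invented values)

def least_worst(options: list[Manoeuvre]) -> Manoeuvre:
    """Return the option with the lowest estimated harm: the 'least worst' choice."""
    return min(options, key=lambda option: option.expected_harm)

# Invented scenario: three possible responses to an obstacle ahead.
options = [
    Manoeuvre("brake hard and stay in lane", expected_harm=0.7),
    Manoeuvre("swerve towards the kerb", expected_harm=0.4),
    Manoeuvre("swerve into the oncoming lane", expected_harm=0.9),
]

print(least_worst(options).name)  # -> swerve towards the kerb
```

The point isn’t the numbers themselves, but that somebody had to choose them in advance – and that is where the ethical weight sits.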

The arrival of the self-driving car has already brought some spectacular innovations and serious progress – we’ve come a long way since the 2004 DARPA Grand Challenge, which saw 15 teams compete in California’s Mojave Desert with early autonomous cars. The ‘winner’ completed just 11.78km of the 240km course, a far cry from the capabilities of Waymo’s fleet of automated Chrysler Pacifica minivans, for instance.

The engineering has come on in leaps and bounds, and we’re well aware of the safety and environmental benefits that self-driving cars are likely to bring – if you’re reading this, you’re probably just as excited as we are about what’s around the corner.

However, the issue of ethics – and exactly who should take responsibility for what’s right and wrong in a given situation – remains an open question, and one that shouldn’t be downplayed just because the technology has the potential to drastically reduce accidents. Humans have never before allowed an artificially intelligent machine to decide whether a person lives or dies, and cars are likely to become the first machines to do so – which is why ethics can’t be ignored.

MIT asks for help in learning how to make machines moral

Perhaps the most well-known study into the ethics of artificial intelligence was developed by MIT. It created the Moral Machine, “an online experimental platform designed to explore the moral dilemmas faced by autonomous vehicles.”

The machine is still online, but for the purposes of the paper published in the journal Nature in 2018, the data included “40 million decisions in ten languages from millions of people in 233 countries and territories.”

It certainly doesn’t seek to cover every ethical dilemma, but the volume of respondents means the insights of the study are worth noting. For example, it highlighted a significant complication in the ethics of autonomous vehicles – that people’s preferences about who should live or die vary across the globe.

So while today’s engineers might develop a specific NVH tune for different markets, they could also tweak the algorithms that decide whether the old or the young, the fat or the thin pedestrian takes the hit. Given advances in in-car sensor tech, the passengers of the vehicle aren’t immune from such decisions either, as the vehicle could decide that you’re the one who has the least to contribute to society.
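Sticking with the same hypothetical sketch, per-market tuning could amount to nothing more than swapping the weighting table used to score outcomes. The market names, categories and weights below are entirely invented, purely to illustrate how the Moral Machine finding – that preferences vary by region – could translate into code.

```python
# Entirely hypothetical weights: how much each affected group counts towards
# the harm score, tuned differently per market. No real system is implied.
REGIONAL_WEIGHTS = {
    "market_a": {"occupant": 1.0, "pedestrian": 1.2},
    "market_b": {"occupant": 1.2, "pedestrian": 1.0},
}

def harm_score(outcome: dict[str, int], market: str) -> float:
    """Weight the people affected by a predicted outcome using the market's tuning."""
    weights = REGIONAL_WEIGHTS[market]
    return sum(weights[group] * count for group, count in outcome.items())

# The same predicted outcome scores differently depending on the market.
outcome = {"occupant": 1, "pedestrian": 2}
print(harm_score(outcome, "market_a"))  # 3.4
print(harm_score(outcome, "market_b"))  # 3.2
```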

Euro NCAP adds autonomous to its safety roadmap

Euro NCAP states that around 90 percent of road accidents are attributable to driver error, and autonomous cars are set to have perhaps the most significant impact on those safety stats. Euro NCAP’s roadmap document, published in 2017, sets out how self-driving technology will be included in its assessments in the future.

However, it’ll be some time coming, with the organization confirming that autonomous testing will be kept outside of star ratings. Instead, a gradation system is likely to be introduced, based on the use cases of parking, city driving, inter-urban driving, traffic jams and highway driving.

Could that mean that, in future, buyers will choose a car whose technology favors the protection of its occupants over pedestrians? Arguably, that’s a direction buyers have already moved towards, whether consciously or not. Do I choose a big, sturdy-looking SUV to protect my family, or a smaller, economical hatchback? At 861,331 units, VW’s top-selling car in 2018 was the Tiguan, not the Golf (805,752), with which it shares a large number of common components. Fashion is one thing, but the perception of safety is a significant contributor to that buying decision.

Interestingly, while there’s no mention of ethics in the document, Euro NCAP does say that it will include “information from user studies in the assessment process” for the first time. The input will certainly help ensure that the autonomous features of vehicles are understood, although it’s unclear whether this will extend to its ethical programming in a life-versus-life decision.

Exactly who makes the ethical decision?

It’s easy to over-simplify the debate around self-driving car ethics to just ‘program the car to follow the established rules of the road’. The argument is that, in doing this, you remove the burden of responsibility for any ethical decisions from the engineers who coded the car’s behavior. It’s also tempting to believe that the rapid roll-out of technology such as car-to-x as part of the self-driving package will ensure that vehicles and pedestrians come into conflict less and less often.

However, getting the self-driving car to follow the rules is rarely the problem. Instead, it’s the grey space in which just about every real-world driver operates that’s the tricky part. Waymo, like every other company with a permit to test self-driving vehicles on Californian roads, publishes its accident stats, and has reported over 30 incidents. Frequently it’s the test car’s strict adherence to the rules, rather than a human-like reading of the ‘grey-space’ situation, that causes the problem – often it’s the self-driving car that ends up being rear-ended.

Likewise, the issue of ethics is too often discussed only as a matter of life or death. If a car is able to decide that its best course of action is to cause someone potentially life-changing injuries, that is significant in itself. There is also the psychological trauma caused by the car’s decision, to passengers and pedestrians alike, and the question of whether it is magnified by the fact that a machine, rather than a human, chose how to mitigate the impacts – literal, physical and mental.

Concluding the ethical conundrum

When you look at the potential safety benefits self-driving vehicles could bring, it’s easy to wonder whether we should be getting so caught up in ethical decisions at all. If autonomous vehicles can cut the number of people hurt on the roads by vastly greater numbers, surely the loss of a small number of lives in rare edge cases is simply a price worth paying?

The response to the fatal crash in Arizona involving a Volvo test vehicle equipped with Uber’s self-driving sensors shows how that opinion might be factually correct, but hard to justify to an end consumer. If customers have no confidence in the decisions that artificial intelligence makes on their behalf, the self-driving vehicle revolution can only reach part of its full potential.

A further ethical question follows if that potential is realized: the logic will have to be applied the other way, too. If self-driving cars are demonstrably much safer, surely the ethically correct decision to forbid human drivers will have to be made at some point?

As noted in Vice’s Motherboard blog, “‘I’m just an engineer’ isn’t an acceptable response to ethical questions. When engineered systems allocate life, death and everything in between, the stakes are inevitably moral.”

Responsibility for the ethical issues surrounding autonomy is still very much being worked out, and few solutions have been made public – Germany is so far the only country to have published guidelines on ethics in the development of self-driving cars. But whatever happens, the decisions being made in programming today’s autonomous cars will become the new rules of the road. These rules have the potential to deliver the most significant societal change catalyzed by self-driving cars. Especially when it comes to matters of life and death.

