The rollout of self-driving cars hits a bump in the road

Peter Els

For many years U.S. regulators have been calling for more automation in motor vehicles. By fitting systems that apply the brakes to prevent a crash or guide a vehicle along the highway, the Department of Transportation hopes to see a dramatic reduction in road accidents.

At the same time, the National Transportation Safety Board, in its investigations into incidents across the transport industry, has warned that autonomous devices may also have a downside: poorly designed technology can confuse operators, or lull them into complacency about its risks.

In an uncanny twist of fate, just as the National Highway Traffic Safety Administration is about to release new guidelines for the safe deployment of self-driving cars, the technology has tragically claimed its first victim. 

Now, for the first time in a highway accident investigation, these two potentially contradictory themes will be put to the test as the NTSB opens an inquiry into the fatal accident.

"This is very significant," said Clarence Ditlow, executive director of the Center for Auto Safety advocacy group in Washington. "The NTSB only investigates crashes with broader implications."

Autopilot still in beta testing

The Tesla Model S crash marked the first known fatality in more than 130 million miles of Autopilot driving; an event viewed by many policymakers, automakers and technology companies as inevitable. Nevertheless, the Tesla incident is certain to influence the discussion of policymakers and the public as the industry pushes deeper into automated and self-driving technologies.


Interestingly, Tesla has installed the Autopilot software on every vehicle it has built since October 2014, even though the system is still a so-called beta version. In a June 30 blog post, Tesla explained that vehicle owners must acknowledge that the system is new technology that is "still in a public beta phase" before the company will switch it on.

Eyes on the road

While several Tesla drivers have reveled in making videos of themselves using the Autopilot feature as if the vehicle were fully autonomous, it is meant as a semi-autonomous aid that does not allow the driver to abdicate control or responsibility.


The May 7 fatality, which occurred while the car's semi-automated Autopilot system was engaged, drove home the limitations of current automated driving systems: Tesla's system uses cameras and radar but is not fitted with lidar, and under the conditions prevailing that day it would have had trouble distinguishing the white semi-trailer positioned across the road against a bright sky.

According to Stefan Sommer, CEO of German auto supplier ZF Friedrichshafen, self-driving cars require multiple detection systems, including expensive infrared "lidar" technology, if they are to be safe at high speeds and in inclement weather.

But as the fledgling technology moves toward full autonomy, the more conventional automakers have designed their advanced driver-assistance systems (ADAS) to take control of the car for only a few seconds at a time; the driver must be ready to resume command at any moment. Which raises the question: Is it possible to get a driver to safely take back control of a car once the vehicle has started driving itself?

Which is better: Level 3 or Level 4?

While Tesla, like Google, strives for full automation, many manufacturers such as BMW, Mercedes-Benz and Volvo have systems that use a combination of adaptive cruise control, lane-keeping and automatic braking to let drivers briefly take their hands off the wheel and their eyes off the road.

The National Highway Traffic Safety Administration classifies such vehicles and their limited automation as Level 2 cars. So what level should come next?

In the safety agency’s taxonomy, the next step would be Level 3, which defines vehicles that can drive on their own in specific circumstances, such as on the highway, but still require a human driver to be “available for occasional control, but with sufficiently comfortable transition time.”

But the technology giant Google disagrees, believing that meeting such a requirement is not possible and the only safe way forward is to take the driver out of the equation.

In apparent agreement with Google's stance, Volvo, whose Level 2 cars include the 2017 S90, has decided to skip the Level 3 stage. Like Google, Volvo is pursuing Level 4 cars: fully autonomous vehicles that require no driver input aside from setting a destination.

In support of moving straight to Level 4, Erik Coelingh, who leads Volvo's autonomous-vehicle Drive Me research program, offers an example: a car changing lanes at 80 km/h should not expect a driver to be able to suddenly take control. "Some people can take control in 10 seconds, but if someone fell asleep it could take two minutes," he said.

Experiments conducted last year by Virginia Tech researchers and supported by the national safety administration found that drivers of Level 3 cars took an average of 17 seconds to respond to takeover requests. In that period, a vehicle travelling at 100 km/h would cover nearly 500 meters.
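As a back-of-the-envelope check on those figures, the distance covered during a takeover delay is simply speed multiplied by reaction time. The sketch below is purely illustrative (the function name is not from any cited study); the 17-second delay and 100 km/h speed are the article's numbers:

```python
def takeover_distance_m(speed_kmh: float, reaction_s: float) -> float:
    """Distance in meters a car travels while waiting for the driver to take over."""
    speed_ms = speed_kmh * 1000 / 3600  # convert km/h to m/s
    return speed_ms * reaction_s

# 17 s at 100 km/h: roughly 472 m, i.e. nearly half a kilometer
print(round(takeover_distance_m(100, 17)))  # → 472
```

The same arithmetic puts Coelingh's lane-change scenario in perspective: even a 10-second response at 80 km/h means more than 200 meters travelled with nobody clearly in charge.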

Such a delay clearly constitutes unsafe driving, but it could be mitigated by fitting video and infrared systems in the car that monitor the driver's attentiveness. So-called electronic-horizon technologies might also give drivers more time to react by "seeing" farther down the road, with cars able to communicate with one another automatically.

Still, not everyone believes the industry should go straight to Level 4: Manuela Papadopol, director of global marketing at Elektrobit, a technology supplier to the auto industry, advocates a gradual evolution that lets consumers become accustomed to the technology while automakers gather more data on the systems.

Audi also supports incremental advances, rather than trying to leap ahead to fully autonomous cars. “We’re making sure the conditions are right to begin Level 3,” said Brad Stertz, Audi’s director of government affairs. “The key part is focusing on driver availability.”

Stertz said that when Audi was ready to offer Level 3 features, its system would give drivers audible and visual warnings if they appeared not to be paying attention. The system, planned for an Audi A8 in model year 2018, would at first be able to drive itself only under specific circumstances, such as stop-and-go highway traffic at less than 60 km/h.

Will the May 7 accident change the direction of self-driving cars?

Confronted by a lack of industry standards, the National Highway Traffic Safety Administration has been working on new guidelines for testing self-driving vehicles on public roads. The question many now have is whether those recommendations will still include Level 3 vehicles and how the results may be affected by the agency’s investigation of the Tesla Model S accident.

Speaking at the National Press Club before news of the accident was made public, Christopher Hart, chairman of the National Transportation Safety Board, said the first such crash would "certainly get a lot of attention, but this train has left the station," and would be unlikely to keep automated cars off the road.


Company information according to § 5 Telemediengesetz
IQPC Gesellschaft für Management Konferenzen mbH
Address: Friedrichstrasse 94, 10117 Berlin
Tel: 49 (0) 30 20 913 -274
Fax: 49 (0) 30 20 913 240
Registered at: Amtsgericht Charlottenburg, HRB 76720
VAT-Number: DE210454451
Management: Silke Klaudat, Richard A. Worden, Michael R. Worden
