Safety-critical testing of human-controlled software-based systems

Peter Els

Active safety systems need to perform their role in a user-friendly manner and continue to function smoothly despite a panicking driver’s instinctive reactions to unfolding events during an emergency maneuver.

And because traffic safety has been a primary selling point in the promotion of automated driving, unpredictable human intervention in an autonomous system could seriously compromise this objective.

Over the years, a variety of advanced simulation tools have evolved to study this human-machine interaction, most prominently driving simulators or, more precisely, Driver-in-the-Loop (DIL) simulators. These tools allow human drivers to interact in real time with vehicle simulations, performing “virtual test drives” in a laboratory setting during the product development stages.

One step removed from road testing a vehicle in the real world, a simulator affords engineers the opportunity to test active safety systems in the security of a simulated environment, with a human driver in control.

Unlike entertainment simulators, which are designed primarily for the enjoyment of the player/driver, Human Factors DIL simulators are designed to replicate the driving conditions of real vehicles more closely, making them useful to vehicle manufacturers conducting engineering development. The emphasis is typically on monitoring drivers rather than vehicle behaviors.

Using a simulator to accurately depict critical driving scenarios lets developers bridge the gap between subjective targets and objective measures during the early development phases, by measuring the driver’s reaction to random environmental inputs such as cyclists and pedestrians.

One such driving simulator at the University of Warwick, in Britain, has been specifically designed to test “intelligent” vehicles. It can thus interact with the sensors of an autonomous car and put such a vehicle through its virtual paces without ever seeing a road.

The car to be tested sits in the middle of the simulator, which projects a 360° high-definition image of the vehicle’s virtual surroundings, constructed from digital maps of 48km of roads in and around the nearby city of Coventry, together with infrastructure and scenery.

The simulator can depict virtual traffic, cyclists, pedestrians and even dogs crossing the road, all of which programmers can control. It also features surround-sound and actuators that move the vehicle as it would when accelerating, braking or cornering. Even the thump of a virtual pothole can be created.

Camera-based systems typically use a form of artificial intelligence, known as machine vision, to analyse the shapes of objects. But in extreme conditions, such as when cameras succumb to a condition known as “washout”, typically caused by the glare of bright ambient light at sunrise and sunset, the system may incorrectly identify road hazards, says Paul Jennings, head of experimental engineering at Warwick. Unlike in the real world, hundreds of sunrises and sunsets can be created in the simulator every day, thereby reducing the development time of antiglare systems. Other visible hazards that might be hard for self-driving cars to manage, such as streets crowded with pedestrians, cars jumping red lights, or joggers suddenly running into the road, can also be created at will in the simulator without endangering anyone.
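To make the washout problem concrete, here is a minimal sketch of how a perception stack might flag an overexposed camera frame. The function name, thresholds and frame sizes are illustrative assumptions, not part of any real system described in the article: the idea is simply that when a large fraction of pixels sit near the sensor's saturation level, the frame is unreliable for shape analysis.

```python
import numpy as np

def is_washed_out(frame, sat_threshold=250, frac_threshold=0.30):
    """Flag a greyscale camera frame as 'washed out' when a large
    fraction of pixels are near saturation (e.g. low-sun glare)."""
    saturated = np.count_nonzero(frame >= sat_threshold)
    return saturated / frame.size >= frac_threshold

# A glare-free frame vs. one dominated by a bright, low sun
normal = np.full((480, 640), 120, dtype=np.uint8)
glare = normal.copy()
glare[:, 200:] = 255   # right two-thirds of the image blown out
```

In a simulator, hundreds of synthetic sunrise frames per day could be fed through a check like this to tune when the system should distrust its camera and fall back on other sensors.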


Because autonomous driving systems typically use ultrasound, radar or lidar sensors, it’s important that these are fully evaluated in any simulation. The researchers at Warwick can even choose to bypass the sensors and feed in simulated signals from a computer model, for example to simulate sensor malfunctions.
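The bypass technique can be sketched as a simple routing layer: the driving stack reads from a channel that normally forwards live sensor data, but can be switched to a simulated source, including fault models. This toy class and its names are assumptions for illustration only, not Warwick's actual architecture:

```python
class LidarChannel:
    """Toy sensor channel: normally returns live readings, but can be
    bypassed to feed simulated signals (including fault modes) instead."""

    def __init__(self, live_source):
        self.live_source = live_source   # callable returning range in metres
        self.override = None             # simulated model, or None for live

    def bypass(self, simulated_source):
        """Route a simulated signal into the stack instead of the sensor."""
        self.override = simulated_source

    def read(self):
        source = self.override or self.live_source
        return source()

# Fault model: sensor stuck at maximum range, as if returns were lost
channel = LidarChannel(live_source=lambda: 12.4)
channel.bypass(lambda: 200.0)
```

The driving software under test never knows whether `read()` came from hardware or from a model, which is what lets engineers inject malfunctions safely and repeatably.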

Besides testing a car’s hardware and software, Dr Jennings’s simulator will also test its “wetware”: the humans being transported. As part of the development, there are plans to invite members of the public to become drivers and passengers, to build up a library of typical human reactions to hazardous situations.

The idea is to use gaze-monitoring and cameras inside the vehicles to find out how drivers respond to certain situations. Of particular interest will be the study of how quickly a driver realises that a dangerous situation is developing before taking back control of the car. This is important as there is ample evidence that many drivers put too much trust in machines, and therefore become complacent while at the wheel.

Although many institutions make use of simulators during vehicle development, the Warwick equipment is unique in that vehicle-to-vehicle (V2V) and vehicle-to-everything (V2X) performance can also be evaluated. Since the integrity of the signals involved will be paramount, Dr Jennings’s machine can simulate what happens when contact is degraded or shut off; for example, when a vehicle enters a tunnel or a city “canyon” of tall buildings.

To accomplish this the simulator is encased in a giant Faraday cage, formed from a mesh of materials that block or impede signals. This both insulates it from outside interference and enables the signals that are required inside it to be created and controlled accurately.

Unfortunately, the cost of such a state-of-the-art simulator can be prohibitive, which often excludes talented smaller start-ups from developing cutting-edge solutions.

To solve the problem, the Driving Simulator and Vehicle Systems Lab (SimLab), in San Jose, has recently announced it will provide access to research-grade simulator facilities to startups developing advanced automotive technologies. With the stated objective of helping smaller enterprises to compete, SimLab will rent time on the simulator to develop safety- and efficiency-improving technologies that can then be marketed to auto manufacturers or industry suppliers.

“It’s open to everyone, and that’s new. That’s the reason why we did this,” says Dr. Lutz Eckstein, chairman of the board for fkaSV, the company running SimLab. Simulators are usually the property of automakers, companies like Google, or research centers at universities. But, Eckstein says, “any startup or any company that is interested in presenting or evaluating or optimizing their technology can now test their innovation on a world class simulator.”

Eckstein anticipates the SimLab will foster advancements in automated driving. He calls it “creating a new driving experience.” In order to achieve this, developers will have access to a simulator that feeds in traffic data and driving scenarios with an immersive wall of screens that wraps 220 degrees around a stationary BMW 6 Series.

As human test drivers navigate artificial traffic, everything they do will be logged for analysis. That way, researchers can refine their products based on detailed user feedback.

Interestingly, Eckstein believes that Level 3 and Level 4 automation are important phases in the evolution of self-driving cars. “In order to create trust in fully autonomous cars, you need to communicate to the driver what the car is going to do.” Whether that’s through new screens (HUDs), audio warnings, seat vibrations, or anything else, it needs to be tested first. That’s the intention of SimLab. “It’s kind of like pretend control that makes us feel better about not being the ones in charge—but it’s crucial.”

And for cars to take charge, they not only need to communicate with their drivers, but also with each other and with the infrastructure around them. The problem is that, at this early stage of development, few connected cars are on the road and connected infrastructure is virtually non-existent, although in North America V2X may be rolled out for cars as early as 2019.

So how can these systems be validated? What steps can be taken to ensure that, as precursors of autonomous driving, connected ADAS systems will work in every country – not just on well-laid-out German highways, but also in chaotic megacities?

To satisfy these demands, transportation authorities around the world are busy developing and testing innovative methods to enhance public safety and transportation efficiency. These rely on ten real-time messages per second between vehicles within a half-mile radius to give drivers advance warning of hidden dangers, which in congested areas can amount to millions of messages in just a few minutes.
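The scale of that message volume is easy to verify with back-of-envelope arithmetic. The vehicle count below is an assumed figure for a congested urban area, not a number from the article; only the ten-messages-per-second broadcast rate comes from the text:

```python
def v2x_message_volume(vehicles_in_range, rate_hz=10, minutes=5):
    """Estimate total V2X broadcasts: each vehicle sends `rate_hz`
    messages per second for `minutes` minutes."""
    return vehicles_in_range * rate_hz * 60 * minutes

# Assuming ~2,000 vehicles within a half-mile radius in a megacity:
print(v2x_message_volume(2000))   # 6,000,000 messages in five minutes
```

Even under modest assumptions the count reaches millions within minutes, which is why validating these protocols under load, and under degraded-signal conditions, matters before deployment.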


Company information according to § 5 Telemediengesetz
IQPC Gesellschaft für Management Konferenzen mbH
Address: Friedrichstrasse 94, 10117 Berlin
Tel: 49 (0) 30 20 913 -274
Fax: 49 (0) 30 20 913 240
Registered at: Amtsgericht Charlottenburg, HRB 76720
VAT-Number: DE210454451
Management: Silke Klaudat, Richard A. Worden, Michael R. Worden
