
Sensor fusion for autonomous driving

Colin Pawsey
08/15/2016

The automotive industry continues to speed towards autonomous driving, but manufacturers are taking a measured, precise approach to what will be a revolutionary change in transportation.

One of the clear motivating factors of autonomous driving is safety. The World Health Organisation, in its publication “Global Status Report on Road Safety 2013”, estimates that there are around 1.2 million vehicle-related fatalities and over 20 million injuries reported each year. In Europe car accidents are the leading cause of death among people aged 5-29, while worldwide the cost of car accidents is an estimated $2 trillion per year.

We have already seen huge advances in Advanced Driver Assistance Systems (ADAS), as autonomous technology is slowly integrated into production vehicles, and the trend is certain to continue with fully autonomous cars slated to enter the market over the course of the next decade. Of course, safety is of paramount importance, and manufacturers are making use of a range of different sensor technologies to enable cars to ‘take control of the wheel’. It is the fusion of these sensor technologies which will make autonomous driving a reality.

Driving Towards Autonomous Vehicles

The recent news that a fatal crash occurred in a Tesla Model S while the vehicle was in Autopilot mode highlights the sensitive nature of the journey towards autonomous vehicles in terms of public perception and confidence.

The NHTSA is undertaking an evaluation to investigate whether the system operated correctly during the incident. According to Tesla’s official blog, the vehicle was traveling on a highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the car. Neither the driver nor the Autopilot noticed the white trailer against a bright sky, so the brake was not applied.

It’s worth noting, despite the tragic circumstances, that Tesla’s Autopilot system is an assist feature designed to help the driver while they keep their hands on the wheel and maintain control of the vehicle. Nonetheless, it is a sobering incident for the industry as a whole as the trend towards automation continues.

Fully Autonomous Cars on the Horizon

One of the key issues surrounding autonomous driving is liability in the case of an accident, and incidents such as that involving the Model S heighten the focus. Audi recently revealed that the 2018 A8 will be the first car on the road to feature level 3 autonomous technology.

Level 3 is defined as the car being able to take complete control, but only in specific conditions; level 4 means that the vehicle can assume full control in all but the most extreme conditions, such as severe weather; while level 5 is the ultimate goal, whereby the car can take complete control in all situations, even without a human on board.

The Audi A8 will feature a traffic jam pilot which will enable a driver to take their eyes off the road in traffic at speeds of up to 60 km/h. This is a step further than current systems, which require the driver to maintain concentration on the road. Audi confirmed that it would accept liability if the system failed, but the feature isn’t intended to allow the driver to completely switch off. The system will monitor the driver to ensure they are ready to take control if instructed to do so, and will even bring the car to a stop if it detects a lack of regular movement.
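The supervision and handover behaviour described above can be pictured as a simple escalation policy. The Python sketch below is purely illustrative; the timings, activity signal and function name are assumptions, not Audi’s implementation.

def handover_action(seconds_since_driver_activity: float,
                    takeover_requested: bool) -> str:
    """Return the action a driver-supervision loop might take while the
    traffic jam pilot is engaged. Thresholds are illustrative assumptions."""
    if takeover_requested and seconds_since_driver_activity > 10.0:
        return "controlled_stop"   # driver has not responded to the takeover request
    if seconds_since_driver_activity > 30.0:
        return "warn_driver"       # no steering, seat or camera activity detected
    return "automated_driving"

print(handover_action(35.0, takeover_requested=False))  # -> warn_driver
print(handover_action(12.0, takeover_requested=True))   # -> controlled_stop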

To achieve the requirements of level 4 and level 5 autonomy, further system developments will be required. To this end, BMW announced a partnership in early July with Intel and Mobileye to bring autonomous technology to production vehicles by 2021.

The three companies will together develop the systems for level 4 and level 5 autonomous driving, and lay the foundations for self-driving fleets in the future. Milestones have been set to deliver a fully autonomous car based on a common reference architecture, and in the near term the partners will demonstrate an autonomous test drive with a highly automated driving (HAD) prototype. In the longer term, the BMW iNext model for 2021 will provide the platform for the group’s autonomous driving strategy.

Mobileye provides expertise in sensing technology, while Intel will contribute the computing technology for processing vast amounts of data. The processing of sensor data, such as the capability of understanding the driving scene through a single camera, will be deployed on Mobileye’s latest system-on-chip, the EyeQ®5, while fusion algorithms will be developed collaboratively on Intel computing platforms. Mobileye’s Road Experience Management (REM) technology will also be implemented to provide real-time precise localization and model the driving scene.

Sensor Technology

A recently revealed Jaguar Land Rover project to develop autonomous technology for off-road driving offers a valuable insight into the integration and fusion of various sensing technologies.

The research project aims to make JLR’s self-driving cars viable in a wide range of on-road and off-road conditions. New surface identification and 3D path sensing systems use camera, ultrasonic, radar and lidar sensors to give the car a 360-degree view of its surroundings. JLR says the combined sensor data is detailed enough for the car to determine road surface characteristics, down to the width of a tyre, even in adverse conditions such as falling rain or snow.

Ultrasonic sensors identify surface conditions up to five metres ahead of the vehicle, so the Terrain Response System can be changed automatically before the car moves from one surface to another. For off-road driving, sensing the 3D path is critical to avoid overhead obstructions such as overhanging branches, and an Overhead Clearance Assist system uses stereo camera technology to scan ahead for any such obstructions. The system allows a driver to pre-program the vehicle height, including any roof box or bicycles carried, and the car will warn the driver if there is insufficient clearance ahead.
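As a rough illustration of how such a clearance check could work, the Python sketch below compares a pre-programmed vehicle-plus-load height against the lowest overhead obstruction estimated from stereo camera data. All names, thresholds and data formats are assumptions for illustration, not JLR’s implementation.

from dataclasses import dataclass

@dataclass
class ClearanceConfig:
    vehicle_height_m: float       # base vehicle height
    load_height_m: float = 0.0    # driver-entered roof box or bicycle height
    safety_margin_m: float = 0.1  # extra margin before warning

def lowest_overhead_clearance(points_m, max_range_m=10.0):
    """points_m: (distance, height) pairs of overhead obstructions taken from a
    stereo-camera depth map. Returns the lowest obstruction height within range."""
    heights = [h for d, h in points_m if d <= max_range_m and h > 0.0]
    return min(heights) if heights else float("inf")

def clearance_warning(cfg: ClearanceConfig, points_m) -> bool:
    """True if the vehicle plus its load will not fit under the lowest obstruction."""
    required = cfg.vehicle_height_m + cfg.load_height_m + cfg.safety_margin_m
    return lowest_overhead_clearance(points_m) < required

# Example: a 1.85 m vehicle with a 0.4 m roof box approaching a branch at 2.1 m.
cfg = ClearanceConfig(vehicle_height_m=1.85, load_height_m=0.40)
print(clearance_warning(cfg, [(6.0, 2.10)]))  # True -> warn the driver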

Another technology being investigated as part of the research is Terrain Based Speed Adaption (TBSA), which uses cameras to detect bumpy terrain, uneven surfaces and standing water. It predicts the potential impact of such hazards and adjusts speed to mitigate the effect on passengers.
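In the same illustrative spirit, terrain-based speed adaption can be thought of as mapping a sensed roughness estimate to a comfortable target speed. The Python below is a minimal sketch with assumed thresholds and scaling, not the actual TBSA algorithm.

def target_speed_kmh(current_speed_kmh: float, roughness: float,
                     standing_water: bool) -> float:
    """Scale speed down as surface roughness (0 = smooth, 1 = very rough) rises,
    with a conservative cap when standing water is detected."""
    limit = max(10.0, 60.0 * (1.0 - roughness))
    if standing_water:
        limit = min(limit, 20.0)
    return min(current_speed_kmh, limit)

print(target_speed_kmh(50.0, roughness=0.6, standing_water=False))  # -> 24.0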

Sensor Fusion

There are many different types of sensing technology being developed, but the key issue is the fusion of these technologies into a platform that enables the vehicle to accurately assess the world around it and drive accordingly.
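At its simplest, fusion means combining noisy estimates from independent sensors while weighting each by its confidence. The short Python sketch below shows that idea for a single range measurement using inverse-variance weighting; production systems fuse whole object tracks, maps and vehicle dynamics rather than single numbers, and the figures here are invented for illustration.

def fuse_estimates(measurements):
    """measurements: (value, variance) pairs from different sensors.
    Returns the inverse-variance-weighted fused value and its variance."""
    weights = [1.0 / var for _, var in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    return fused, 1.0 / sum(weights)

# Radar ranges the obstacle at 42.0 m (low noise); the camera estimates 45.0 m (noisier).
print(fuse_estimates([(42.0, 0.5), (45.0, 2.0)]))  # -> (42.6, 0.4)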

NXP

Earlier this year, automotive supplier NXP Semiconductors unveiled its BlueBox, which handles sensor fusion, analysis and complex networking. The BlueBox will act as a central controller, collecting input from several sensors and stitching it together for analysis. The controller will classify other vehicles, pedestrians and objects before determining how they will affect the car’s movement.

The BlueBox utilizes two main processors to fuse inputs and make decisions. A networking device handles communications, while a safety controller combines inputs from cameras, radar, lidar, and vehicle-to-vehicle communication. The box runs at 90,000 DMIPS (Dhrystone million instructions per second), but crucially draws less than 40W. The safety controller has four microprocessor cores and hardware accelerators, while the networking chip is even larger, with eight cores. The level of computing power and the shift to Ethernet demonstrate the high volume of data that will flow into a centralised controller from a variety of sensor inputs.
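The central-controller pattern NXP describes, gathering sensor inputs, classifying objects and then deciding how each one constrains the car’s motion, can be sketched in a few lines of Python. This is a deliberately simplified illustration; the stage names, thresholds and time-to-collision rule are assumptions, not NXP’s design.

from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    kind: str                 # "vehicle", "pedestrian", "unknown", ...
    distance_m: float
    closing_speed_ms: float   # positive when the gap is shrinking

def classify(raw_inputs: dict) -> List[Detection]:
    """Stand-in for the fusion/classification stage that stitches together
    camera, radar, lidar and vehicle-to-vehicle inputs into one object list."""
    return raw_inputs.get("fused_objects", [])

def plan_action(detections: List[Detection]) -> str:
    """Simplified decision stage: brake for anything closing within a
    two-second time-to-collision, otherwise continue."""
    for d in detections:
        if d.closing_speed_ms > 0 and d.distance_m / d.closing_speed_ms < 2.0:
            return "brake"
    return "continue"

objects = [Detection("vehicle", distance_m=15.0, closing_speed_ms=10.0)]
print(plan_action(classify({"fused_objects": objects})))  # -> brake (TTC = 1.5 s)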

Nvidia

Another software platform designed to enable sensor fusion is the NVIDIA DRIVE PX system, a joint solution developed by Elektrobit, Infineon and NVIDIA. The system consists of the NVIDIA DRIVE PX self-driving car computer integrated with EB’s AUTOSAR 4.x-compliant EB tresos software suite, which runs on the NVIDIA Tegra processor and the AURIX 32-bit TriCore microcontroller from Infineon. This combination of powerful processors, hardware and software enables safety-critical ADAS functions for self-driving vehicles, while the EB tresos software provides seamless integration of Linux and AUTOSAR applications.

DRIVE PX can fuse data from 12 cameras, as well as lidar, radar and ultrasonic sensors. This allows algorithms to accurately understand the full 360-degree environment around the car and produce a robust representation of it, including static and dynamic objects. The use of Deep Neural Networks (DNN) for the detection and classification of objects dramatically increases the accuracy of the resulting fused data.

The software platform is built around deep learning and includes a powerful framework (Caffe) for running DNN models designed and trained on NVIDIA DIGITS, along with an advanced computer vision library and primitives. Together, these technologies provide a highly accurate combination of detection and tracking, giving automakers and Tier 1 suppliers an architecture on which to develop self-driving applications faster and more accurately. Key features include dual NVIDIA Tegra X1 processors delivering a combined 2.3 teraflops; interfaces for up to 12 cameras, radar, lidar and ultrasonic sensors; rich middleware for graphics, computer vision and deep learning; and periodic software/OS updates.
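The detection-then-fusion flow described for DRIVE PX can be pictured as: run a trained detector on each camera frame, transform the detections into one vehicle-centred coordinate frame, and merge duplicates seen by more than one sensor. The Python below is a toy sketch of that flow only; the placeholder detector, coordinates and merge radius are assumptions and do not use NVIDIA’s actual APIs.

import math
from typing import List, Tuple

def dnn_detect(frame) -> List[Tuple[str, float, float]]:
    """Stand-in for a trained DNN detector returning (label, x_m, y_m) in the
    camera's own frame; in this sketch the 'frames' are already detection lists."""
    return frame

def to_vehicle_frame(dets, camera_yaw_rad):
    """Rotate camera-frame positions into the common vehicle-centred frame."""
    c, s = math.cos(camera_yaw_rad), math.sin(camera_yaw_rad)
    return [(label, c * x - s * y, s * x + c * y) for label, x, y in dets]

def merge(dets, radius_m=1.0):
    """Collapse detections with the same label that lie within radius_m of each other."""
    merged = []
    for label, x, y in dets:
        if not any(l == label and math.hypot(x - mx, y - my) < radius_m
                   for l, mx, my in merged):
            merged.append((label, x, y))
    return merged

front = dnn_detect([("car", 20.0, 0.0)])   # forward-facing camera
side = dnn_detect([("car", 0.0, -20.0)])   # left-facing camera catches the same car
fused = merge(to_vehicle_frame(front, 0.0) + to_vehicle_frame(side, math.pi / 2))
print(fused)  # a single merged "car" roughly 20 m ahead of the vehicle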

