AI: A Game-Changer for the Automotive Industry?

James Gordon

It is the world’s technology capital and a beating heart of innovation. Now Silicon Valley, the Californian tech hub that is home to many of the world’s most innovative software and video game companies, has its eyes firmly fixed on game-changing, next-generation Artificial Intelligence.

Take NVIDIA for example. It believes that AI supercomputers with deep learning software will have a transformative effect on the global automotive landscape.

Last month in Las Vegas, at CES 2017, the world’s largest innovation event, NVIDIA’s charismatic CEO, Jen-Hsun Huang, dressed in his trademark black leather jacket, announced that the Fortune 1000 company was collaborating with Mercedes-Benz to bring an NVIDIA-powered vehicle to market within one year. The AI computing company also confirmed that it is working alongside OEMs such as Audi, Tier 1 giants Bosch and ZF, and the global mapping providers HERE and Zenrin.

If you are wondering why a company founded in 1993, and most famous for developing pioneering hardware and software to power video games, is now creating deep learning systems for the automotive industry, you are not alone. 

But Danny Shapiro, NVIDIA’s Senior Director of Automotive, says that “NVIDIA’s cooperation with the car industry is nothing new”.


“Some find it surprising that self-driving vehicles are getting their smarts from a technology originally developed to play video games. But our cooperation is more of a logical step forwards than a technological leap into the unknown.”

Continues Shapiro, “Our strong professional graphics heritage means that essentially every auto maker in the world already uses NVIDIA technology to design and style cars. And in fact, NVIDIA has been supplying in-vehicle technology to automakers for around a decade now. Our processors are used by manufacturers including Audi, BMW, Bentley, Porsche, Honda and Tesla to power infotainment and digital dashboard systems. Today, there are well over 10 million cars on the road with NVIDIA hardware and software inside,” he adds.

Breaking new ground

The NVIDIA DRIVE PX 2 platform, though, marks a new chapter in this trailblazing Santa Clara-based company’s history. And according to IHS Technology, the years ahead could be as bountiful as the local farms, ranches and orchards once were for the San Francisco Bay Area. The global information company has forecast that shipments of automotive AI systems could reach 122 million by 2025 – quite astonishing when you consider that there were only seven million shipments in 2015.


The key to this ground-breaking system is the Graphics Processing Unit (GPU), which Shapiro describes as “a tiny piece of silicon comprised of billions of transistors which punches well above its weight”. It was first invented to power 3D video games.

Shapiro, who joined NVIDIA in 2009 and serves on the advisory board of the Connected Car Council, explains, “Several years ago, researchers discovered that the same parallel architecture designed to handle the vast amount of video data required for 3D graphics was also a perfect fit for the complex parallel computing required by deep learning.”

Continues Shapiro, “This form of artificial intelligence enables computers to learn from data and write software that is far too complex for people to code. NVIDIA recognised the opportunity presented by this affinity between deep-learning and the GPU. Since then we’ve been investing in a new AI computing model (GPU-accelerated learning) which is helping to create computers, robots and self-driving cars that can perceive and understand the world better than humans can.”

At the heart of this breakthrough AI technology is a concept known as ‘deep learning’. Shapiro, who holds a Bachelor of Science in electrical engineering and computer science from Princeton University, tells me that the key to this leading-edge tech is that it allows computers to perform tasks for which they have not been specifically programmed.

Says Shapiro, “It’s revolutionary because it lets computers teach themselves about the world through a training process that’s roughly similar to the way children learn. Large amounts of training data are fed through sophisticated algorithms to build Deep Neural Networks (DNNs) – complex mathematical models that mimic the brain. What was recently science fiction is now reality.”

Shapiro, who also holds an MBA from the Haas School of Business, believes that “AI’s effect on the automotive industry will be transformative”.

At the recent CES 2017 exhibition, NVIDIA launched AI Co-Pilot – an AI technology enhancement aimed at keeping drivers, passengers and pedestrians safe.

Says Shapiro, “Over a million people worldwide are killed in auto accidents every year, the vast majority caused by driver error… We believe that even when the car isn’t driving for you, it should be looking out for you. AI within the car can sense your surroundings, monitor your behaviour within the car and help you to avoid potential hazards. That’s the thinking behind AI Co-Pilot.”

Artificial Intelligence: How it can make a difference

Shapiro then demonstrates how this breakthrough technology actually works in practice.

Explains Shapiro, “If there is a potential hazard ahead such as a cyclist turning into the road ahead of you, AI Co-Pilot will receive the information from the car’s sensors. The equipped vehicle will then send an audible warning to the driver, and could also potentially take over control of the vehicle to prevent an accident if you do not respond.

“But the system also tracks driver head position and eye-movement and so if it senses that the motorist has detected the pedestrian who is about to step into the road, then the audible alert would be suppressed.”
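The warn-or-suppress behaviour Shapiro describes can be sketched as a simple decision function. This is purely an illustration: the function name and flags are invented, and the real system works on continuous sensor and gaze-tracking data rather than booleans.

```python
def copilot_action(hazard_detected, driver_sees_hazard, ignored_warning):
    """Hypothetical sketch of one AI Co-Pilot decision step."""
    if not hazard_detected:
        return "no_action"
    if driver_sees_hazard:
        return "suppress_alert"   # gaze tracking shows the driver is aware
    if ignored_warning:
        return "take_control"     # driver did not respond to the alert
    return "audible_warning"      # warn the driver first

# A hazard the driver has already spotted produces no alert:
print(copilot_action(True, True, False))   # suppress_alert
```

The ordering of the checks captures the escalation in Shapiro’s description: awareness suppresses the alert, a warning comes before any intervention, and the car only takes control as a last resort.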

But what is the science behind this state-of-the-art intelligent system?

“The science is twofold. One is the access to huge amounts of information now available at the click of a mouse. The second is computing horsepower that was only a dream a decade ago. To give you an idea of how powerful the DRIVE PX platform is, our most recent iteration, called Xavier, which debuted at CES 2017, will be able to process 30 TFLOPS at just 30 watts. That’s 30 trillion operations per second while consuming only the energy of a conventional lightbulb.”

Shapiro added, “That is really quite staggering when you consider that in the year 2000, the most powerful supercomputer in the US could only process 1 TFLOP, consumed 500,000 watts, and was the size of a small house. Fast forward 17 years and we have something 30X more powerful that is only the size of a thumbnail.”
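A quick back-of-the-envelope calculation puts those two figures side by side: 30 TFLOPS at 30 watts versus 1 TFLOP at 500,000 watts.

```python
# Figures quoted by Shapiro: Xavier (CES 2017) vs a year-2000 US supercomputer.
xavier_tflops, xavier_watts = 30, 30
y2000_tflops, y2000_watts = 1, 500_000

raw_speedup = xavier_tflops / y2000_tflops                 # 30x the throughput
efficiency_gain = (xavier_tflops * y2000_watts) / (y2000_tflops * xavier_watts)

print(f"{raw_speedup:.0f}x faster")                        # 30x faster
print(f"{efficiency_gain:,.0f}x more work per watt")       # 500,000x more work per watt
```

The headline “30X more powerful” understates the efficiency story: per watt consumed, the gap between the two machines is roughly half a million fold.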

NVIDIA’s heavy-hitting processing power, coupled with its end-to-end offerings, makes self-driving cars feel less like science fiction and more like science.

“The small but densely-packed intelligent silicon microchip has been programmed to process data, react to experiences and learn from them in real-time – that’s 30 times a second, every second – in the same way a driver would at the wheel.

“Secondly, the AI system is also closely integrated to the V2V and V2X landscapes and so the car can learn from other drivers, vehicles and stop signals too.”

Deep Learning: The Lifeblood of AI

But can cars really learn human behaviour, and, if so, how do they do so, and what computer systems has NVIDIA created to underpin and facilitate the ‘deep learning’ process? 

Shapiro explains that the ‘deep learning’ process is divided into two separate stages. The training phase takes place in the data centre, while the inferencing stage occurs on the road.

Says Shapiro, “During the first phase, data scientists will feed massive amounts of data into the network and supervise its development. This is known as the training phase. For example, the more stop signs you use to train the network, the better it gets at identifying them correctly. Eventually, when the network achieves a very high confidence in identifying traffic lights accurately, it’s ready to be deployed in the vehicle.”
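The train-then-deploy split Shapiro outlines can be illustrated with a deliberately tiny classifier. This is only a sketch: the nearest-centroid model and the two-number “features” are invented stand-ins for the deep neural networks and camera data used in practice.

```python
def train(labelled_examples):
    """Phase 1 (data centre): learn one average feature vector per label."""
    sums, counts = {}, {}
    for features, label in labelled_examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in vec] for lbl, vec in sums.items()}

def infer(model, features):
    """Phase 2 (in the vehicle): classify input by the nearest learned centroid."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(model[label], features))
    return min(model, key=dist)

# Toy "training data": (redness, octagon-ness) feature pairs.
data = [([0.9, 0.8], "stop_sign"), ([0.95, 0.9], "stop_sign"),
        ([0.1, 0.0], "other"), ([0.2, 0.1], "other")]
model = train(data)                  # training happens offline, in the data centre
print(infer(model, [0.85, 0.75]))    # inference happens on the road -> stop_sign
```

The point of the split is the same at any scale: the expensive learning step runs once on massive data, while the deployed model only has to answer quickly for each new input – which is why, as Shapiro notes, more training examples make the in-car classifier more reliable.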

And the high-tech computer systems being employed? 

Shapiro says that it is the DRIVE PX platform and the DriveWorks operating system that form the bedrock of the system architecture. 

“They process vast amounts of information, run extremely rich AI networks and algorithms, and perform deep learning operations in real-time to ensure the safety of the driver and fellow road users,” he explains.

But the technology is multi-layered: running alongside the DRIVE PX and DriveWorks foundations, Shapiro confirmed, is “a series of multiple neural networks”, each handling a different aspect of driving such as object detection, lane detection, localization and path planning.

And with Level 4 autonomous vehicles set to enter production by 2020, Shapiro believes that “AI has the power to make the self-driving car revolution a reality”.

“The seeds of full-scale autonomous driving are already being seen in car models today,” he says. “At CES, we announced a series of new partnerships to build out our platform. We’re working closely with Audi to ensure that its Level 4 autonomous vehicle will be on the market in three years’ time. And Bosch and ZF, two of the largest tier 1 suppliers in the world, have already adopted the DRIVE PX platform.”

AI: The Challenges Ahead

But there are still hurdles to overcome. One barrier to entry, for example, is certification. Currently no standards exist, and Luca De Ambroggi, principal analyst for automotive semiconductors at IHS Markit, says that he is “not aware that any ISO body is in the process of creating an AI standard”.

And Egil Juliussen, Director of Automotive Research for IHS Markit, says, “There are concerns about how to certify more complex systems such as driverless car control. The concern is that the deep learning system is a ‘black box’ that provides solutions, but exactly how the decisions are made is not available and cannot be analysed – at least for now.”


Both believe that “it will take some time to synchronise deep learning platforms with ADAS systems”. 

De Ambroggi explains, “It (AI technology) is not mature enough yet in terms of safety/certification. Nor is there enough silicon to support Level 4-5 performance requirements. However, that said, I expect this might change over the course of the next two years in Silicon Valley and the State of California.

“But, as a tool to drive continuous improvement, OEMs, tier 1s and drivers could reap the benefits a lot sooner.”

Mark Boyadjis, who has been analysing the impact of automotive UX trends and infotainment systems since 2008 and is principal analyst and manager for IHS Markit in this area, says, “AI platforms will incrementally advance the underlying drivetrain intelligence, even after the vehicle is in the hands of its owner (or subscriber). This aspect of the technology will run in the background, without ever needing to directly interface with the driver.”


However, Boyadjis believes in the long-term that the “possibilities for interior AI platforms are far reaching”.

Says Boyadjis, “As in the case of the NVIDIA Co-Pilot, these systems will become a partner for the user and they will be able to interact with the vehicle/service brand both inside and outside of the car itself.

“Toyota’s Yui, Volkswagen’s I.D., and the Faraday Future “FFID” are other great examples of what automakers are envisioning for the virtual personal mobility assistant.”

The AI Ecosystems of tomorrow: A peek into the future

But if you want a picture painted of how this smart, innovative technology could change all our lives, then let me leave you with this inspirational glimpse of the future from NVIDIA’s Danny Shapiro.

“Imagine your hum-drum commute now,” he begins. “Now let’s fast forward a few years to when you own (or subscribe to) a self-piloted automobile. You use an app or wearable device to order your car’s arrival at your front door. As it transports you from home to the office, you catch up on your emails, or watch the news on the vehicle’s integrated tab.

“As you glance across the road at other cars, you notice that very few have someone at the steering wheel and most have no driver at all. Thanks to the information being shared by these cars, traffic management systems have dealt with congestion spots before they can affect your commute, so your journey is smooth and fast.

“On arriving at the office, the car drops you off, then embarks on its next programmed destination.”

For NVIDIA, ‘the direction of travel’ on this exciting journey is also clear: to turn Level 5 autonomy into reality. Armed with a vision as powerful as it is liberating, few would bet against NVIDIA’s automotive team succeeding.

Over to you, NVIDIA…


Company information according to § 5 Telemediengesetz
IQPC Gesellschaft für Management Konferenzen mbH
Address: Friedrichstrasse 94, 10117 Berlin
Tel: 49 (0) 30 20 913 -274
Fax: 49 (0) 30 20 913 240
Registered at: Amtsgericht Charlottenburg, HRB 76720
VAT-Number: DE210454451
Management: Silke Klaudat, Richard A. Worden, Michael R. Worden
