Come on Inside, NVIDIA’s Shapiro Offers

Automakers and suppliers around the world lean on NVIDIA’s Drive Xavier and Drive Pegasus supercomputing for their AV programs. With Drive Constellation, they now have access to NVIDIA’s muscle inside the lab.

James M. Amend, Senior Editor

July 31, 2018

Simulation uncovers tricky corner cases, NVIDIA’s Shapiro says. (Photo: Roger Hart)

TRAVERSE CITY, MI – Chip-making giant and artificial-intelligence expert NVIDIA is taking its autonomous-vehicle testing off the road and into the data lab.

NVIDIA, which works with a whopping 370 automotive companies on automated driving and other AI-driven technologies, still performs real-world testing on closed courses. But with the release of its cloud-based Drive Constellation simulation system, it can log millions of additional test and validation miles inside the lab, in a fraction of the time road testing requires.


“We need to ensure these systems are safe. We need to ensure they can handle all the different situations that arise,” says Danny Shapiro, senior director of automotive at NVIDIA.

“The challenge is that as we are doing these (data) collection drives, you usually don’t see much that’s of use,” he tells a session on connected and automated vehicles at the CAR Management Briefing Seminars here.

The trick, Shapiro says, is handling the corner cases: a child running out in front of the car, another driver running a red light, severe weather or blinding evening sun. As many miles as the industry has logged, those rare cases cannot reliably be captured on the road.

“You need billions of miles of test drives to actually show that (an AV) is safer than a human,” he says. “Simulation now is the key. You can test whether they can handle these different situations.”

NVIDIA, in short, has created an AV simulator. A server packed with GPUs generates the graphics, simulating the camera images and the lidar and radar signals an AV would perceive, building a virtual world with additional input from maps and other real-world data.

That server is connected to Drive Pegasus, which runs the driving process; the deep neural networks at the heart of NVIDIA’s computing platform do their work on the simulated sensor data as if it came from a real vehicle, and the resulting information is sent back to the simulator.
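The closed loop Shapiro describes — one machine rendering the virtual world, another running the driving stack, with controls fed back to the simulator — can be sketched roughly as follows. Everything here is illustrative (the class names, the sensor fields, the control interface are assumptions for the sake of the sketch, not NVIDIA APIs):

```python
# Toy hardware-in-the-loop simulation loop. One component stands in for the
# GPU server rendering sensor data; the other stands in for the AV computer.
# All names and numbers are hypothetical, not NVIDIA's actual interfaces.

class Simulator:
    """Stands in for the GPU server that renders the virtual world."""
    def __init__(self):
        self.speed = 20.0           # vehicle speed, m/s
        self.obstacle_dist = 100.0  # meters to a simulated hazard

    def sensor_frame(self):
        # A real system would return camera/lidar/radar data;
        # here we return only the distance to the hazard.
        return {"obstacle_dist": self.obstacle_dist}

    def step(self, controls, dt=0.1):
        # Advance the virtual world using the AV's control output.
        self.speed = max(0.0, self.speed + controls["accel"] * dt)
        self.obstacle_dist -= self.speed * dt


class DrivingStack:
    """Stands in for the AV computer running the driving software."""
    def compute_controls(self, frame):
        # Brake hard when the simulated hazard gets close.
        if frame["obstacle_dist"] < 50.0:
            return {"accel": -5.0}
        return {"accel": 0.0}


def run_scenario(steps=200):
    sim, stack = Simulator(), DrivingStack()
    for _ in range(steps):
        controls = stack.compute_controls(sim.sensor_frame())  # sim -> AV
        sim.step(controls)                                     # AV -> sim
    return sim
```

In this toy scenario the stack brakes the vehicle to a stop before reaching the hazard; the point is only the loop structure — simulated sensors out, driving commands back — which is what lets a hazardous situation be replayed over and over.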

Simulation has another advantage. Before placing their car in the virtual world, automakers and suppliers must configure the AV, a process that lets them determine where to place a camera and even how many they need. Sensors can be dragged and dropped anywhere on the virtual vehicle.

“And we can run all kinds of tests to determine the optimal configuration,” Shapiro says. “But the key thing is we can create these hazardous situations and test them over and over again. You only get a couple minutes a day to test how sensors might work at sunset.”

NVIDIA also is using AI in its research to create scenes, changing the car or the road surface with the click of a mouse. Trees and buildings could be painted in, or removed, using AI.

“Again, leveraging the power of simulation and virtual reality, we are able to create a whole range of different types of environments,” he says.

Drive Constellation will be available in the third quarter.

Shapiro also shows emerging AI technology for use inside the car, able to detect distracted and drowsy driving, as well as a feature that detects an approaching cyclist and holds the door shut so it is not opened into the cyclist’s path.

The technology also could detect another driver about to run a red light and momentarily disable the throttle so the vehicle does not pull into the offender’s path.
