NVIDIA Introduces VR-based Simulator for Autonomous Vehicle Testing
NVIDIA recently announced plans to use virtual reality technology for testing autonomous cars at the GPU Technology Conference (GTC) in San Jose, California. The cloud-based system, known as the DRIVE Constellation Simulation System, combines NVIDIA's AI car computer (DRIVE Pegasus) with virtual simulation software (DRIVE Sim) to test and validate self-driving cars over billions of driving miles.
“Deploying production self-driving cars requires a solution for testing and validating on billions of driving miles to achieve the safety and reliability needed for customers,” said Rob Csongor, vice president and general manager of Automotive at NVIDIA. “With DRIVE Constellation, we’ve accomplished that by combining our expertise in visual computing and data centers. With virtual simulation, we can increase the robustness of our algorithms by testing on billions of miles of custom scenarios and rare corner cases, all in a fraction of the time and cost it would take to do so on physical roads.”

According to NVIDIA, the new system will dramatically improve testing capability across a wide range of driving conditions. It uses two different servers to create a photorealistic simulation and can generate the equivalent of billions of miles of AV testing.
The simulation server generates a stream of simulated sensor data and feeds it to the DRIVE Pegasus for processing. The driving commands from DRIVE Pegasus are then fed back to the simulator, completing the digital feedback loop.
This cycle occurs 30 times per second, allowing the system to validate algorithms and confirm that the simulated vehicle is operating correctly. The simulation software can recreate a range of testing environments, including different weather conditions, times of day, and road surface types.
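To illustrate the idea, the closed loop described above can be sketched roughly as follows. The class and method names here are illustrative placeholders, not NVIDIA's actual DRIVE Sim or Pegasus APIs: one component stands in for the simulation server, the other for the in-car AI computer, and commands flow back into the simulator at about 30 iterations per second.

```python
import time

# Hypothetical stand-ins for the two servers described above; all names and
# behaviors are illustrative assumptions, not NVIDIA's actual software.
class Simulator:
    """Generates simulated sensor frames and applies driving commands."""
    def __init__(self):
        self.position = 0.0  # simulated vehicle position along the road, in meters

    def render_sensor_frame(self):
        # The real system would produce photorealistic camera, lidar, and radar
        # streams; here it is just the vehicle's simulated position.
        return {"position": self.position}

    def apply_commands(self, commands):
        # Advance the simulated vehicle according to the received command.
        self.position += commands["throttle"] * (1.0 / 30.0)


class AVComputer:
    """Stands in for the in-car AI computer processing the sensor stream."""
    def process(self, sensor_frame):
        # Trivial placeholder policy: keep driving forward.
        return {"throttle": 1.0, "steering": 0.0}


# Closed digital feedback loop paced at roughly 30 cycles per second.
sim, car = Simulator(), AVComputer()
for _ in range(90):  # ~3 seconds of simulated driving
    frame = sim.render_sensor_frame()   # simulator -> AV computer
    commands = car.process(frame)       # AV computer decides on driving commands
    sim.apply_commands(commands)        # commands -> simulator, closing the loop
    time.sleep(1.0 / 30.0)              # hold the ~30 Hz cycle rate
print(f"Simulated distance travelled: {sim.position:.1f} m")
```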
Danny Shapiro, NVIDIA's senior director of automotive, explained that the VR-based simulator can create a blinding-sun scenario 24 hours a day to train the vehicle efficiently, something that is impossible on real roads, where such conditions last only about 10 minutes a day. Dangerous scenarios of varying severity can also be scripted into the simulation to assess and improve the autonomous car's ability to react. A rough sketch of such scenario scripting follows below.
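The snippet below sketches how scripted conditions such as lighting, weather, and hazards might be described and replayed repeatedly. The field names, values, and `run_scenario` helper are assumptions for illustration only, not the actual DRIVE Sim scenario format.

```python
# Hypothetical scenario descriptions; all fields and values are illustrative.
scenarios = [
    {"name": "blinding_sun",    "sun_angle_deg": 2,   "weather": "clear", "hazard": None},
    {"name": "night_rain",      "sun_angle_deg": -10, "weather": "rain",  "hazard": None},
    {"name": "pedestrian_dart", "sun_angle_deg": 45,  "weather": "clear",
     "hazard": "pedestrian_crossing_unexpectedly"},
]

def run_scenario(scenario, repetitions=1000):
    """Placeholder: would hand the scripted scenario to the simulator and
    record how the autonomous driving stack responds on each repetition."""
    print(f"Running scenario '{scenario['name']}' x{repetitions}")

for s in scenarios:
    run_scenario(s)
```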
The DRIVE Constellation announcement came at a time when NVIDIA and its partner Uber had put their AV testing on hold after the Tempe accident that took place a week earlier.
NVIDIA CEO Jensen Huang also addressed the accident after his keynote at GTC. Huang said the accident was tragic, and that it is precisely why the company is serious about developing technologies like DRIVE Constellation to save lives. He added that the accident will lead to more investment in systems similar to its VR-based simulator.
The solution is expected to become available in the third quarter of 2018, allowing NVIDIA customers such as Google and Uber to improve their autonomous vehicle systems before putting them to the real-world test.
"We could not have asked for more from InSight," Anna Harleston, co-lead of NASA InSight's Marsquake Service told IE.