Uber's Self-Driving Car in Fatal Crash Had Emergency Braking Disabled

Recent reports indicated that the car registered the presence of the pedestrian roughly 6 seconds before the fatal impact, but the car's emergency brakes had been disabled.
Shelby Rogers
An autonomous Uber Volvo XC90 in San Francisco, similar to the one involved in the accident. (DIlu / Wikimedia Creative Commons)

The Uber self-driving car involved in the death of a woman in March was found to have its emergency braking systems intentionally disabled, according to reports.

The car detected the woman walking in front of it as early as 6 seconds before the crash. It did not stop, however, because the automatic emergency braking had been deliberately disabled.

Uber explained why the emergency braking systems were not enabled at the time of the accident. 

"...emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior," or to provide a more enjoyable and smoother ride for passengers. "The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator."

It's also important to note that the Volvo's own built-in collision avoidance and emergency braking systems were disabled while the car operated in autonomous mode. Disabling such critical safeguards has raised questions about the logic behind the decision, and about what use an emergency braking system is if it cannot be deployed in an actual emergency.

Here is the official statement from Uber in response to the NTSB's report:

"Over the course of the last two months, we’ve worked closely with the NTSB. As their investigation continues, we’ve initiated our own safety review of our self-driving vehicles program. We’ve also brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture, and we look forward to sharing more on the changes we’ll make in the coming weeks."

Investigators have concluded that the car registered the woman's presence 6 seconds before impact. In theory, that would have given the car, traveling at 43 mph, enough time to slow down considerably with the brakes. The NTSB report did note, however, that the system never registered the woman as a human: the software classified her first as an unknown object, then as a vehicle, and finally as a bicycle.
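A rough back-of-the-envelope check supports that claim. The sketch below is illustrative only; the 7 m/s² deceleration is an assumed figure for a typical emergency stop on dry pavement, not a number from the NTSB report:

```python
# Back-of-the-envelope stopping check (illustrative assumptions only).
MPH_TO_MS = 0.44704          # miles per hour -> meters per second

speed = 43 * MPH_TO_MS       # ~19.2 m/s, the car's reported speed
decel = 7.0                  # assumed emergency deceleration, m/s^2 (dry pavement)

detection_window = 6.0       # seconds between first detection and impact
distance_available = speed * detection_window   # ~115 m of travel
stopping_distance = speed**2 / (2 * decel)      # v^2 / (2a), roughly 26 m
stopping_time = speed / decel                   # v / a, roughly 2.7 s

print(f"distance available: {distance_available:.0f} m")
print(f"stopping distance:  {stopping_distance:.0f} m")
print(f"stopping time:      {stopping_time:.1f} s")
```

By these numbers, a full emergency stop from 43 mph needs roughly 26 meters and under 3 seconds, comfortably within the 6-second detection window, which is why the disabled braking has drawn so much scrutiny.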


It was not until approximately 1.3 seconds before impact that the car's autonomous system determined emergency braking was needed. Dashcam footage shows the operator looking down toward the center console in the moments before the crash, leaving her less than a second to react and brake.
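Even that late call might have mattered. Continuing the illustrative numbers above (again assuming a 7 m/s² emergency deceleration, not an NTSB figure), braking at the 1.3-second mark could have sharply reduced the impact speed:

```python
# Continuation of the illustrative check: what if braking had started
# when the system called for it, 1.3 s before impact? (Assumptions only.)
MPH_TO_MS = 0.44704

speed = 43 * MPH_TO_MS              # ~19.2 m/s
decel = 7.0                         # assumed emergency deceleration, m/s^2
distance_left = speed * 1.3         # ~25 m to the pedestrian without braking

# Constant-deceleration kinematics: v_impact^2 = v0^2 - 2*a*d
v_sq = speed**2 - 2 * decel * distance_left
v_impact = max(v_sq, 0.0) ** 0.5

print(f"impact speed with braking: {v_impact / MPH_TO_MS:.0f} mph")  # ~10 mph
```

This is a sketch, not a crash reconstruction; real-world reaction lag, road surface, and tire condition would all change the numbers.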

"In a postcrash interview with NTSB investigators, the vehicle operator stated that she had been monitoring the self-driving system interface," the report said. "The operator further stated that although her personal and business phones were in the vehicle, neither was in use until after the crash, when she called 911."


The findings from the National Transportation Safety Board have sparked further debate about the role of autonomous cars on the road and the responsibilities of the companies that operate them.

From an engineering perspective, the incident has frustrated developers of self-driving cars. No piece of technology performs perfectly 100 percent of the time, and most makers of autonomous cars tell drivers to keep their eyes on the road at all times, ready to grab the wheel if needed.

"We know that drivers, that humans in general, are terrible overseers of highly automated systems," said Bryan Reimer, a engineer who studies human-machine interaction at MIT. "We’re terrible supervisors. The aviation industry, the nuclear power industry, the rail industry have shown this for decades."