Sometimes, new technologies have a shaky start.
The United States government has begun a formal probe into Tesla's Autopilot, a partially autonomous driving system, in the wake of several collisions with parked emergency vehicles, according to a posting on the website of the US Department of Transportation's National Highway Traffic Safety Administration (NHTSA).
Some would say this was a long time coming. It could also lead to fundamental changes in how autonomous systems are regulated in the next generation of advanced cars.
Tesla's Autopilot is a partially autonomous system
The investigation involves 765,000 Tesla vehicles, which accounts for nearly everything the company has sold in the U.S. since the start of its 2014 model year, according to a press release on TechXplore. Seventeen people were injured and one killed in the crashes tracked by the NHTSA during the probe. The agency also identified 11 crashes since 2018 in which Teslas operating on Autopilot or Traffic Aware Cruise Control struck vehicles at first-responder scenes marked by flashing lights, flares, deployed hazard cones, or illuminated arrow boards.
The new probe is further evidence that the Biden administration's NHTSA is taking a more serious stance on the development and deployment of automated vehicle safety than earlier administrations did. In the past, the agency hesitated to regulate the nascent technology for fear of discouraging public adoption of autonomous systems, which could evolve into life-saving features of next-gen vehicles. But now, with the investigation covering the company's entire lineup, including the Models S, X, Y, and 3 from the 2014 through 2021 model years, safety appears to have taken priority.
Advocacy group pushes for safeguards to limit the use of electronic driving systems
The National Transportation Safety Board (NTSB) has recommended that the NHTSA and Tesla limit the use of Autopilot to areas where it can operate without danger of such impacts. The NTSB also suggested that Tesla be required to enhance its systems to ensure drivers remain fully aware of their surroundings and ready to assume control at any moment. So far, however, the NHTSA has not made these suggested measures official.
In 2020, the NTSB said Tesla, its drivers, and even relaxed regulations from the NHTSA were responsible for two collisions in which the vehicles crashed underneath tractor-trailers crossing their paths. The NTSB argued that the NHTSA had neglected to require automakers to establish safeguards limiting the use of electronic driving systems. That determination followed the NTSB's probe of a 2019 crash in Delray Beach, Florida, in which a 50-year-old Tesla Model 3 driver died. In that incident, a Tesla on Autopilot failed to brake or attempt to dodge the tractor-trailer crossing its trajectory.
"We are glad to see NHTSA finally acknowledge our long-standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries, and deaths," said Executive Director Jason Levine of the nonprofit Center for Auto Safety, an advocacy group, in the TechXplore release. "If anything, this probe needs to go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged."

And he may be right. The semi-autonomous system has been misused extensively by Tesla drivers, who have been caught not only drunk driving (in Norway, for example), but even sitting in the backseat of the car while careening down a California highway. If we can't physically force or rationally convince people never to drink and drive (or sit in the back) after activating their Tesla's Autopilot, then the only answer may be stricter regulations and technological means of verifying the awareness and state of drivers.