Tesla's Autopilot Can Be Tricked in Just a Split Second Through This Method

Fully autonomous cars have a way to go yet, it seems.
Fabienne Lang
Tesla Autopilot stopped when it saw the billboard. Source: Cyber Security Labs @ Ben Gurion University/YouTube

Tesla's Autopilot is one of the electric vehicle maker's main selling points. However, it has faced scrutiny at times, and it may be about to face more.

A team of researchers from Israel's Ben Gurion University of the Negev has demonstrated a trick that fools Tesla's Autopilot. When the researchers flashed split-second "phantom" images of speed limit or stop signs inside regular billboard ads, the system picked them up and obeyed them as though they were real road signs.

This shows how perceptive Tesla's Autopilot is, but it also demonstrates how the technique could create havoc on a busy road.

Wired was the first to report on the news.


'Split-second phantom attacks'

The team investigated "split-second phantom attacks," in which a Tesla Model X's Autopilot and a Mobileye 630 system register a "depthless object" flashed on a billboard for only a few milliseconds and treat it as a real road sign, as the team describes it.

All that's needed is a remote hack that embeds phantom images into internet-connected roadside billboards and signs, and the damage is done.
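To make the mechanics concrete, here is a minimal, hypothetical sketch of how a phantom frame could be spliced into a billboard's video ad. This is not the researchers' actual tooling; the file names, the OpenCV approach, and the overlay placement are all illustrative assumptions.

```python
# Hypothetical sketch: splice a "phantom" road-sign image into a few
# frames of a billboard ad video. Illustrative only -- this is not the
# researchers' code, and the file names are made up for the example.
import cv2

AD_VIDEO = "billboard_ad.mp4"   # assumed input: the ad loop shown on the billboard
PHANTOM = "stop_sign.png"       # assumed phantom image to flash
OUTPUT = "ad_with_phantom.mp4"
FLASH_SECONDS = 0.5             # example flash duration; thresholds discussed below
START_SECONDS = 3.0             # when the flash begins within the ad

cap = cv2.VideoCapture(AD_VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
writer = cv2.VideoWriter(OUTPUT, cv2.VideoWriter_fourcc(*"mp4v"),
                         fps, (width, height))

# Shrink the phantom so it occupies only a corner of the ad and blends in.
phantom = cv2.resize(cv2.imread(PHANTOM), (width // 4, height // 4))
ph, pw = phantom.shape[:2]

start_frame = int(START_SECONDS * fps)
flash_frames = int(round(FLASH_SECONDS * fps))

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Overwrite a corner of the frame during the flash window only.
    if start_frame <= frame_idx < start_frame + flash_frames:
        frame[0:ph, 0:pw] = phantom
    writer.write(frame)
    frame_idx += 1

cap.release()
writer.release()
```

The point is how little has to change: anyone who can push content to the billboard's player could insert a window like this into an otherwise innocuous ad.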

In the research, the team tested its theory by embedding a stop sign image into an ad, which caused the Tesla to stop automatically in the middle of the road. In another test, a speed limit sign was flashed onto a billboard, and the Tesla adjusted to it as well.

What's worrying is that "The driver won't even notice at all. So somebody's car will just react, and they won't understand why," Yisroel Mirsky, a researcher at Ben Gurion University and Georgia Tech who worked on the study, explained to Wired.

All it takes, according to the research, is for a phantom image to appear for 0.42 seconds for Tesla's Autopilot to act on it. The Mobileye device needed even less time: just an eighth of a second.
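Those thresholds amount to only a handful of video frames at common refresh rates, short enough that, as Mirsky notes, a passing driver is unlikely to register the flash at all. A quick back-of-the-envelope calculation (the frame rates are assumed; the thresholds are the ones reported above):

```python
import math

# Recognition thresholds reported in the study
TESLA_THRESHOLD_S = 0.42      # Tesla Model X Autopilot
MOBILEYE_THRESHOLD_S = 0.125  # Mobileye 630, an eighth of a second

for fps in (24, 30, 60):
    tesla = math.ceil(TESLA_THRESHOLD_S * fps)
    mobileye = math.ceil(MOBILEYE_THRESHOLD_S * fps)
    print(f"{fps} fps: ~{tesla} frames for Tesla, ~{mobileye} for Mobileye")
```

At 30 fps, for example, the phantom needs to survive for only about 13 frames to trigger Autopilot, and about 4 frames for the Mobileye 630.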

Hacking billboards without leaving a trace

What's truly worrying about this research, the study notes, is that attackers could compromise these billboards and alter the displayed signs without leaving any traces behind.

When asked to comment on the research, Tesla gave its standard response: Autopilot is not designed to drive the car entirely autonomously, per Wired. The Ben Gurion research team isn't convinced; in an email reply to Tesla, they stated, "As we know, people use this feature as an autopilot and do not keep 100 percent attention on the road while using it."

There is clear evidence that some Tesla owners are using Autopilot as an autonomous driving system, and some of the results have been disastrous. Just take a look at this Tesla that crashed into a police car while the owner was watching a movie, this drunk 'driver' who decided to sit in the passenger seat so he could keep drinking while his Tesla's Autopilot took charge, or this driver who fell asleep as his Tesla drove at 93 mph (150 km/h).