How Do Self-Driving Cars Work?

Why aren't fully autonomous cars here yet, and how do they even work?

Where are all the self-driving cars? That's what you're probably asking yourself after many large tech and automotive companies forecast that by 2020, fully autonomous technology would be rolled out across many automobile fleets.

While that "deadline" looks like it won't be met, self-driving and autonomous technologies have made significant strides in the last several years. Just recently, an autonomous semi-truck completed a trip across the U.S. without incident.

Tesla's Autopilot system has by far been the highlight of self-driving tech, and it has been in the spotlight since the beginning. Tesla has the first-mover advantage, having reinvented how an automobile company is structured and functions. In the last year, Tesla's Autopilot system has clocked over 2 billion miles of use.

That's a significant number of miles, with very few accidents compared to human drivers.

With the technology still advancing, perhaps still in its infancy, what is self-driving technology, and how do cars equipped with it work?

What are self-driving cars?

The terms self-driving and autonomous are used fairly interchangeably, and for cars they essentially are. Strictly speaking, autonomous is the more general term, whereas self-driving relates only to vehicles. In the case of cars, though, those technicalities don't matter.

Self-driving cars rely on hardware and software to drive down the road without user input. The hardware collects the data; the software organizes and compiles it. On the software side, the input data is normally processed through machine learning models: complex algorithms that have been trained on real-world driving scenarios. It's this machine learning technology that sits at the center of self-driving technology.

As more and more data is processed through autonomous self-driving algorithms, they only get better and better—smarter and smarter. Machine learning algorithms can essentially teach themselves how to function, assuming they've been given the right constraints and goals.
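To make that idea concrete, here is a toy sketch of "learning from examples": a tiny perceptron, written in plain Python with made-up numbers, that learns a brake/don't-brake rule from labeled distance-and-speed scenarios. Real self-driving stacks use deep neural networks over camera, radar, and LiDAR data; this only illustrates the principle that labeled data, not hand-written rules, shapes the model's behavior.

```python
# Toy "brake / don't brake" classifier: a single perceptron trained on
# made-up (distance_m, closing_speed_mps) examples. Label 1 = brake.
def train(examples, epochs=50, lr=0.1):
    w = [0.0, 0.0]   # one weight per feature
    b = 0.0          # bias term
    for _ in range(epochs):
        for (dist, speed), label in examples:
            pred = 1 if w[0] * dist + w[1] * speed + b > 0 else 0
            err = label - pred            # -1, 0, or +1
            w[0] += lr * err * dist       # nudge weights toward the label
            w[1] += lr * err * speed
            b += lr * err
    return w, b

def predict(model, dist, speed):
    w, b = model
    return 1 if w[0] * dist + w[1] * speed + b > 0 else 0

# Hand-labeled scenarios: brake when an obstacle is close and closing fast.
data = [((50.0, 2.0), 0), ((40.0, 1.0), 0), ((8.0, 10.0), 1),
        ((5.0, 12.0), 1), ((30.0, 3.0), 0), ((6.0, 9.0), 1)]
model = train(data)
```

Feed it more labeled scenarios and the learned decision boundary improves; the algorithm was never told an explicit braking-distance rule.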

Autonomous vehicle levels

When we think of autonomous or self-driving vehicles, we probably picture a car or semi that can drive itself completely without a human. While that is autonomous, it doesn't tell the whole story. That "fully autonomous" scenario represents a Level 5 autonomous vehicle; Levels 0 through 5 represent the full spectrum of driving, from Level 0, fully human, to Level 5, fully computer.

Take a look at the helpful infographic below to visualize these six different levels of automation.

[Infographic: the six levels of vehicle automation. Source: The Simple Dollar]

To lay each level out in more concrete terms, we've listed them all below.


Level 0: The driver completely controls the vehicle at all times.

Level 1: Individual vehicle controls are automated, such as electronic stability control or automatic braking.

Level 2: At least two controls can be automated in unison, such as adaptive cruise control in combination with lane-keeping.

Level 3: Conditional automation. The driver can fully cede control of all safety-critical functions in certain conditions. The car senses when conditions require the driver to retake control and provides a "sufficiently comfortable transition time" for the driver to do so.

Level 4: The vehicle performs all safety-critical functions for the entire trip within certain environments, with the driver not expected to control the vehicle at any time.

Level 5: The vehicle includes humans only as passengers; no human interaction is needed or possible.
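For illustration, the six levels can be captured in a small lookup table. The helper below is hypothetical, and the wording is this article's paraphrase rather than official SAE language; it just makes the practical dividing line explicit: at Levels 0 through 3, a human must still be available to drive.

```python
# Automation levels as summarized in this article (paraphrased,
# not official SAE J3016 wording).
LEVELS = {
    0: "Driver controls everything at all times",
    1: "One vehicle control automated (e.g. automatic braking)",
    2: "Two or more controls automated in unison",
    3: "Car drives itself in some conditions; driver must retake control on request",
    4: "Car handles the whole trip; driver not expected to intervene",
    5: "Humans are passengers only",
}

def human_fallback_required(level):
    """True when a human must be available to take over (Levels 0-3)."""
    if level not in LEVELS:
        raise ValueError(f"unknown automation level: {level}")
    return level <= 3
```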



What technologies are inside self-driving cars?

Self-driving cars include a significant amount of technology. The hardware inside these cars has stayed fairly consistent, but the software behind them is constantly changing and being updated. Looking at some of the primary technologies, we have:

Cameras

Elon Musk has famously claimed that cameras are the only sensor technology self-driving cars need; we just need algorithms that can fully comprehend the images they receive. Camera images capture everything needed for a car to drive; it's just that we're still developing new ways for computers to process the visual data and translate it into actionable 3D information.

Teslas have 8 external-facing cameras to help them understand the world around them.

Radar 

Radar is one of the primary means self-driving cars use to "see," along with cameras and LiDAR. Radar is the lowest resolution of the three, but it can see through adverse weather conditions, unlike LiDAR, which is light-based. Radar, on the other hand, is radio wave-based, meaning it can propagate through things like rain or snow.
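A quick back-of-the-envelope calculation shows why. A common automotive radar band sits around 77 GHz (used here as an illustrative assumption, since the article doesn't name a frequency); its roughly 4 mm wavelength is thousands of times longer than LiDAR's near-infrared light (on the order of 905 nm), so rain and snow scatter it far less.

```python
C = 299_792_458.0  # speed of light, m/s

def wavelength_m(freq_hz):
    """Wavelength of an electromagnetic wave at the given frequency."""
    return C / freq_hz

# 77 GHz automotive radar -> ~3.9 mm wavelength, vs. ~905 nm for a
# typical LiDAR laser: a difference of more than three orders of
# magnitude, which is why radar tolerates weather that blinds LiDAR.
radar_wavelength = wavelength_m(77e9)
```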

Advertisement

LiDAR

LiDAR sensors are the units you'll see spinning on top of self-driving cars. These sensors shoot out pulses of light and use the reflections to generate a highly detailed 3D map of the surrounding area.

LiDAR is very high resolution compared to radar, but as we mentioned above, it has limitations in low-visibility weather because it is light-based.
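The geometry behind that 3D map is straightforward: each laser pulse's round-trip time gives a range, and the sensor's pointing angles turn that range into an (x, y, z) point. Here is a minimal sketch, assuming a simple spherical-to-Cartesian model (real sensors apply per-beam calibration on top of this).

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lidar_point(time_of_flight_s, azimuth_deg, elevation_deg):
    """Convert one LiDAR return into an (x, y, z) point in meters.

    The pulse travels out and back, so half the round-trip time
    gives the one-way distance to the surface it bounced off.
    """
    r = C * time_of_flight_s / 2.0
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)
```

A spinning unit sweeps the azimuth through 360 degrees many times per second, producing hundreds of thousands of such points: the "point cloud" the car's software reasons over.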

Other sensors

Self-driving cars will also utilize traditional GPS tracking, along with ultrasonic and inertial sensors, to gain a full picture of what the car is doing as well as what's occurring around it. In the realm of machine learning and self-driving technology, the more data collected, the better.

Computer Power

All self-driving cars, and essentially all modern cars, require an on-board computer to process everything happening with the vehicle in real-time.


Self-driving cars require extreme processing power, so rather than relying solely on traditional CPUs, they utilize graphics processing units, or GPUs, to do their calculations. However, even the best GPUs have started to prove insufficient for the extreme data-processing needs of self-driving vehicles, so Tesla has introduced a neural network accelerator chip, or NNA. These NNAs are capable of handling image processing in real-time.

For perspective, here is roughly how many giga operations per second, or GOPS, a representative CPU, GPU, and NNA can handle:

  • CPU: 1.5
  • GPU: 17
  • NNA: 2100

NNAs are the clear winner, by a wide margin.
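Plugging in the numbers above shows just how lopsided the comparison is. A quick back-of-the-envelope calculation:

```python
# GOPS figures quoted above for a representative CPU, GPU, and NNA.
gops = {"CPU": 1.5, "GPU": 17.0, "NNA": 2100.0}

# Speedup of each chip relative to the CPU baseline.
speedup = {name: value / gops["CPU"] for name, value in gops.items()}
# The NNA works out to roughly 1,400x the CPU and over 120x the GPU
# at this class of operation.
```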

The future of autonomous and self-driving vehicles

Roughly 93% of all car accidents are due to human error. While much of society is resistant to the idea of self-driving cars, the data so far suggests they can already be safer than human drivers in many conditions. Self-driving vehicles, when fully tested and built out, have the potential to revolutionize our travel infrastructure.


It will still be some time before we see Level 5 autonomy implemented in cars on the road, but for now, Level 2 is becoming commonplace in modern automobiles. The next levels will be upon us soon.

If you want to see some of what we discussed in this article, and more, in visual, animated infographic form, take a look at the infographic from The Simple Dollar below.

[Animated infographic about self-driving cars. Source: The Simple Dollar]
