It’s really hard to blindly trust a computer to get you from point A to point B. It’s hard enough in an Uber with a human driver. We all want to stay safe and do what’s best for us. This has led to a lot of conversation around Tesla’s Autopilot program.
In this review, we’ll look at some data and statistics. I’ll explain everything about Tesla’s Autopilot before coming to an ultimate verdict as we explore just how safe it is. If you or a friend own a Tesla or are considering it, this is a must-read.
What is Autopilot?
Autopilot is one of the many features offered when you buy a new Tesla. If you’re not familiar, Tesla is leading the pack when it comes to electric vehicles (EVs), and its CEO Elon Musk is pushing hard for self-driving cars.
I understand that there are pros and cons of self-driving cars, but it’s hard to overlook how impactful this would be to the world. As of right now, Autopilot is the closest thing you’ll find on a commercially available vehicle.
Also, for reference, I’ll be capitalizing “Autopilot” throughout this article. It’s the brand name that Tesla has developed specifically for their vehicles.
Autopilot comprises a number of sensors, mechanical components, computers, and algorithms working in real time. They’re all operating within your Tesla as you drive down the freeway.
The only difference is that you aren’t driving. The car is. With Autopilot enabled, you can just sit there and look pretty while the computers do all the work.
Tesla still requires a driver to sit in the driver’s seat and interact with the car while it’s moving. This means you can’t kick your feet up in the back seat and take a nap during your commute… yet.
That means that the cars are semi-autonomous. There are plenty of self-driving functions, but the car isn’t wholly self-driving. It needs some human interaction from time to time. But, then again, don’t we all?
How Does Tesla’s Autopilot Work?
If you look at Tesla’s page about Autopilot, you’ll realize how many moving parts there are. The simplest explanation is that this program aims to drive the car on your behalf.
The software itself is quite complex. Let’s break it down and look at some of the components on their own.
The Algorithm
The algorithm is the brain of the operation. It’s the bread and butter of how Autopilot works.
An algorithm is a set of instructions written in computer code. Autopilot’s algorithm exists to make real-time decisions based on the information it receives.
The computer receives a bunch of inputs as the car drives. It thinks about what to do, then sends outputs to perform the actions. All of this can happen faster than you can blink.
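To make that input-think-output cycle concrete, here’s a toy sketch of a sense-decide-act loop. This is purely illustrative, with made-up thresholds and field names; it is in no way Tesla’s actual code, just the general shape of the idea.

```python
# Illustrative sketch only -- NOT Tesla's actual code. A hypothetical
# sense-decide-act cycle: read inputs, decide, emit output actions.

def decide(sensor_inputs):
    """Turn raw readings into driving actions (toy logic, made-up thresholds)."""
    actions = []
    if sensor_inputs["obstacle_distance_m"] < 10:
        actions.append("brake")
    if abs(sensor_inputs["lane_offset_m"]) > 0.3:
        actions.append("steer_toward_center")
    if not actions:
        actions.append("maintain_speed")
    return actions

# One tick of the loop: a close obstacle triggers braking.
inputs = {"obstacle_distance_m": 8.0, "lane_offset_m": 0.1}
print(decide(inputs))  # ['brake']
```

A real system runs a far richer version of this loop many times per second, which is what makes the "faster than you can blink" claim plausible.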
The whole system is only as good as the algorithm. Since self-driving cars are uncharted territory, it takes a lot of practice, tests, and trials to refine the code.
So how does a complicated algorithm keep getting smarter? It uses cutting-edge AI learning, neural networking, and some incredible hardware. All of it is way too complicated for us to understand, so I’ll leave it to the geniuses at Tesla to explain it.
The program keeps learning as you continue to drive. The longer you operate your vehicle in Autopilot, the smarter it becomes.
Navigation
Unless you want to wind up at a random place every time, you need a strong navigation system. Tesla’s fleet comes equipped with some pretty powerful navigation that can map your route and perform the necessary turns.
Steering
Steering is where the mechanical world meets the electrical world. As the algorithm decides what to do, it automatically turns the steering wheel for you.
Since the wheel isn’t operated by a human, there’s no room for human error. The car knows exactly how much to turn the wheel to get the perfect radius for whatever maneuver it wants to make.
Sensors, Radars, and Cameras — Oh, My!
A standard Model 3 comes with 8 cameras, 1 radar, and 12 ultrasonic sensors. That’s a ton of tech.
These are the little guys that do all the surveillance. They’re scanning the environment around the car to ensure the computer inside has enough information to make decisions.
It’s very important that all of the sensors are working correctly. If one goes out, you’re left with a blind spot that can lead to an accident.
These devices are constantly on the lookout. They’re detecting people, other motorists, animals, stop signs, speed limit signs, road lines, and plenty more. They have to quickly compile the data and send it over to the onboard computer so the algorithm can process it.
The sensors map a radius of about 15 feet around the car. If something goes wrong within that radius, the car quickly corrects its placement to avoid the problem.
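The geometry of that check is simple enough to sketch. Assuming the ~15-foot figure above, a hypothetical filter over detected objects might look like this (the labels and coordinates are invented for illustration):

```python
# Hypothetical sketch: flag any detected object that falls inside the
# car's ~15-foot monitored radius. Not Tesla's implementation.
import math

MONITORED_RADIUS_FT = 15.0

def objects_in_range(detections):
    """detections: list of (label, x_ft, y_ft) positions relative to the car."""
    return [label for label, x, y in detections
            if math.hypot(x, y) <= MONITORED_RADIUS_FT]

scene = [("pedestrian", 5, 3), ("cyclist", 20, 10), ("stop_sign", 9, 11)]
print(objects_in_range(scene))  # ['pedestrian', 'stop_sign']
```

Anything inside the radius gets passed to the decision-making algorithm; anything outside is tracked but less urgent.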
The Pedals
It might seem obvious, but the gas and brake pedals are also fully automated. A Tesla can finely adjust the acceleration or braking power to a level that no human could dream of. The inputs are more precise, more efficient, and much faster. Move over, Lewis Hamilton!
Traffic-Aware Cruise Control
What Tesla calls its “traffic-aware cruise control” is the perfect example of how all these components play together in an easy-to-understand package.
It works like your car’s cruise control but on steroids. You still operate the steering wheel, but the sensors will match your speed to the car in front of you.
You can even have the car match the flow of traffic so you don’t have to worry about looking at your speedometer every minute. It falls under the Tesla Autopilot umbrella.
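The core rule is easy to illustrate: never exceed the driver’s set speed, match the lead car otherwise, and back off if the gap gets too small. This is a simplified sketch with assumed numbers, not Tesla’s control logic:

```python
# Simplified sketch of traffic-aware cruise control -- assumed gap and
# back-off values, not Tesla's actual controller.

def tacc_target_speed(set_speed, lead_speed, gap_m, min_gap_m=30.0):
    """Return a target speed: capped at set_speed, matched to the lead car."""
    if gap_m < min_gap_m:
        # Too close: ease off below the lead car's speed to open the gap.
        return min(set_speed, lead_speed * 0.9)
    return min(set_speed, lead_speed)

print(tacc_target_speed(set_speed=70, lead_speed=60, gap_m=50))  # 60
print(tacc_target_speed(set_speed=70, lead_speed=60, gap_m=20))  # 54.0
```

A real controller would smooth these transitions rather than jumping between targets, but the priority ordering (set speed, lead car, following gap) is the essence of the feature.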
Summon and Parking
Among the fully self-driving capabilities are the summoning and parking features. The car will come to you when you summon it, and it will park itself if you request it.
Both of these functions can be done with no interaction whatsoever from a human. You don’t even need to be in the car for it to happen.
What if you’re on the other side of the parking lot and want your car to come? It will navigate through the obstacles and make its way to you safely. Talk about door-to-door service.
How Safe is Tesla’s Autopilot?
Given all the information I just covered, how safe is the Autopilot system? So far, according to Tesla’s own data, it’s a lot safer than driving without it.
I’ll talk numbers in just a second, but just on a theoretical level, it makes perfect sense. A robot will do a better job than a human when it comes to precision, reaction time, logical thinking, and obeying rules. All of these features are pillars when it comes to driving.
A sufficiently good algorithm would beat a human driver every day of the week. It’s like comparing the use of a calculator to doing mental math.
Removing the Human Element
When you remove the human element from driving, you take away the emotion and human error that come with it.
This might seem like a dark thought but think about it for a second. Have you ever been cut off in traffic and gotten immediately angry? Maybe you started tailgating the dumb driver, going a little too fast, or taking some risks you shouldn’t have.
If a computer gets cut off, it has no emotional reaction at all. If you slap your computer or phone right now, it’s not going to get mad and find a way to get back at you.
Additionally, human error is the leading cause of accidents in the world. Distracted driving, hitting the wrong pedal by accident, failing to see someone: all of these mistakes come down to our being human.
One counter-argument would be that humans are still superior in driving decision-making. However, in this day and age where many people are glued to their smartphones, I see far too many drivers distracted by phones.
Whether it’s texting, TikTok, Instagram, Snapchat, FaceTime, or even watching movies, you’ve probably seen it before. The light turns green, and the driver in front needs a honk to get moving because they’re glued to their phone. In my opinion, it will only get worse.
If you instead let Autopilot do the thinking and driving, it won’t make mistakes the way you or I would. Plenty of accidents happen because people hit the gas pedal instead of the brake by mistake.
Processing Speed
The other thing that makes Autopilot safer on paper is how quickly the car can process information. The car will scan the road, find an obstruction to its path, and correct the course quicker than a human can.
A computer’s processing speed is orders of magnitude faster than ours, especially if the driver is busy on their phone, playing with the radio, or deep in thought.
When it comes to safe driving, reaction time is essential. Autopilot will always win this race.
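A quick back-of-the-envelope calculation shows why reaction time matters so much. The numbers here are assumptions for illustration (a commonly cited ~1.5 s human reaction time versus a hypothetical ~0.05 s for a computer), not measured Tesla figures:

```python
# Back-of-the-envelope illustration with assumed reaction times --
# not measured Tesla data. Distance covered before braking even begins.

def reaction_distance_m(speed_kmh, reaction_s):
    """Distance traveled during the reaction delay, in meters."""
    return speed_kmh / 3.6 * reaction_s  # km/h -> m/s, then multiply by time

human = reaction_distance_m(100, 1.5)      # ~41.7 m before braking starts
computer = reaction_distance_m(100, 0.05)  # ~1.4 m
print(round(human, 1), round(computer, 1))
```

At highway speed, the human travels roughly 40 meters before even touching the brake; a fast computer covers barely a car length. That gap is the whole argument in one number.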
Safety Data from Tesla
Tesla is all about data. They’ve been monitoring as people use Autopilot, and they have had some interesting results so far.
How Often Do the Cars Crash?
Through the first three months of 2021, Tesla reported that there was an average of one accident per 4.2 million miles spent on Autopilot.
Without Autopilot on, there was an accident every 2 million miles driven.
This second piece of data is interesting. The driver still has all the preconfigured safety features engaged; they’re just driving manually instead of using Autopilot. Engaging the semi-autonomous mode more than doubles the miles driven between accidents, roughly halving the crash rate.
What happens when drivers disengage all of the safety features? They average a crash every 978,000 miles.
Again, this value is about half of the previous value. It shows that the safety features really work, and Autopilot is statistically safer to use.
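Putting the three figures quoted above on a common scale makes the comparison easier. This snippet just recomputes the rates from the Q1 2021 numbers cited in this article:

```python
# Recomputing accident rates from the Q1 2021 figures quoted above.
MILES_PER_ACCIDENT = {
    "Autopilot engaged": 4_200_000,
    "Safety features only": 2_000_000,
    "No safety features": 978_000,
}

for mode, miles in MILES_PER_ACCIDENT.items():
    rate = 1_000_000 / miles  # accidents per million miles driven
    print(f"{mode}: {rate:.2f} accidents per million miles")
```

That works out to roughly 0.24 accidents per million miles with Autopilot, 0.50 with safety features alone, and 1.02 with everything off: each step down in automation roughly doubles the crash rate.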
Previous Crash History
Since Autopilot is based on such a complex algorithm, it makes sense that there was a learning curve. In 2020, Tesla reported an accident every 3.45 million miles with Autopilot engaged.
The same category in 2019 showed an accident every 2.87 million miles.
Are you seeing the same trend that I am? Every year, Tesla’s Autopilot becomes noticeably safer. This is all thanks to the complexity of Autopilot and the importance of test data. Who knows how much safer these cars will be in the next 2 years?
Are There Safety Concerns with Autopilot?
I’d be remiss if I’d left out some of the safety concerns. To give you a better picture of Autopilot, let’s discuss them now.
Illegal Passes
There are a lot of reports that Autopilot tries to send the driver into an illegal pass. This could mean passing without enough space, in an area where passing is illegal, or through a “stay in your lane” section of construction.
This was an especially big issue in 2019, but there have been far fewer reports of it recently. The good news is that you can quickly correct your car if it tries to initiate an illegal pass. This is why it’s so important to stay attentive, even in Autopilot.
Other Drivers
The biggest safety concern on the road is other drivers. Autopilot will do its best to predict what other drivers will do, but there are some actions it just can’t prepare you for.
A driver in another car stomping on the gas pedal as you try to merge in front of them is a perfect example of this idea. Equally, a driver trying to cross the road in front of you without ample space could result in your car hitting theirs.
Yet again, this is a category that improves over time. As Autopilot gathers more data, your car will be more prepared to deal with the antics of other drivers.
Misusing Autopilot
Some people decide that they want to misuse Autopilot. This will clearly pose a huge risk to the driver’s safety.
For instance, there was an incident in 2018 where a driver was playing a game on their phone and their car crashed, killing them. Tesla is very intentional with its message: Autopilot is not fully self-driving yet. It still needs the driver to be attentive.
Lots of Beta Programs
As you read the sheet of features on your newest Tesla, you’ll see that a lot are in the Beta stage. This means that the program is still being debugged and perfected.
As you use it, data from your trip is sent to the mothership so the program can get better.
Beta typically means it’s just a matter of time before the feature is perfected and usable. Heck, the self-parking feature was in Beta not too long ago — look at it now.
Tesla drivers just need to respect the “Beta” tag. Take it with a grain of salt and make sure you’re ready to take over and correct if anything goes wrong.
State-Level Laws
Finally, there are some safety concerns on a state level. There are plenty of state-wide laws that don’t exist in neighboring states.
This makes it tough for a Tesla. Without a massive legal database, how is the car supposed to know that this action is illegal in Montana, but legal in Idaho?
For instance, on Delaware’s highways, you can drive in any lane you wish. In New Jersey, the law is that the leftmost lane is strictly for passing slower cars.
If a Tesla takes the left lane in NJ without the intention of passing another car, it’s perfectly okay for a cop to pull them over and give them a ticket.
Additionally, some laws state that it’s illegal to pass a car on the right. There is a whole slew of state-level laws that vary as you cross an invisible border.
The Verdict: Is Tesla’s Autopilot Safe?
After looking over everything presented, doing some extra digging, and really thinking about it, let’s reach a final verdict. Is Tesla’s Autopilot program safe for the “everyday driver”? Yes, it absolutely is.
Statistically, it’s several times safer than driving a standard Toyota Camry down the road, and it’s miles ahead of the rest of the competition. Tesla’s team is doing an amazing job of putting together a robust semi-autonomous driving package.
I’d have no fear of jumping into a Tesla and hitting Autopilot every day of the week. There is strong evidence that it’s a very safe and viable option. Looking at the fatal accidents that do happen, a lot of them appear to come down to human error, whether by the driver or by other motorists.
Conclusion
In conclusion, I think Tesla’s Autopilot program is safe, and it will only get better as more Autopilot-equipped vehicles hit the road. If you agree, or if you have some counterarguments, I’d love to hear from you. Leave a comment, share with your friends, or send us a message to continue the conversation.
As always, you can see more news, guides, and information on the rest of this blog. Make sure you grab the best accessories and tools for the everyday car owner.
I am a computer-design engineer and have written much software that does pattern recognition.
I have driven a Tesla autopilot in the controlled environment of an interstate highway spur, I-540, near the Raleigh-Durham airport. It worked very well. However, the environment was modest traffic on a weekday afternoon, all cars going in the same direction at approximately the same speed. There were no intersections, traffic lights, pedestrians, bicyclists, stop signs, crosswalks, etc., i.e., none of the normal road features that introduce chaos.
I am a daily bicyclist and pedestrian in Chapel Hill, NC. Here is a situation that occurs very frequently: I must cross in front of a stopped car that is waiting for a break in the traffic to its left so the driver can make a right turn. I must be 100 percent certain that the driver knows I am there before I walk or ride in front of the car. I wave, the driver waves back, and I go.
How will any autopilot software handle this? Put a signboard on the front of the car that reads: Mr. Bicyclist or Mr. Pedestrian, I see you there, and I will not move until after you cross in front of me. Nothing I have read or seen in videos indicates that Tesla is solving this problem.
Your reactions please.
That’s a valid critique. At the moment, Tesla’s Autopilot relies entirely on cameras and ultrasonic sensors, so the vehicle must visually recognize that someone is there. With Lidar not in any Tesla yet, and possibly never, this will certainly be a challenge for Tesla to tackle. You’ve probably already heard of Tesla’s Smart Summon mode; it wouldn’t work at all if the vehicle couldn’t perceive pedestrians walking or on a bike. It’s not entirely there yet, we can all agree on that, especially at Level 2 autonomy. However, I think it’s a problem that can be solved, especially with more automation coming in the future.
I find that I sometimes turn off Autopilot on a Model Y and drop into cruise control. The car somewhat appears to still be in Autopilot but no longer steers through curves. I wonder how many of these Tesla incidents are the result of this. I wish there were a better indicator on the control screen than the blue lines.
That’s an interesting observation. I know there are settings that can be adjusted in ACC (adaptive cruise control) mode. Another indicator to let the driver know they’re in ACC mode versus Autopilot, or at least a more visible cue on the info display, would certainly be helpful. Perhaps a future software update will make the distinction clearer.