In Part III of the Bloomberg Survey, 90% of Model 3 drivers say Tesla’s software for automated driving makes them safer—even if it does occasionally screw up.
It seemed, for a terrifying moment, that Tesla’s automated-driving software had made an error at highway speeds. A driver from Florida reported an experience of inexplicable braking by the Autopilot feature on his Model 3. An instant later, the vehicle ahead swerved out of the lane to reveal a stopped car. Tesla’s sensors had detected the upcoming hazard and acted without human input to avert a crash.
A Model 3 driver from Alabama had a very different experience. He was cruising along the highway with a state trooper following directly behind. There was nothing obstructing the road ahead, but finicky Autopilot sensors triggered the brakes. Only human reflexes prevented the trooper from rear-ending him: the driver jammed his foot on the accelerator to override Autopilot.
Those are just two examples out of more than 1,600 close calls involving Autopilot software shared in the latest installment of Bloomberg’s customer survey. We asked 5,000 Model 3 owners about their experience with the electric sedan that Tesla Chief Executive Officer Elon Musk says will lead the world into a new era of driverless transportation.
These Autopilot stories illustrate the messy middle ground in which the automotive world now finds itself. Ever-vigilant vehicles running automated-driving technology can perform superhuman maneuvers to keep drivers safe—and can also fail in decidedly sub-human ways. Close supervision is needed at all times, which is easy to forget when Autopilot is able to drive for long stretches without intervention.
Six drivers claimed that Autopilot actually contributed to a collision, while nine people in the Bloomberg survey went so far as to credit the system with saving their lives. Hundreds of owners recalled dangerous behaviors, such as phantom braking, veering or failing to stop for a road hazard. But even those who reported shortcomings gave Autopilot high overall ratings.
More than 90% of owners said driving with Autopilot makes them safer—including most of the respondents who simultaneously faulted the software for creating dangerous situations.
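That overlap — owners who report a dangerous incident yet still rate the system as making them safer — is the survey's most striking result. As a minimal sketch (not Bloomberg's actual methodology, and with invented data), tallying both figures from raw responses could look like this:

```python
# Illustrative sketch only: hypothetical survey rows, not Bloomberg's data.
# Each row records (said_safer, reported_dangerous_incident).
responses = [
    (True, False),
    (True, True),   # faults the software yet still says it makes them safer
    (True, False),
    (False, True),
    (True, True),
]

# Overall share who say Autopilot makes them safer.
said_safer = sum(1 for safer, _ in responses if safer)
share_safer = said_safer / len(responses)

# Among those who reported a dangerous incident, the share still saying "safer".
incident_rows = [safer for safer, incident in responses if incident]
share_safer_despite_incident = sum(incident_rows) / len(incident_rows)

print(f"{share_safer:.0%} say Autopilot makes them safer")
print(f"{share_safer_despite_incident:.0%} of those reporting incidents still say safer")
```

With the toy data above, a majority says "safer" even within the subgroup that reported an incident — the same pattern the survey found at scale.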
Tesla currently offers two packages of Autopilot features. The basic version comes standard and includes automatic in-lane steering and advanced cruise control. For a $7,000 upgrade to what Tesla calls “Full Self-Driving,” a Model 3 owner gets frequent software updates that Tesla promises will eventually allow the car to drive itself. The most it can do today is “Navigate on Autopilot,” a feature that can take over most aspects of driving on highways: changing lanes, passing slower cars, moving from one highway to another, and steering onto the exit ramp.
Autopilot is not the only approach to self-driving. Waymo, a subsidiary of Google’s parent company, Alphabet Inc., believes that Tesla’s gradual transition to self-driving features could leave drivers dangerously complacent in their oversight of the system. For more than a decade, Waymo has been taking an all-or-nothing approach: either cars drive themselves or humans do, with nothing in between.
Autopilot couldn’t be more different. Tesla’s neural-net computers require training with massive amounts of real-world data, which Tesla gathers from its customers. All new Tesla cars come equipped with some version of Autopilot, and drivers collectively have logged some 2 billion miles under its direction.
When Tesla adds a new element to Autopilot, it starts by deploying to a small group of “early access” customers. Data recorded from their cars is then used to refine the feature, access is expanded, and more refinements follow. New features that seem crude and unreliable at first can be transformed within weeks.
Autopilot mistakes don’t look like human mistakes. When a Tesla is in the news after crashing into a parked car, for example, it’s easy to assume that the system is broken. But for every story like that, there are others in which Autopilot bounces radar signals beneath or around cars to sense danger, or swerves to avoid a truck merging into a blind spot.
Tesla owners in Bloomberg’s survey for the most part side with Musk’s approach, even when Autopilot occasionally screws up. We asked owners to describe times when Autopilot either put them in a dangerous situation or averted danger they might not have escaped on their own.
“In the middle of the Autobahn while driving at 130 km/h the autopilot suddenly decided the top speed was 50 km/h.”
Driver from Europe who rates Autopilot's overall safety ★★★★★
“A deer jumped in front of me on a dark road at night. By the time my foot moved to the brake pedal, it was already pressed to the floor!”
Driver from Colorado who rates Autopilot's overall safety ★★★★☆
“Whiteout conditions. Lake effect snow in Cleveland. Streets were extremely icy. A crossing car ran a red light going 45 mph at a blind intersection obscured by trees and the Model 3 automatically stopped before I could react. I missed a driver side collision, potentially fatal, by less than a car length.”
Driver from Ohio who rates Autopilot's overall safety ★★★★★
“During one of its automatic lane changes into a lane behind a semi, it SLAMMED on my brakes for no reason. The cars behind me managed to avoid rear ending me.”
Driver from California who rates Autopilot's overall safety ★★★★★
“It seemed to make risky choices whenever an unusual situation arises, like a missing lane line or a truck merging suddenly into your lane.”
Driver from California who rates Autopilot's overall safety ★☆☆☆☆
“Car entered my lane and I did not notice. Autopilot took over and alerted me quickly. It maneuvered out of the way and saved us from a wreck going 80 mph.”
Driver from Missouri who rates Autopilot's overall safety ★★★★★
“Didn’t recognize lanes properly on a two way road and put me into oncoming traffic.”
Driver from California who rates Autopilot's overall safety ★★★★☆
“I wasn’t paying attention… Someone pulled in front of me and the car ran right into them.”
Driver from Pennsylvania who rates Autopilot's overall safety ★★★☆☆
This summer Tesla released “Smart Summon,” one of Autopilot’s most ambitious features yet. Using the Tesla app on a smartphone, drivers outside a vehicle can “summon” their cars to their location through a busy parking lot. With no driver behind the wheel, the car chooses its route and navigates around other vehicles and pedestrians. The owner can stop the car at any time by lifting their finger from the phone.
As soon as the feature came out, videos of empty Teslas awkwardly traversing the parking lots of America’s Walmarts and Costcos flooded social media. A week later, Bloomberg sent out a follow-up survey to 1,732 Model 3 owners with the Full Self-Driving option to get some initial impressions. One owner described Smart Summon as driving like a “nervous teenager with a learner’s permit.”
Smart Summon is the clearest expression yet of Tesla’s approach to autonomy. Parking lots are notoriously difficult for computers. There are no hard rules. Cars regularly drive across road markings or along the wrong side of traffic. Obstacles abound: pedestrians, cars, shopping carts, strollers. Getting Smart Summon right requires a deep understanding of human behavior. The feature may also help Tesla figure out where people like to get dropped off and picked up, which will become critical data if the company tries to launch an automated taxi service.
Some safety advocates criticize Tesla for using its customers as guinea pigs. Musk says that real-world training is the quickest path—and possibly the only one—to ending the carnage of car crashes, which last year killed 40,000 Americans and 1.4 million people worldwide. For Smart Summon, at least, mistakes happen at parking-lot speeds and are less likely to result in serious harm.
In the first month after its release, drivers used Smart Summon more than a million times, Musk reported on an Oct. 24 earnings call. Tesla’s “massive fleet” of Autopilot-equipped cars “allows us to check these corner cases and learn from them,” Musk said, promising a significant software upgrade “in the coming weeks.”
Below are more than 800 owner comments gathered in the survey after the initial release of Smart Summon. The comments are sorted according to the owner’s evaluation of its usefulness and reliability.
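The ordering described above — highest-rated comments first — is a straightforward sort on the owner's rating. A minimal sketch with invented comments and ratings (not the survey's real data):

```python
# Hypothetical owner comments paired with a 1-5 usefulness/reliability rating.
comments = [
    ("Drove over a red curb on my first attempt.", 1),
    ("Met expectations in several parking lots.", 4),
    ("Good enough under driver supervision.", 5),
    ("Too slow for a busy lot right now.", 2),
]

# Sort highest-rated first, the presentation the article describes.
ranked = sorted(comments, key=lambda pair: pair[1], reverse=True)

for text, rating in ranked:
    print(rating, text)
```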
“While the software is definitely beta right now, it is good enough to be used under driver supervision. It is a definite precursor to full self driving and will get more reliable over time the same way Navigate on Autopilot did.”
“Good for bad weather situations (pouring rain) or hauling larger items. Used it at Home Depot to load 2x4 rather than lugging them all the way to the car in the back of the parking lot.”
“On my first attempt (in front of a bunch of work people) the car decided to drive over a red curb and into a median. The rim got some red curb rash & the rocker panel was damaged.”
“It’s cool and exciting to see, but it’s not something I’d try using in a busy parking lot at this point. It’s not fast enough, both speed and decision making.”
“My wife and I are different levels of mobility impaired. Summon was one reason we purchased Tesla. Having the car come get us in a slippery or icy parking lot reduces our chance for falling. We’ve tried it in several lots and while it could be more graceful, it has met expectations.”
“It’s better than many humans I know. Alas, that’s damning with faint praise.”
“The car detected a pile-up in fog and applied the brakes/alerted driver and began a lane change to avoid it before I took over. I believe it saved my life.”
Driver from California who rates Autopilot's overall safety ★★★★★