The software is inconsistent at best, according to interviews with owners of Teslas with “full self-driving,” as well as a review of more than 50 videos posted on social media by members of the public who have been using versions of it since it was rolled out to about 1,000 owners in early October. The videos are believed to be authentic because of the presence of details typical of “full self-driving” use, the complexity of manipulating such a video and the social media histories of the video creators, who are generally Tesla enthusiasts. Tesla did not dispute the authenticity of the videos.
“It drove like a 9-year-old who had only driven in [Grand Theft Auto] before, and got behind the wheel,” said John Bernal, who owns a Tesla Model 3, describing when he first got “full self-driving” early this year. “Now I feel like I’m driving with my grandma. Sometimes it might make a mistake, like, ‘no grandma, that’s a one-way, sorry.’”
Tesla did not respond to a request for comment and generally does not engage with the professional news media. It warns drivers that the technology “may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road.” Drivers are told to be prepared to act immediately, especially around blind corners, intersections and narrow situations.
Some Tesla drivers say the feature’s inconsistent behavior can be annoying, and even rude, to other drivers. Videos posted online show it’s common for cars in “full self-driving” to drive down the middle of unmarked residential streets, in no apparent rush to move over for oncoming traffic.
The cars also appear to befuddle other drivers in situations such as four-way stops, where they can be slow to take their turn.
Kim Paquette, one of the first non-Tesla employees to test “full self-driving” when it was rolled out to a select group a year ago, says she uses the feature for nearly all of her driving in her Tesla Model 3. She was frustrated when she recently had to drive a loaner car that didn’t have the technology she’s grown used to. Paquette said she can sometimes drive the 85 miles from her home to her job at Boston’s airport without having to intervene because the car made a mistake.
Paquette can type an address into the screen on her Model 3, or hit a button and use Tesla’s voice recognition to tell the car her destination. Then she pulls down twice on a stalk on the steering wheel to activate “full self-driving.” The car lets out a chime, a blue steering wheel logo lights up on her screen, and the car starts taking her where she wants to go.
In some ways a system like this can seem like magic. But that magic is still flawed in both minor and serious ways.
Paquette has been frustrated, for instance, with her car’s tendency to drive in the parking lane on one-way streets in Newport, Rhode Island, where she lives.
But in most cases the software is overly cautious around pedestrians, drivers say. Paquette recalled a recent drive in which she was cruising down a street as a person got out of a parked car. Paquette said her car stopped four car lengths behind the parked vehicle and exiting driver. To Paquette, it seemed clear the person exiting their car was going to walk to the adjacent sidewalk, rather than cross in front of her. The car could be cautious without leaving such a large gap, she felt.
She’s noticed that “full self-driving” struggles to sense social cues, including being waved through a four-way stop by another driver, or knowing what a pedestrian will do next. Paquette said she regularly takes manual control of the car to prevent it from making the wrong decision or irritating other drivers.
“If someone is standing on the corner, are they just standing on the corner or waiting to cross the street?” she said. “It’s a student driver for sure. It’s like teaching a 15-year-old.”
Tesla isn’t alone in struggling to get its cars to recognize social cues. Machines work best in predictable environments that lack complexity, and this has been a challenge for all autonomous vehicle developers.