Washington, DC (CNN)Tesla has released an early version of its “full self-driving” software to a small group of Tesla enthusiasts, who appear to be both delighted and alarmed by what they’ve experienced so far.
“Turn left. Come on. What are you doing?” said one frustrated Tesla owner as his car appeared slow to change lanes during a trip he posted on YouTube last week. “I swear I’m not drunk you guys, I’m not drunk, it’s my car.”
Tesla has long promoted the idea that its vehicles will someday be able to drive passengers from location to location without human intervention. While many auto manufacturers have invested heavily in automated features, Tesla has been the most bullish, pushing out features before other brands, such as lane-keeping on highways and traffic light detection. Now Tesla’s long-promised “full self-driving” functionality has arrived — sort of. The company has provided an unfinished version of the software to a group of beta testers, a term used in the software industry to refer to users who test incomplete software and often provide feedback for improvements.
The beta version of “full self-driving” was released on Oct. 20. Since then, at least six beta testers have posted footage on social media showcasing their early impressions of the software. CNN Business reviewed hours of footage and found those impressions to be a mixed bag. At times the testers are impressed with the “full self-driving” technology; in other cases they say it’s overly cautious. The videos also show unsafe situations that appear to result from the car not understanding traffic well enough.
Brandon McGowen, one of the beta testers, has posted videos online in which his Tesla nearly drives off the road or into a median. He’s not the only driver who claims to have experienced trouble while testing the software. Beta testers Zeb Hallock, James Locke, Rafael Santoni, Kim Paquette, and a YouTuber who goes by “Tesla Raj” have highlighted concerns. In videos reviewed by CNN Business, Teslas appear to blow through red lights, stop well short of intersections, miss turns, nearly rear-end a parked car, make a turn from the wrong lane, and speed over speed bumps.
Tesla CEO Elon Musk said in August that full self-driving’s improvement would be a “quantum leap” from Tesla’s current assisted driving options, and when commuting with an early version of the technology, said he was having almost no “interventions,” Tesla-speak for moments when a driver must take over for the automated driving software.
The automotive executive has repeatedly made bold predictions about the capabilities of Tesla’s self-driving technology. He said in 2016 that a Tesla would be able to drive from Los Angeles to New York by the end of 2017 “without the need for a single touch” of the wheel.
In 2016, Tesla said that its new cars all came with the hardware necessary for “full self-driving,” and that only software was needed to make “full self-driving” a reality. That software could be delivered to Teslas remotely, through over-the-air updates. At the same time, Tesla began selling customers a “full self-driving” option, initially for $3,000. The feature costs $10,000 today, following an additional $2,000 price increase this week. The software isn’t an add-on; it’s simply delivered later.
Musk said last year that the company would have self-driving robotaxis operating this year.
Tesla’s use of the term “full self-driving” has long been controversial, and criticized by autonomous vehicle experts. To most experts, full self-driving means a car in which a person could safely fall asleep behind the wheel. An attentive human driver isn’t needed.
However, the experiences of the beta testers who have so far chosen to post footage online suggest there’s much work to be done before attentive humans aren’t needed behind the wheel. It’s unclear how many beta testers there are.
Tesla’s technology can’t be expected to match human performance yet, as the artificial intelligence-powered system needs real-world experience to gather data, learn and improve. But putting a raw student driver on public roads also raises questions of whether Tesla is doing enough to mitigate safety risks.
Bryan Reimer, who leads MIT’s Advanced Vehicle Technology Consortium, which studies advanced driver assist systems like Tesla’s Autopilot, told CNN Business that humans aren’t equipped to oversee automation without support, and a camera-based driver monitoring system is needed at minimum to mitigate risks associated with Tesla’s full self-driving software.
“This is an experiment on the development of automation with participants that haven’t consented,” said Reimer, referring to pedestrians, cyclists and other drivers who may be at risk from sharing roads with inattentive drivers using Tesla’s technology. Reimer’s research has found that Tesla drivers are more distracted when using its driver assist technology.
Tesla did not respond to a request for comment from CNN Business. Tesla says on its website that “active safety features are designed to assist drivers, but cannot respond in every situation. It is your responsibility to stay alert, drive safely and be in control of your car at all times.”
The National Highway Traffic Safety Administration views Tesla’s latest technology as a driver assistance system, which requires a fully attentive human driver. A NHTSA spokeswoman said in a statement sent to CNN Business that the agency will not hesitate to take action to protect the public against unreasonable risks to safety.
CNN Business has not reviewed any video in which the Tesla beta test drivers describe receiving training from Tesla, and Tesla did not respond to a question about whether it provides training to these drivers. Musk has tweeted that the full self-driving beta is limited to “a small number of people who are expert & careful drivers.”
Tesla has warned current drivers to pay extra attention to the road, and keep their hands on the wheel.
“Do not become complacent,” Tesla warned the drivers in a message displayed when they installed the software, which CNN viewed in multiple videos posted by people testing the software. “It may do the wrong thing at the worst time.”
Tesla released the unfinished “full self-driving” software to a select group of Tesla owners last week, though it remains unclear how the group was selected. Drivers posting video clips of “full self-driving” say that it’s significantly better than the last version of Autopilot they had. A broader rollout of Tesla’s “full self-driving” is expected as Tesla improves the software, though when that may happen is unclear.
One Tesla owner who has posted “full self-driving” videos told CNN Business that he could not talk directly to the media about “full self-driving” because of a nondisclosure agreement with Tesla. CNN Business could not independently verify what agreements the beta testers have with Tesla, and if they include NDAs. Other Tesla owners who have posted videos, including all of those mentioned in this story, did not respond to inquiries from CNN Business. Locke, one of the testers who has posted “full self-driving” videos online, tweeted a thank you to Musk and Tesla for allowing beta testers to share their experience on social media.
“Very helpful. Thanks all beta testers!” Musk tweeted in response.
Some of the drivers have said on social media that they’ve been amazed and impressed by what Tesla’s full self-driving technology is capable of.
Seemingly mundane maneuvers for human drivers, like circling a roundabout, have drawn intense praise from the Tesla owners and friends they’ve allowed to ride with them.
“Oh my god. Wow, that was impressive. Wow. That was impressive. It just went through!” exclaimed a passenger with Locke, when his Tesla entered a roundabout and took the correct exit.
On a narrow street, Paquette said, in a video posted to Twitter, that her Tesla automatically tucked in its side mirrors so that it wouldn’t scrape other cars. The cars running the beta version of “full self-driving” have also demonstrated that they’re capable of giving bicyclists a wide berth while passing.
The drivers are reporting back to Tesla when the software makes mistakes, so that Tesla can make improvements, according to Paquette and other beta testers who mentioned the reports in their videos. Drivers hit a button on their dashboard immediately after incidents they wish to report, according to multiple videos viewed by CNN Business. There’s also an email address they can use, according to an interview Paquette gave to Talking Tesla, a podcast about the automaker and Musk’s other companies.

The cars have shown a pattern, for example, of coming to a full stop when entering a roundabout, even when no car is blocking their path. Videos show that “full self-driving” often slows for speed humps, but won’t necessarily slow down for speed bumps. In at least one case, the “full self-driving” software appeared to confuse a one-way street for a two-way street, according to the video.
Paquette estimated in her Talking Tesla interview that her Tesla might be as good a driver as she is, if she’d had “maybe three bourbons.”
Tesla owners have been willing to make sacrifices to test the unfinished software. McGowen said in another YouTube video that a safety feature on his Tesla, automatic emergency braking, had to be disabled so he could participate. McGowen said that he was willing to accept the risk.
“I want people to be able to get very good stable software and safe software once this is ready,” he said.
Hallock, a YouTuber who lives in North Carolina, said in a video that Tesla isn’t paying him for doing the testing.
“I just want to share the experience, as Tesla has asked me to do,” Hallock said.
Some Tesla owners with “full self-driving” report that they’ve already received an updated version of the software, and seen improvements.
“It’s just driving so much more natural, I love it,” McGowen said in a video he posted Wednesday. He then marveled at how much better his car could turn into a Target parking lot compared with his earliest tests using the first version of “full self-driving.”
Less than six minutes later, in the same video, McGowen had to grab the wheel and disengage full self-driving to prevent the car from driving off the road.
“Yeah, it’s not doing well at night,” McGowen said.