- Tesla released a beta version of its “full self driving” software to some customers this month.
- Within days, videos on the internet showed cars easily navigating roads and intersections.
- However, plenty of others showed situations where a momentary lapse of attention by the driver could have spelled disaster.
- Despite its name, no car currently for sale is actually self-driving, the US government says, and Tesla has been criticised over its driver assistance features – and their branding – before.
There’s still no such thing as a self-driving car, but you wouldn’t know that from hearing Elon Musk or Tesla talk about their latest software.
Earlier this month, the company began to release a beta version of the driver-assistance program it calls “full self driving.” Despite word from Musk that the rollout would be “extremely slow & cautious, as it should,” videos of the software encountering situations where it clearly was not up to the task without driver intervention quickly surfaced online.
In one video posted to YouTube Monday, a Model 3 successfully navigates a stop sign before a left turn almost sends it straight into a parked car: “A good example of how this is still beta,” the driver says.
Over the course of eight minutes, the drone video shows a handful of occasions where the driver has to take back control from the car’s computer to avoid a crash or a traffic violation.
A plethora of other recordings uploaded to YouTube, Twitter, and elsewhere in the week since FSD was released show just how misleading the name is, even as its predecessor, Autopilot, was criticised by industry experts for the exact same reason. In some cases, the mistakes are simple and easily avoided, like missing a median:
> 4. Map challenges
>
> FSD appears to not detect this median, and thus tries to drive down the wrong side of the road.
>
> Is this an “edge case” to iron out, or is it a monstrously large technical challenge to infer road rules in real-time? pic.twitter.com/zmxlLAA1gz
>
> — Oliver Cameron (@olivercameron) October 24, 2020
Or using the wrong blinker and struggling with a traffic circle (and, let’s face it, plenty of American humans struggle with roundabouts too).
After the software was released, the US National Highway Traffic Safety Administration put it bluntly: “no vehicle available for purchase today is capable of driving itself.”
The agency is investigating Autopilot’s role in at least 13 crashes in the past four years. At least three of those have resulted in deaths, while countless other instances of users misusing the product have been caught on video.
It’s not clear when a wider rollout of FSD may move beyond beta-testing.
As for the criticisms of its name and imperfect status, Tesla did not respond to a request for comment. Musk has regularly pushed back against critics of the software.
“It’s not like, ‘If you just introduced a different name, I would have really treated it differently,’” he said in August. “If something goes wrong with Autopilot, it’s because someone is misusing it and using it directly contrary to how we’ve said it should be used.”