Following the news that a driver was killed in a Tesla Model S crash in Florida in May when the vehicle had its “Autopilot” semi-autonomous driving mode engaged, another report of a non-fatal Autopilot incident emerged in Pennsylvania.
Tesla reported the Florida crash to the National Highway Traffic Safety Administration (NHTSA), and all the detail we have on the Pennsylvania accident comes from this report in the Detroit Free Press.
However, given how widely touted and enthusiastically received Autopilot was when it arrived last year, and given that the technology is available on new Tesla Model S and Model X vehicles, it might be time for Tesla to consider stepping up the instruction it provides owners on the limits of Autopilot.
Currently, drivers have to accept responsibility for activating Autopilot in the cars; the technology is still in beta-testing, according to Tesla.
But going that route doesn’t seem to be enough. Other major carmakers that have been working on similar semi-self-driving tech have been much more circumspect about a public rollout. Tesla has a head start, but with the recent Autopilot tragedy and mishaps, CEO Elon Musk’s startup electric car company may be questioning whether it was worth it.
There is an easy, though relatively time-consuming, fix.
There’s also a drastic fix, but no one is talking about that one yet: deactivating Autopilot via an over-the-air software adjustment.
It could be in Tesla’s best long-term interest to undertake the time-consuming fix: Autopilot 101.
Driving a Tesla is like driving a “normal” car, but it’s also like getting used to a new computer or smartphone. And with consumer electronics devices, people don’t tend to read the owner’s manual — rather, they figure out the gadget as they go along.
But an iPhone 6 can’t drive itself on a freeway at 65 miles per hour.
So a higher level of instruction could be required.
This could mean a visit to a Tesla store for 30 minutes of Autopilot dos and don’ts. It could also be a protocol that Tesla adds for all new vehicle purchases, as a requirement for Autopilot activation.
And I think Tesla could temporarily deactivate Autopilot across its fleet and require owners to run through a training process, possibly even in their cars using the central touchscreen interface and a pre-loaded, voice-oriented guide, before they “qualify” to have the tech turned back on.
The key issue with Autopilot is twofold.
First, Tesla says that the driver should keep his or her hands on the wheel at all times while Autopilot is active. The steering wheel periodically vibrates, warning of an Autopilot disconnect, if the driver goes hands-off for too long.
But the temptation to let your hands do something else while the car is Autopiloting is too great.
Second, drivers don’t seem to get that they’re participating in an experiment in partial self-driving — really just very sophisticated highway cruise control — not sitting in a vehicle that can truly drive itself. Tesla might want to dial the tech back a notch to dispel that illusion and force drivers to be more engaged.
It’s not clear that Tesla will do any of this, but if Autopilot incidents become more numerous, it may not have a choice. The government may step in and decide that beta-testing a technology that can be deadly if misused isn’t something it will tolerate.
Tesla has a chance to get ahead of that. It should take it.