- The driver was using Tesla’s Autopilot system when he crashed into the police vehicle, Michigan State Police said.
- There were no injuries, according to police.
- A federal agency is investigating a Tesla crash in Michigan last week that left two people hospitalized.
A man driving a Tesla Model Y with Autopilot engaged crashed into a Michigan State Police car that had pulled over with its emergency lights on, the agency said Wednesday on Twitter.
The 22-year-old driver was issued citations for failure to move over and driving with a suspended license, according to the Michigan State Police.
Police were investigating a crash between another vehicle and a deer when the patrol car was struck early Wednesday. There were no injuries, police said.
Tesla didn’t respond to a request for comment from Insider.
This is the second recent Tesla collision in Michigan.
Detroit Police told Reuters that a Tesla sedan was driving through an intersection in southwestern Detroit last week before it crashed into a tractor-trailer. Both the driver and the Tesla passenger were transferred to a hospital, Reuters reported.
Detroit police couldn’t determine whether the driver was using Tesla’s Autopilot or “full self-driving” software, according to the Associated Press.
The National Highway Traffic Safety Administration said that it is investigating the crash, Insider reported.
NHTSA didn’t respond to Insider’s request for comment.
Tesla’s Autopilot system allows the car to brake, accelerate, and steer automatically. The electric car maker also sells its full self-driving software as a $US10,000 ($12,956) one-off add-on and plans to release it as a subscription model this summer. FSD allows cars to park themselves, change lanes, and identify both stop signs and traffic lights.
Neither Autopilot nor FSD makes a Tesla car fully autonomous. At least three drivers have died while using Tesla’s Autopilot.
Last month, the National Transportation Safety Board called for clear regulations for automated driving software and cited Tesla, saying that the automaker is testing systems “with limited oversight or reporting requirements.”