The National Transportation Safety Board has concluded that a Model S had Autopilot enabled and the driver’s hands were off the wheel when it slammed into a fire truck in Culver City, California in 2018.
The NTSB is investigating a series of crashes involving Tesla’s driver-assist system. Its investigation into the 2018 California crash has determined that Autopilot was engaged continuously for the last 13 minutes and 48 seconds of the trip. Additionally, it has been revealed that the driver kept his hands off the wheel for all but 51 seconds of that time, despite receiving multiple warnings from the car to put his hands back on the steering wheel.
Reuters reports the Tesla was following a vehicle at approximately 21 mph (33.8 km/h) when the vehicle ahead quickly changed lanes. With the lane ahead now free, the car began accelerating toward the driver-set cruise control speed of 80 mph (128.7 km/h), but it failed to detect the fire truck up ahead and slammed into it at 30.9 mph (49.7 km/h).
“I was having a coffee and a bagel. And all I remember, that truck, and then I just saw the boom in my face and that was it,” the Tesla driver said during investigations.
The driver doesn’t remember exactly what he was doing at the time of the crash, but suggested he may have been changing the radio station or drinking his coffee. Autopilot did not detect his hands on the wheel for the final 3 minutes and 41 seconds leading up to the crash.
A final thought: despite its name, which can mislead customers, Autopilot is NOT a Level 4, fully autonomous system. It can perform certain functions for you, but it is not infallible, and Tesla states that the driver must always keep their hands on the wheel, monitor the situation and be ready to intervene if needed. We’re still some way from being able to sip our beverage, watch a video or whatever, safe in the knowledge that the car will drive itself – and that applies to every carmaker’s system currently on the market.
This is not taking sides – just common sense. Maybe someone should have intervened and forced Tesla to swap the Autopilot name for something else, so that drivers weren’t confused, but we guess now it’s too late…
Comments
Color me shocked
I would say it's time to stop Tesla from playing with people's lives.
Tesla? The government is allowing them to do it... the government should tell them to stop, but they don’t care right now. Maybe after 5-10 more deaths...
Autopilot is an aid, though (technically), so this is still partially the driver's fault
Why do governments let Tesla sell Autopilot when they know that it doesn’t work?
It’s pretty simple. They’re old and don’t know it doesn’t work. No autopilot system should be released to the public. Or better yet, you should need to possess a certain license in order to drive a vehicle with it.
Your last point is a great idea, actually.
“I was having a coffee and a bagel. And all I remember, that truck, and then I just saw the boom in my face and that was it,” the Tesla driver said during investigations.
I mean, ffs. You were DRIVING A CAR.
What next?
"I was knocking one out when I hit the car in front as I came"?
So the "Autopilot" didn't stop the car in 3 mins and 41 secs? Other systems brake to a full stop when the driver doesn't touch the steering wheel for 30 secs / 1 min.
I'd still blame Tesla for not making sure the driver drives the car.
Wow, another Tesla crash blamed on Autopilot. I am SO surprised.
"hands were off the wheel"
are these owners trying to kill themselves? I mean seriously wtf?!
Coffee and a bagel. While driving on a freeway. MORE proof that Tesla buyers are not the brightest, by a long shot.