Driver Error And Autopilot Shortcomings Blamed For Fatal Tesla Crash

The National Transportation Safety Board (NTSB) has concluded that a fatal Tesla crash in 2016 was caused by a lack of Autopilot safeguards, driver inattention and an overreliance on the semi-autonomous driving technology.

The NTSB’s investigation took more than a year and determined that Autopilot “played a major role” in the fatal crash involving Joshua Brown. The investigation found that the technology lacked adequate safeguards to ensure it is used correctly and that Brown was able to use it on a road where it shouldn’t have been operational.

“System safeguards were lacking,” said NTSB chairman Robert Sumwalt. “Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention.”

According to the NTSB, the Autopilot system functioned as intended but failed to ensure the driver paid adequate attention to the road. Blame was also placed on Brown’s inattentiveness, his overreliance on Tesla’s self-driving aid and the truck driver’s failure to yield to Brown.

In 2016, Brown was killed near Williston, Florida, after his Tesla Model S slammed into the side of a truck that pulled out in front of him. It was determined that Autopilot failed to detect the cross traffic and that both Brown and the truck driver had at least 10 seconds to “observe and respond to each other” to avoid a collision.

In response to the NTSB’s conclusions, Tesla said it will continue to ensure owners know that Autopilot isn’t a fully autonomous system.

“We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times,” the company said.

The NTSB has recommended that automakers monitor driver attention in semi-autonomous and autonomous vehicles through more than simple steering-wheel engagement, Reuters reports.


  • Kash

    So instead of holding the driver accountable for misusing the system, and the semi truck driver accountable for failing to yield – which is the true cause of this accident – the company is at fault because the system wasn’t enough? What’s next, holding a company accountable when someone slams into the rear end of another car because the car didn’t have autonomous braking?

    The minute a consumer misuses a system, the company cannot possibly be held responsible for anything that happens; the consumer misused the product.

    • bxniels0

      Do you think it’s fair that they call it Autopilot?

      • Kash

        I think it’s poor marketing and MAYBE false advertising, but the driver agreed to the terms and conditions when he activated the system. If he chose not to read and understand how the system works and what it can and can’t do before using it, then he’s an idiot. If the excuse is “Well, it’s called Autopilot” and you’re simply going off the name, you shouldn’t be walking around or making decisions unsupervised.

        Do you also believe Burger King when they say “You can have it your way!”? If you go to Burger King and place a huge order and then tell them you want it for free, because “that’s the way [you] want it” they’re gonna laugh at you and tell you “that’s not how it works” and they’d be right because somewhere in that slogan there’s an asterisk that corresponds to a legal statement about why you can’t really have it your way, at least not for free. So should BK be responsible for paying for your food?

        You assumed the system could do all these things just because it’s called “Autopilot,” and that’s your fault for not reading about the system past the name. Unless Tesla outright said “the car can drive itself, you can nap while using it, it’s a level 5 autonomous system,” they are 100% not liable for injuries caused by your failure to properly use the system.

        Even then, Elon Musk could’ve stood on stage and said all that about the system, but if the legal disclaimer that you have to agree to prior to using the system says otherwise, the most you could get Tesla for is false advertising, because the legal document you signed – or rather acknowledged that you read – didn’t say anything about level 5 autonomy.

        iirc, the legal disclaimer you acknowledge before using Autopilot says something along the lines of “must keep both hands on the wheel” and “must maintain full attention on the road” along with some other stuff, followed by “while the system is in use” and probably a little clause that reads like “failure to do ***ALL*** of these things voids any legal responsibility Tesla may have had for any accidents, injuries, deaths, etc. that occur while the system is in use” and I’m pretty sure you have to agree to this every time you activate the system.

        TL;DR doesn’t matter what you call it, all that matters is what the legal disclaimer says.

        I’m all for holding companies responsible for the things they do (i.e. GM and their ignition problems, Takata and their airbags, etc.) but consumers need to be held responsible for their own actions just as much.

        • steve

          The name is wrong – end of.
          Needs to be called something different to Autopilot until it is a fully autonomous system.
          Only really intelligent car enthusiasts and technology people understand the different levels of car autonomy. Average, dull Joe thinks Autopilot is just that.

          • Kash

            Again, I don’t agree with the name either. I prefer Cadillac’s “Super Cruise” name because it conveys the actual capabilities of the system better than Tesla’s, but that still doesn’t change the fact that the driver misused the system, and we can’t spend all day trying to find names that won’t confuse people or send mixed signals.

            It’s like trying to avoid offending anyone – you’ll never not offend someone. Someone, somewhere will always be offended by whatever you say. And somewhere, a moron will look at the name of something, assume it’s for something else or can do something else, skip the directions and any research, and then get mad or hurt himself when he misuses the item because it doesn’t work like he thought it would.

            Growing up (early ’00s), I remember a news story about a guy who was driving an RV, turned on cruise control, got up to make a snack or something, crashed, then got mad because he thought cruise control meant he could leave it to do its own thing and the RV would just drive itself completely, like an airplane.

            This was 2001-2002, mind you; cruise control wasn’t a new thing, it was in just about every car on sale at the time, but somehow this guy didn’t know how it worked, or just said he didn’t. There will always be people like this, and we cannot continue to pander to them and their ignorance; we’re not helping anyone if we do, and we’re actively hurting everyone else.

          • steve

            I fully understand what you are saying – but the term Autopilot is really linked to aircraft not cars, in the first place.
            Hence – Auto Pilot.
            This means that if you ask anybody what the word “Autopilot” means (without mentioning whether it’s for a car or aircraft) – they will instinctively tell you it’s a switch that somebody presses that enables the craft to fully drive/fly itself – without any pilot monitoring or intervention.
            Hence, in Tesla’s case – Autopilot does not do what 99% of people understand that word to mean.
            Disclaimers are all very well – but they are never there to help the product owner – they are just there to make the manufacturer less likely to be sued.

          • Kash

            Again, I’m not arguing about the name – I think they should change it – but at this point the name doesn’t matter; the phrase “a rose by any other name” comes to mind. Are we going to be having this conversation when someone does the same thing with Cadillac’s system? Are we going to be saying “well, the name implies X”? And if so, where do we draw the line? I say screw the line: if everything is going to have a legal disclaimer attached to it these days anyway, why does the name even matter?

            Well yeah, the manufacturer is trying to make a profit, and without disclaimers they’d be fielding hundreds of lawsuits daily. Lawsuits cost money, and money cuts into profits. We’re talking about businesses, after all.

            Why do we need a disclaimer on a toaster that says not to put it in a bathtub full of water? Or one on a curling iron that says not to put it right against your neck or up your bum while it’s turned on and cranked all the way up? People did those things, hurt themselves, and sued, even though there was no implication you could do any of those things without harm or damage to the user or the item itself.

          • exeptor

            The main usage of the word “Autopilot” (at least in the recent past) was in planes. Even there, when the autopilot is on, the pilots keep watching what is happening, even though they are in the air, where the traffic is nothing like it is on the road. So why does “dull Joe” (and I’m not sure that someone who can afford and wants a Tesla is a “dull Joe”) assume that Autopilot means reading books (or watching movies) while traveling at 70 mph? It is an issue with how modern society acts in every area of life.

    • Whilst I agree, I believe the issue is the severe lack of safeguards in the system to prevent users from doing this. As much as the T&Cs are there for the customer to agree to, Tesla themselves also want people to experiment so they can gather feedback for further research. A few delayed bongs and vibrations aren’t enough to stop the driver from becoming distracted and taking their eyes off the road and hands off the wheel.

      Granted, this is a very rare case of Autopilot being blamed, but even so, when the car can take care of its most vital systems itself, simple warnings need to be backed by, for example, quicker deactivation of Autopilot or restricting certain uses of the infotainment system. Or even eye tracking to detect where and what the driver is looking at, and acting accordingly if they have been looking away from the road for too long – at an upcoming intersection, during a lane change, etc.

      When Autopilot is in use, there isn’t any electronic fail-safe, so the driver has to be the fail-safe. Just like it would be on an aeroplane.

      • Kash

        Shortly after this accident, Tesla did release a software update (8.0) that changed what happens when the driver takes their hands off the wheel, or rather how the car deals with inattentive drivers. The system can now actually block the driver from re-engaging it.

        • Ah, yes, I remember. Still, whilst it’s an improvement, it doesn’t actually address any of the issues while the system is on and being misused – only after.

  • steve

    It doesn’t help that Tesla call the system “Autopilot” – that name simply suggests that it’s a fully automatic system.

    • exeptor

      I just can’t agree that the word “Autopilot” means fully autonomous. Even in planes (where it has mostly been used), where the entire flight and even the landing are done by computers, it doesn’t mean the pilots shouldn’t watch closely what is happening. Fully autonomous (from any one person’s perspective) is only when someone else is driving you :). Only then are you fully allowed to do what you want and not watch the road.

  • el brago

    I fully understand what you are saying – but the term Autopilot is really linked to aircraft not cars, in the first place. Hence – Auto Pilot. This means that if you ask anybody what the word “Autopilot” means (without mentioning whether it’s for a car or aircraft) – they will instinctively tell you it’s a switch that somebody presses that enables the craft to fully drive/fly itself – without any pilot monitoring or intervention. Hence, in Tesla’s case – Autopilot does not do what 99% of people understand that word to mean. Disclaimers are all very well – but they are never there to help the product owner – they are just there to make the manufacturer less likely to be sued.

    • exeptor

      Let’s assume that you are right (I really can’t say what 99% of people think Autopilot is). We have a person who, in my opinion, is of at least average intelligence, who is not a teenager and who has experience with cars. This person goes to Tesla, they tell him “Look, we have an Autopilot which can drive instead of you,” and this person assumes without any doubt that he can just click a button and forget where he is and how one normally acts in a car. I just can’t agree with this.

    • steve

      Why have you copied my post from earlier – word for word?

  • neil

    Looking at the damage to both car and truck, the lack of any under-run protection on the side of the truck trailer should also be cited as a contributing factor to the seriousness of the outcome of the collision. The car’s crash structure appears intact, the impact having occurred well above the car’s belt line. Had the same collision occurred in Europe, the car’s crash structure would have met with the trailer’s side guards, and the collision could well have been survivable.

    • TotallyDisqusted

      Under-run protection isn’t mandated or even used at all in the US. That has nothing to do with fault in the crash.

  • BlackPegasus

    I wouldn’t want to be the first responders at this accident scene. Looks like the driver may have been decapitated. 😖

  • Shobin Drogan

    I don’t know how much you can even trust Tesla’s word. I’ve seen so many videos of Autopilot driving like it’s drunk, with no alarms to let the driver know it’s drifting out of the lane. Then Tesla just says it’s driver error despite the car being on Autopilot. Grabbing the wheel in the time it takes Autopilot to disengage demands way more concentration than just driving the car. On top of that, the guy who died in the accident was documenting videos of Tesla’s Autopilot system, so he obviously knew its strengths and limitations.

  • gary4205

    Charge Elon Musk with murder by depraved indifference.

    Sadly….California no longer has the death penalty. But Florida does, as does the federal government.

    Someone must pay dearly for Tesla using its customers as beta testers.

    Make the executions public!

    • LWOAP

      You need to chill out.