Tesla Driver Killed In Crash With Autopilot Engaged, Raises Questions On Autonomous Systems

A Tesla Model S with its Autopilot system engaged was involved in a crash on May 7 that killed its driver. While semi- and fully-autonomous cars have crashed before, this is believed to be the first death involving a vehicle with a semi-autonomous driving feature active.

The crash happened last month in Williston, a small city in Florida. Ohio resident Joshua Brown, 45, was in the driver’s seat of his 2015 Model S with Autopilot activated when an 18-wheel semi made a left turn in front of the electric car. Brown died at the scene when “the car’s roof struck the underside of the trailer as it passed under the trailer”, the Levy Journal reported.

Brown was an active member of the Tesla community who had uploaded many videos to his YouTube channel, including several showing the Autopilot system in action; one of them captured the attention of Tesla CEO Elon Musk, who tweeted about it on his account.

In a brief statement on Thursday, the National Highway Traffic Safety Administration (NHTSA) said it was aware of the accident and had launched a preliminary investigation, sending a team to examine both the car and the crash site in Florida.

“ODI has identified, from information provided by Tesla and from other sources, a report of a fatal highway crash involving a 2015 Tesla Model S operating with automated driving systems (“Autopilot”) activated,” said the NHTSA. “This preliminary evaluation is being opened to examine the design and performance of any automated driving systems in use at the time of the crash.”

Tesla responded to the news with a blog post titled “A Tragic Loss”, which opens by stating that this was the first known Autopilot-related death in some 130 million miles driven by its customers, and goes on to say that while the system keeps improving, it is not yet perfect and still requires the driver to remain alert. According to Tesla, “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied”.

Here’s Tesla’s statement in full:

“We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.”
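
The hands-on checks Tesla describes amount to a simple monitor-and-escalate loop: poll the steering-wheel sensor, warn the driver after a period without contact, then gradually shed speed until contact returns. Purely as an illustration, here is a minimal sketch of such a loop; the function names, timings and structure are assumptions, since Tesla has not published its actual logic.

```python
# Hypothetical sketch of a hands-on-wheel monitor of the kind Tesla
# describes: frequent checks, visual and audible alerts when hands-off
# persists, then a gradual slow-down until hands are detected again.
# All names, timings and thresholds are illustrative assumptions.
import time

ALERT_AFTER_S = 10.0     # assumed: seconds hands-off before alerting
SLOWDOWN_AFTER_S = 25.0  # assumed: seconds hands-off before slowing
CHECK_PERIOD_S = 0.5     # assumed: polling interval

def monitor(hands_on_wheel, show_alert, play_chime, reduce_speed):
    """Poll a wheel-contact sensor and escalate while hands stay off."""
    hands_off_since = None
    while True:
        if hands_on_wheel():
            hands_off_since = None              # any contact resets the timer
        else:
            now = time.monotonic()
            if hands_off_since is None:
                hands_off_since = now
            elapsed = now - hands_off_since
            if elapsed >= SLOWDOWN_AFTER_S:
                reduce_speed()                  # gradually slow the car
            elif elapsed >= ALERT_AFTER_S:
                show_alert("Always keep your hands on the wheel.")
                play_chime()
        time.sleep(CHECK_PERIOD_S)
```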

While the investigation is ongoing, the accident is bound to raise some legitimate questions about self-driving vehicles and whether autonomous systems (and today’s drivers) are ready for prime time. Clarence Ditlow, executive director of the Center for Auto Safety, told Bloomberg that if the Autopilot system did not recognize the semi-truck, then Tesla must recall every vehicle equipped with the system.

“That’s a clear-cut defect and there should be a recall,” Ditlow said. “When you put Autopilot in a vehicle, you’re telling people to trust the system even if there is lawyerly warning to keep your hands on the wheel.”

Eric Noble, president of CarLab Inc., a consulting firm in Orange, California, had harsher words for Tesla, which said in its post that Autopilot “is a new technology and still in public beta phase”. Noble told Bloomberg that “No other automaker sells unproven technology to customers”.

“There’s not an experienced automaker out there who will let this kind of technology on the road in the hands of consumers without further testing,” Noble told the news agency. “They will test it over millions of miles with trained drivers, not with consumers.”


  • Tumbi Mtika

    Oh, no…

  • The Whites of Their Eyes

    Just waiting for a major lawsuit against Tesla… it’s coming.

    • Auf Wiedersehen

      If nothing happened to GM for killing 250 people (yes, they killed them) with an ignition-switch defect it KNEW about for a DECADE, then nothing will, or at least SHOULD, happen to Tesla. If I remember correctly, when I test drove the Model S with Autopilot, it said to use it only on freeways, never on small secondary roads. This accident would never have happened on a freeway.

  • emjayay

    This was obvious from the beginning. A system that alerts you and vibrates the wheel if you drift out of a lane without a turn signal on, or that hits the brakes faster than you can to stop you from running into a stopped car, is one thing. A car that truly does everything properly by itself is another. The Tesla is in between: good enough to almost drive the car most of the time and let the driver stop paying attention, but not good enough for many situations. An obvious disaster waiting to happen.

    • Knotmyrealname

      Correction. Read the above. It does NOT allow you to stop paying attention.
      It, and I quote, “reminds the driver to ‘Always keep your hands on the wheel. Be prepared to take over at any time.’”
      It’s beta-phase software that comes with a caveat.

      • Dennis James

        Beta software tested on the street with untrained customers behind the wheel? Somebody should be jailed for this decision.

        • Knotmyrealname

          Untrained? You mean they’re not capable of taking back control of the car – similar to a cruise control situation? Being ‘trained’ in this case means being able to drive a car. Please, think before typing.

  • smartacus

    I have said this before, but Tesla has sadly developed Autopilot for regular drivers, not Tesla drivers.
    And no company should ever be permitted to use public roads as its experimental laboratory.

  • MarketAndChurch

    AI technology will give the car’s vision better perspective. Maybe the great frontier of autonomous driving is to have an additional “driver,” in the form of an AI, that factors into its driving decisions things that sensors will always have difficulty judging. Programming in the curvature of road surfaces, and having cars share with each other the conditions of roads they’ve previously driven, will also take a lot of the guesswork out of the way, but an advanced AI is really the missing piece here.

  • Giorgos Papaspyros

    Tesla could be blamed as if it had committed premeditated murder. Or should I call it more of a negligence case…
    It was all to be expected when they released an early-stage technology onto the streets.
    For those Tesla drivers who still use the Autopilot mode, it’s their own fault too.
    It is naive to believe such an AI feature is yet ready to work wisely and perfectly in all instances.

    Tesla has turned its customers into inexperienced beta-testing technical staff, working unpaid hours for the company and gambling with their own lives, on something the ‘established’ carmakers – or even Google – are still extremely cautious about releasing to the masses and only work on in-house.
    The indications that the Tesla experiment would lead to fatal consequences had appeared long ago. No one bothered with them, neither Tesla nor the NHTSA.

    If it had not been this Tesla driver himself, it would certainly have happened to another individual nearby.
    Tesla claims to have warned that its Autopilot mode is intended for use only on highways, yet Tesla hasn’t implemented a feature to lock the mode so it can only be active on motorways. And for sure, Tesla openly receives all the telemetry, even when it comes from unrecommended use off the highway.

    Of course, the NHTSA was silent in spite of all the early indications.
    Whereas the NHTSA is very dramatic when it comes to accelerator pedals, car emissions and ignition switches.

    Well, there is no question.

    • Knotmyrealname

      Who is sitting in the driver’s seat of that vehicle? The driver. Who is facing forward in that vehicle? The driver. Who SHOULD be paying attention to where he is going? The DRIVER.
      It clearly states that the technology is in a beta phase. Anyone who drifts off and does not pay attention to where they are going on the road is at fault for their own inability to follow standard road rules.
      I feel for the family of this person, who it seems really loved his car, but basically that man put too much faith in the tech before it was signed off.
      The closest analogy I can offer: it’d be like suing Microsoft for letting your computer crash while trialling beta software. You have to understand that the system is not complete.
      The poor driver in this circumstance is at fault, just as if he had been driving without the aid.

      • Giorgos Papaspyros

        OK, now I see you are hurling several insults at me.
        Well, you should be ashamed of responding to my good faith with insults.
        Ironically, the insults could be returned.
        Nothing more, nothing less.

        • Jan Mleziva

          However, he is right. The system tells the driver to remain alert, hold the wheel and use it only on the freeway.

        • Knotmyrealname

          George, I posted the above before we had our reasonable, level conversation. It seemed Carscoops kept deleting my posts, so the first one I saw was the one that was fragmented. Sorry for the insult. I enjoyed our banter.

    • Knotmyrealname

      Your alarmist comments are counterproductive.
      This driver was in the driver’s seat, facing forward, and under his agreement with Tesla for trialling the beta software he had to be in control of the vehicle and ready to take over at any time.
      The fact is (sadly for him) he did not pay attention and allowed a fatal collision to occur.

    • Knotmyrealname

      I have tried to respond to you George, but I am being blocked somehow. Carscoops, why is someone allowed to say that someone has committed murder but someone isn’t allowed to defend a clear caveat provided by a company?

    • Knotmyrealname

      Tesla has not committed murder.

      • Giorgos Papaspyros

        I might give you a point on that.
        Tesla can be blamed for being an ACCOMPLICE to a killing through its negligence in preventing it from happening.

        More killings will come in the future
        unless Tesla releases a next version of Autopilot that fixes the flaws.
        And for heaven’s sake, Tesla, please call it semi-autonomous;
        there are morons who might take the words ‘autonomous driving’ literally.

        PS: I am not opposed to new technologies; on the contrary, I quite like the whole idea.
        I am only saying it is irresponsible to release premature technologies to the masses.

        • Knotmyrealname

          No, it’s irresponsible to not follow the rules of playing the beta game.

          • Giorgos Papaspyros

            I wasn’t aware that this was beta testing by consumers that plays with people’s lives.
            Nor has Tesla taken all the precautions.

            In every aspect,
            such fatal incidents could undermine the public’s trust in new technologies.

          • Knotmyrealname

            It’s software. It must go through a beta stage. It must be tested in real world scenarios. It must be therefore used with the correct precautionary approach.

          • Giorgos Papaspyros

            When you have to deal with real people, inexperienced people, consumers, and more importantly with people’s lives,
            you have NO right to launch, and you must NOT launch, an experiment in the open world.

            If not for people’s lives (supposing one could care less about those),
            then for the sake of technological progress, for the public’s trust in it
            and for the sustainability of your own business.

          • Knotmyrealname

            It’s not an experiment, it’s called development. These people consciously upgraded their firmware to take part in a (real world) development stage. They understood the risks. We cannot continually take the onus away from people and shift blame at our leisure. This is the litigious mindset that costs us all in the long run. It’s a weak, blameless attitude.

          • Giorgos Papaspyros

            It is not like in-house development:
            these are real people, amateurs, non-professionals, inexperienced, technically illiterate, with no training or education, and with no real-time guidance or assistance from supervisors.

            It is not only irresponsible but also unethical on Tesla’s part.
            Tesla is benefiting from this experiment.
            Tesla is aware of all the pros and cons.

            If Tesla ever needs to develop anything,
            it has to invest and put resources into it.
            No matter if Tesla gets the amateurs’ consent,
            Tesla has to do it the professional way,
            just like the rest of the industry.

            I won’t be surprised if lawsuits come.

          • Knotmyrealname

            Lawsuits will no doubt come, and unfortunately from people with mindsets like yours that desire to be blameless and lust after the fantasy of easy payouts. In this case, the claim will be thrown out of court.

          • Giorgos Papaspyros

            You wouldn’t be pleased to know what I think of what you’re saying, ha ha.

            Leaving that aside, I happen to know from my own business how the judicial system works.
            Courts tend to rule in favor of people who were likely misled into giving their consent.
            I won’t be surprised if the complaints are all about that, and if the courts’ rulings give credit to those claims.
            After all, it was unprofessional on Tesla’s part to put people’s lives at risk.

          • Knotmyrealname

            My reply to you would at this point be repetitive and unfortunately not enough to get through.
            Let’s agree to disagree and let history decide.

          • Knotmyrealname

            Oh, and please feel free to edit your ‘premeditated murder’ claim. You have the power to do that.

          • Giorgos Papaspyros

            Do you want me to call it negligence?
            Negligence with full accountability for Tesla.

            OK, I shall call it so, and take back the ‘premeditated’,
            in the sense that autonomous driving needs a lot of engineering, programming and testing,
            extreme computing power and AI, powerful sensors and reliable detectors, backup systems and vehicle adaptation,
            taking into account real-life scenarios and coordination with parameters like weather conditions, road infrastructure, etc.

          • gary4205

            These people [stupidly] trusted Tesla. BIG MISTAKE.
            It’s truly criminal of Tesla to introduce this unproven technology into the marketplace. We aren’t testing computer software or a new iPod! We are talking about a motor vehicle capable of speeds of over 150 MPH being unleashed on an uninformed and trusting public. A public that actually believes a car can “drive itself” in an uncontrolled world, with millions of variables!

            This is ALL Tesla’s fault and I hope Musk and his cronies pay dearly for this. I hope the needless death of this civilian “developer” [i.e. CUSTOMER] is never forgotten!

          • Dennis James

            If the software must be tested, then it must be tested with trained engineers behind the wheel, not customers!

          • gary4205

            As it should! This technology is years, if not decades, away from being ready for prime time! In fact, it’s a technology that few are even asking for. That ALONE makes Tesla’s irresponsible push more egregious!

            It’s criminal to rush this totally unproven technology to market.

            There are limited uses for this technology, but it needs to be fully developed before customers are exposed and their lives put in peril. This is very much Tesla’s fault. Their marketing hype has always far outpaced their actual product and ability to deliver.

            This is a very irresponsible company.

        • Auf Wiedersehen

          Again, GM killed 250 people through ignition-switch neglect that it KNEW about for DECADES, and it is still in business, partly thanks to US, the taxpayers. If this burns Tesla, it will PROVE the complete corruption of the law by corporations.

          • Giorgos Papaspyros

            In my opinion, taxpayer money should never be used to bail out failed private businesses. This goes for GM.

            Whenever private businesses break the law, they should pay the price, whether it is VW or Tesla.

          • gary4205

            Actually, most of the deaths in the GM ignition cases were due to DRIVER ERROR. Alcohol and drugs were involved in many cases, and of course, failure to wear seat belts. Also a lack of the experience needed to handle a vehicle that has lost power – a skill that every driver MUST possess.

            The media made it sound like losing power meant a loss of steering and brakes, which is a complete and total LIE. Modern hydraulic braking systems are designed to hold brake pressure should the engine stall [or turn off], to MAKE SURE a competent driver can STOP the car safely. Even AFTER that pressure is used up, you can STILL stop the car; it’s just HARDER.

            As for steering, well… hell, it’s not like it’s “disconnected”! You can still turn the car; it’s just HARDER.

            Had these drivers not been impaired by alcohol or illicit drugs, they would likely be alive today. Especially if they had had the good sense to wear their seat belts!

  • Daigort

    This autopilot is rubbish; it takes away the fun of driving a car, the joy of sitting behind a steering wheel and controlling the car.

    • Knotmyrealname

      It’s amazing technology, and like it or not, it’s here. I will resist until the bitter end (I like driving MYSELF) but probably have it in later life when it may assist my diminishing abilities. By then it should be pretty well sorted.
      But it will never be 100%. Nothing can be.

    • lapirk

      100 years from now, our children won’t understand the joys of steering. It’s inevitable; we have to build the future.

      • Kannag Don Amenra

        I would think that steering wheels and combustion engines may live on the way mechanical watches still exist after 200 years. A niche market indeed, but much sought after.

  • bxniels0

    Is it possible this accident might still have happened if the driver had been driving the car himself?

    I know they’re only words, but on Twitter some people were critical of calling it ‘Autopilot’, saying it gives a false impression of its capabilities. Maybe Tesla and its marketing department, aka the media, will now tone down the hype?

    • roy

      The accident was imminent because, firstly, the Tesla Model S was unable to differentiate between the semi and the bright background, and the same was apparently true of the driver (at this point it is a hypothesis, though), which means the accident was unavoidable. And as for the Tesla bashing that most other people love, here’s something for them: the technology in the Model S, or any other Tesla for that matter, is a SEMI-autonomous one.

    • roy

      Also, it is my belief that for autonomous vehicles to function on the road, all vehicles must use that tech; otherwise it will never be safe.

      • gary4205

        EXACTLY! The only way these self-driving cars will ever work is on a closed circuit where ONLY self-drivers are allowed. Any variables, including pedestrians and bicyclists, will affect the ability of these things to operate as designed. Even a stray cat or dog could cause a problem. Also, these damned things must be speed-limited to 35-40 mph. This is not a technology that can work in “mixed use” or at highway speeds.

        I can see limited benefit from full on self-drive, but only in very limited cases.

        Folks who think this is a “cure-all” for human error in driving need to remember that these abominations are created by humans! The errors are built in!

        Damn shame someone had to die to wake people up to the lunacy and exaggerated hype that is Tesla!

  • R1S0

    “Raises Questions On Autonomous System”

    Rather, say it raises questions about common sense.

  • fabri99

    Condolences to those involved. I’ve always been skeptical about self-driving cars; we’ll see how this ends.

  • Accidents will happen, fatalities will happen, but software will make fewer and fewer mistakes compared to humans. It’s the same with airplanes: most fatal accidents happen because of the pilots. Autopilot is making the roads safer; that’s the broad picture.

    • gary4205

      You are delusional. You can bet this will throw a LOT of cold water on this lunacy of “self-driving” cars on major roadways. This nonsense needs to be restricted to closed circuits [with nothing BUT “self-drivers” and no pedestrians, bicyclists, etc.] with speeds limited to 35 mph or less. This is NOT a technology that can be used among the general population or at highway speeds.

  • Rick Alexander

    I’m baffled by the number of assumptions being made on this forum. Nobody, with the exception of the Tesla driver (who unfortunately is now deceased), the truck driver, anyone who may have witnessed the accident, and the onboard computer of the Tesla, knows for sure what really happened. We know a truck and a Tesla had an accident. We know an individual is dead. We also know that the truck turned left in front of the Tesla. Is it possible that the truck was at fault? Is it possible that the truck made an incorrect left turn in front of the Tesla without the driver or the autonomous system being able to react quickly enough? These are all questions that I’m sure the professionals are trying to answer as we speak.

    There is definitely merit in the argument that Tesla should have considered holding back the Autopilot feature until further testing had been done. That said, parts fail all the time: brakes, tires, steering, engines, etc. It’s a wonder we don’t have more deaths on our roads. Finger-pointing will not bring back Joshua Brown. My deepest condolences to the Brown family for their loss.

    A.D. (Autonomous Driving) is here to stay, so get used to it. Unfortunately, the chance of such a situation repeating itself in the future is quite high as well. I love driving, and while I see the benefits of A.D., I for one will continue to control the vessel I navigate. If for some reason I am not able to do that, then I will relinquish control to an experienced individual who is. Let’s hope that the unfortunate tragedy of the Brown family serves to improve and perfect “A.D.”, and that car companies like Tesla take more stringent measures before releasing new tech to the masses!