Tesla Sued Over Fatal Model X Crash Involving Autopilot

Tesla wants to make driving a thing of the past, but its Autopilot system isn’t foolproof.

The company is once again being sued, and this time the lawsuit comes from the family of Walter Huang, who was killed when his Tesla Model X crashed in Mountain View, California, on March 23rd, 2018.

According to the lawsuit, as the Model X approached a “paved gore area dividing the main travel lanes of US-101 from the SH-85 exit ramp, the autopilot feature of the Tesla turned the vehicle left, out of the designated travel lane, and drove it straight into a concrete highway median.”

Huang was pronounced dead several hours later, and the lawsuit alleges Tesla was “negligent and careless in failing and omitting to provide adequate instructions and warnings to protect against injuries occurring as a result of vehicle malfunction and the absence of an effective automatic emergency braking system.”

In essence, the lawsuit alleges that Tesla overstated the benefits of its Autopilot technology and that Huang believed the crossover was effectively crash proof. The suit also claims the Model X was defective and should have been equipped with systems that would have prevented the accident.

The National Transportation Safety Board looked into the crash and released its preliminary report on the accident last summer. According to the report, the Model X traveled through the gore area until it struck a crash attenuator – which was mounted on the end of a concrete barrier – at a speed of approximately 71 mph (114 km/h). After the impact, the crossover rotated counterclockwise and the front portion of the vehicle separated from the rest of the Model X.

Before the crash, the Model X provided two visual warnings and one audio alert for the driver to place their hands on the steering wheel. In the final minute before the collision, hands were detected on the steering wheel for just 34 seconds. In the final six seconds, no hands were detected on the wheel.

While Huang should have been paying attention and keeping his hands on the steering wheel, the NTSB suggested the Autopilot system did contribute to the crash. Eight seconds prior to impact, the Model X was following a lead vehicle at around 65 mph (104 km/h). One second later, the crossover “began a left steering movement” while still following the lead vehicle.

Four seconds before the crash, the Model X stopped following the lead vehicle, which caused the crossover to speed up, as the cruise control system was set at 75 mph (120 km/h). The Model X then hit the barrier; no braking or evasive steering was detected before the collision.

The lawsuit also names the state of California over its failure to repair the crash attenuator on the concrete barrier. The attenuator had been damaged in an accident involving a Toyota Prius on March 12th, and the lawsuit says the failure to fix or replace it contributed to Huang’s death.

  • HaltestelleLuitpolthafen

    Tragic that this happened.

    The thing is, Autopilot warns you to pay attention at all times and this guy clearly wasn’t paying attention. Even “autopilot” on a plane requires someone to be alert and present in case something is about to go wrong.

    You can blame this on Tesla having a misleading name for their system, but it doesn’t change the fact that ultimately it comes down to you being responsible and in control of your vehicle.

    • Matteo Tommasi

      The problem here is that the “autopilot” turned into a wall, and this MUST not happen.

      • Jason Miller

        I doubt it “turned into the wall”. It was likely confused by that ramp there on the left, either due to missing or incomplete lane lines on the road. That doesn’t change the fact that the guy was warned multiple times to put his hands on the wheel and he refused to comply.

    • Mike anonymous

      It is also the responsibility of the company to properly inform the user about the system they are using and its limitations. By “inform” I do not mean simply ‘tell the person’, but rather actually make sure they understand what this vehicle can and cannot do.

  • robotlogic

    California and its failure to properly maintain its roads are ultimately to blame.
    Hitting a failed crash attenuator is like driving into a metal wall.

  • Jason Miller

    “Before the crash, the Model X provided two visual warnings and one audio alert for the driver to place their hands on the steering wheel. In the final minute before the collision, hands were detected on the steering wheel for just 34 seconds. In the final six seconds, no hands were detected on the wheel.”

    Case closed.

    • TheBelltower

      Not necessarily. The statement is misleading. Teslas don’t have sensors in the steering wheel that know whether hands are in place or not. They utilize a torque sensor that determines whether there’s resistance being applied to the steering wheel (a rough sketch of that kind of check is shown below). His hands could have been on the steering wheel but not providing any resistance for six seconds. Regardless, it’s pretty obvious that his hands likely weren’t on the wheel and that he didn’t attempt to steer the car back on course, at least not in time.

      What I find interesting, though, is that Huang had previously reported to Tesla that AP was error-prone on this stretch of the freeway. If he did report a potential flaw in the behavior of AP, then it would suggest that Huang was more knowledgeable about the capabilities of AP than the claim that “Huang believed the crossover was effectively crash proof” implies. If he knew that his vehicle misbehaved in this area of the road, why would he have used AP at that location?
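
      For illustration only: below is a minimal sketch, in Python, of the kind of torque-threshold “hands-on-wheel” check described above. The sampling rate, torque threshold and warning timings are invented for the example and are not Tesla’s actual values; the point is simply that a torque-based system infers hands from measured steering resistance, so a light, resistance-free grip registers the same as no hands at all.

      # Minimal sketch (assumed values, not Tesla's): infer "hands on wheel"
      # from steering-column torque samples and escalate warnings when no
      # torque has been measured for too long.

      SAMPLE_RATE_HZ = 10                              # assumed sensor sampling rate
      HANDS_ON_TORQUE_NM = 0.3                         # assumed minimum torque counted as "hands on"
      VISUAL_WARNING_SAMPLES = 15 * SAMPLE_RATE_HZ     # assumed: visual warning after ~15 s
      AUDIO_ALERT_SAMPLES = 30 * SAMPLE_RATE_HZ        # assumed: audio alert after ~30 s

      def hands_on_warnings(torque_samples_nm):
          """Return (time_s, event) pairs for a sequence of torque readings.

          Because detection is torque-based, hands resting on the wheel without
          applying resistance look identical to no hands at all.
          """
          events = []
          samples_without_torque = 0
          for i, torque in enumerate(torque_samples_nm):
              if abs(torque) >= HANDS_ON_TORQUE_NM:
                  samples_without_torque = 0
                  continue
              samples_without_torque += 1
              if samples_without_torque == VISUAL_WARNING_SAMPLES:
                  events.append((i / SAMPLE_RATE_HZ, "visual warning"))
              elif samples_without_torque == AUDIO_ALERT_SAMPLES:
                  events.append((i / SAMPLE_RATE_HZ, "audio alert"))
          return events

      # 40 seconds of a light, resistance-free grip: warnings escalate even though
      # hands may physically be on the wheel the whole time.
      print(hands_on_warnings([0.0] * (40 * SAMPLE_RATE_HZ)))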

  • Mr. EP9

    Again, it is sad, but the driver should have heeded the warnings and regained control of the vehicle. The “autopilot” system is not crash proof, and I really think some people out there are not aware of the system’s limitations and put far too much faith in it.

  • An Existing Person

    I’ve said this countless times, but the problem is in the marketing. Tesla’s “Autopilot” is nothing more than an advanced cruise control, no better than what Mercedes, Infiniti, etc. offer. It clearly cannot drive itself, and there is ample evidence to support that. That’s not to say that their below-average-intelligence owners are to blame, but they wouldn’t ignore all the warning systems and have such confidence if the marketing of the cruise control system itself weren’t so flawed.

  • Ary Wisesa

    That’s why I don’t fully believe in such systems. I still believe in human capabilities. What makes me very afraid, though, is the people who believe blindly in them and lower their caution; instead of making the road safer, they make it much more dangerous for everyone else. Just look at the other victims of this accident: not only did the Tesla kill its driver, people in an Audi and a Mazda were hurt too.

  • Jason Clairmonte

    I urge that we maintain perspective and look objectively at the statistics, not just subjectively at what is a tragic incident. Literally MILLIONS of miles have been driven on Autopilot safely. So the fact is that the Autopilot system has already proven itself safer than human driving, because statistically, the number of incidents that happen on Autopilot is lower than what happens when humans are in control. This is a progression of technology that began in 2015. Expect that the proficiency of the system will continue to improve exponentially. Your ‘fears’ about the system are unjustified. These sorts of one-off reports paint a misleading picture that there is a bigger problem with these systems. There isn’t. This is the solution to traffic incidents. And it’s not the future. It’s already better than human drivers and will continue to improve beyond this stage.

    • This is what, unfortunately, the masses don’t understand. The media picks a case that’s related to automated AI systems and everyone goes berserk (note this very thread), while the thousands of people who die on the roads every day are simply ignored. I wonder who would like to fly without an autopilot system on board – I know I would be scared to death, as fatal aviation crashes are mostly related to human error.

      • Jason Clairmonte

        I couldn’t have said it better.

      • Mike anonymous

        The reason “automated AI systems” in aircraft are able to work is due NOT ONLY to a wide array of higher-powered sensors than the ‘lite versions’ seen on most vehicles today, but also to the fact that aircraft are managed and provided with information on where EVERY OTHER AIRCRAFT in the sky is.

        There are regulations that aircraft manufacturers and carriers/airlines MUST meet. Autopilot is able to work thanks to a system, run largely by humans on the ground, that creates an interlinking network which (again) allows every aircraft not only to know where other aircraft are, but also to know the corridors in which they travel, known as flight paths.

        The skies would be a lot less safe if other objects such as drones (quadcopters), flocks of birds, toy airplanes (or, in the case of the automotive industry, automobiles), and anything else you could think of that is not hooked up to this co-human-and-AI-run network were flying around. There is a reason planes need to reach a certain cruising altitude: it gives an AI, limited by the processors it has to manage and interact with so many systems, fewer variables to worry about, especially since it knows where all other aircraft are and will be. Following distance, cruising altitude, and other factors are all part of a properly functioning, AI-run autonomous aircraft computer program.

        I have seen you around here, buddy @burjkhalifa:disqus, and you seem to have genuinely good comments and thoughts on aspects of the automotive industry (and you do have a valid point), but I would refrain from comparing the current generation of localized AI systems running autonomous vehicle programs to the far more advanced human/co-AI semi-autonomous network of aircraft. While the two feature similar technology, they do not operate in the same fashion, nor are they run in the same fashion.

        Comparing it to cruise control mixed with lane-keep assist, run/managed by an AI, would be more accurate, as autonomous driving systems such as Tesla’s (in my personal opinion, misleadingly named) Autopilot have neither the same freedom from outside variables nor the same level of ‘connected network’ that those within the aircraft industry use.

    • Some common sense at last. Well said.

  • Matthijs

    Not the car but the driver is always responsible for his driving behaviour. You are not a passenger when you drive a car with Autopilot. But of course, in the US it’s always about suing a company…

  • “…Huang believed the crossover was effectively crash proof…” Honestly, did they really think that?

  • Stephen G

    “Safety systems” that are only so-called safety systems, such as airbags and anti-lock brake systems, should not be allowed on vehicles. All these things create inattentive drivers, add unnecessary expense, and have done little to curb death rates. If you don’t want to “drive” your car (properly), take a bus.

  • Jason Clairmonte

    So let me get this straight. You’re saying that an algorithmically based, machine learning program, assimilating the complete suite of driving habits of hundreds of thousands of humans, 24 hours a day, 365 days a year over millions of miles, in every conceivable situation, with the mechanically implemented, inherent ability to see in every direction and beyond immediately forward obstacles, that has the capacity to literally make millions of calculations per second, will never be better than a human driver with 2 eyes, 2 hands, 1 foot, 2.5 kids, a dog, a spouse, a mortgage, a stressful job and one brain?

    • Mike anonymous

      Hello @jason_clairmonte:disqus, it seems as though maybe you should read back through the post (if you didn’t do so already). Maybe I did not do such a great job of detailing AI itself and how exactly a system is coded and implemented, its capacity, and the capabilities and/or restrictions of those capabilities (etc.), but I believe it is more likely that you may not have fully understood the information provided in the post above (which is OK, there’s nothing wrong with that). Hopefully what I’ve listed below will better shed light on your question.

      In regard to your statement, and based on the information provided (speaking as someone who works in relation to AI and the programs overseen by AI), I would say that yes, “it will never be better than a human driver”, or as I stated:

      While it has the capacity to learn, it does not have the ability to do so, and it cannot learn information it is not provided with.

      With AI you have to teach it to understand and speak a language, teach it to understand a command, teach it to drive, teach it to watch out for pedestrians, teach it to make a decision (yes, you have to code into an AI the fact that it needs to make a decision; remember, ‘blank slate’), etc.

      If you’ll read through the information provided in the post above, hopefully it will make more sense, but your statement overlooks a great many limitations of:

      CAPABILITY. An AI cannot ‘see’ better than the ‘eyes’ it was given; it cannot ‘think’ or ‘process information’ beyond, or any faster than, the ‘brain’ (processor) it was given; and it cannot control anything beyond the ‘body’ it has (that which it is connected to, be it a computer device or a CAR).

      … Lastly, regarding the final aspect of your statement (which I will admit gave me a bit of a chuckle with the last part, lol; what happened to the other half of the third kid? Maybe you’re referring to a small child), I believe that your focus as a driver should be on the road, not on 2.5 (lol) kids, a dog, a spouse, a mortgage, or a stressful job.
      Your brain is far more capable than any computer processor (especially the processors used for AI), so I think your focus should be on the road so you can better protect your stressful job, mortgage, spouse, dog, and 2.5 kids.

      • Jason Clairmonte

        I have to admit, I am floored. I have absolutely no experience with AI, so I have to defer to your expertise. But I am dumbfounded as to how to accept your explanation. Don’t get me wrong, I think I understand the core premise of your argument: that AI is inherently limited by the tools and language given to it, which are in turn inherently limited by the humans developing them. However, I would have thought that 1. the synergistic aspect of the algorithm, technology and computational capability would at least provide faster reaction times and a higher likelihood of the ‘correct’ decisions being made, and that 2. you, who work in the industry space, would be convinced that the tech being implemented in the industry now will continue to evolve and improve beyond human capability. I would have thought that, together, these two things would imply an eventual superseding of humanity in terms of proficiency in this area.

        So, as a follow-up: do you think that, in the future, AI would be able to be more proficient and safer than humans at the total operation of aircraft and air traffic control, or do you think the same will hold in that arena as well, that humans will not be surpassed? Feel free to continue this discussion with me directly at [email protected]. I don’t think we should continue to hog this forum. And I really need to know. (The 2.5 kids statistic was what I understood to be the average number of kids per household in the US, so I just used it facetiously.)

        • Mike anonymous

          Sure, I will give you a ‘hoot’ over email.

          I generally do not work directly with coding, as I am personally more passionate about design (which is something you may notice if you see me/my posts around this site).
          Although I generally do not work with coding, I do have to understand it for hardware and software design (aesthetics as well as functionality) and engineering (pretty much all areas and aspects relating to design across many different industries (tech, cars, etc.), amongst other things). I have worked in relation to AI systems in the past, but it is not what I generally do.

          Please allow me at least 7 days to reach out to you regarding your question. I have some personal thoughts on it that I think may interest you, but I would prefer to be fully focused when writing an email. I have a bit of work (design and personal things) to do this week, so getting a message out to you is on my list of things to do. I will have something out to you soon.
