Tesla Model X Driver Killed In California After Crashing Into Barrier With Autopilot Engaged

A Tesla Model X driver has been killed in California after the all-electric luxury SUV slammed into a barrier on U.S. Highway 101.

The crash and subsequent fire occurred earlier in the week; it involved two other cars and killed 38-year-old Walter Wong, an Apple engineer, who was driving the Tesla. The electric automaker has now confirmed that Autopilot was engaged at the time of the crash, Reuters reports.

In a blog post, Tesla said: “Autopilot was engaged with the adaptive cruise control follow-distance set to minimum.” In the seconds leading up to the fatal impact, the driver received several visual warnings and an audible hands-on warning instructing them to put their hands back on the steering wheel.

“The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken,” the blog post continued.

Tesla hasn’t said why Autopilot failed to detect the concrete divider.

Authorities open investigation

Both the National Highway Traffic Safety Administration and the National Transportation Safety Board are investigating the fatal crash.

The first confirmed fatal crash involving Autopilot occurred in mid-2016. Upon investigating that incident, the NTSB said Tesla should have taken further steps to prevent the system’s misuse.

In the aforementioned blog post, Tesla admitted that Autopilot is not perfect, but asserted that it makes driving safer.

“Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”

This tragic incident comes less than two weeks after a Volvo XC90 operated by Uber in autonomous mode struck and killed a woman cyclist in Arizona, leading Uber, as well as Toyota and Nvidia, to halt their autonomous testing programs.

Read: Experts run simulations, say Uber tech should have avoided the accident

Wong, who was taken to a hospital but succumbed to his injuries, was a father of two. Our thoughts and deepest condolences are with his family.

  • An Existing Person

    I still believe the reason behind these accidents is Tesla marketing the system as “Autopilot”, which leads anyone to believe it is capable of driving itself, when it has many faults and is still a work in progress.

    • Honda NSX-R

      I agree

      • bloggin

        It’s ONLY those without a Tesla who don’t use Autopilot that would think the vehicle can drive itself. Owners must acknowledge that it is not self-driving and that they must maintain control of the vehicle when activating the system, and they get visual and audible warnings when their hands are off the steering wheel for defined periods of time. There is no confusion among actual owners; they just ‘choose’ to ignore the warnings, fall asleep, etc., and suffer the consequences.

        Also, the article missed the fact that the crash attenuator that was supposed to diminish the impact of hitting the barrier was missing/damaged from a previous accident, where it had helped save that person’s life.

        • Sébastien

          They choose to ignore them because they become too confident, which is the problem, as the system is still far from perfect or fully autonomous.
          So it’s still a marketing/communication/tech issue on the manufacturer’s side, e.g. it should beep the second you remove your hands.

          • Moveon Libtards

            GM’s Super Cruise system doesn’t have any of the Tesla problems because it is much more intelligently designed and not over-hyped like Tesla’s faulty system.

          • Jay

            No, they give warnings which owners pay no attention to, because the system does work really well most of the time.

          • Sébastien

            You’re just writing the same stuff I wrote in a different way…
            pay no attention = ignore
            work really well most of the time = works fine frequently

          • Jay

            You prefer I copy word for word or put it in my own words? It wasn’t on purpose but some say great minds think alike.

          • Sébastien

            LOL, misunderstanding then, your “No…” made me think you started a counter argument.
            Up/down votes are great too

          • Jay

            Well I said no because I didn’t fully agree with you. You blamed Tesla for the miscommunication, I blame the consumer for not having the common sense to babysit the beta technology. Tesla gives warnings.

          • Sébastien

            Warnings are the cheapest/worst protective measures… Better to prevent bad behavior with technical features (e.g. no hands on wheel = no AP).

          • Jay

            That could work, but then the car would have to safely pull over on its own until the driver takes over again.

          • alexxx

            Most of the time is not enough…

          • Jay

            I know. Many people here don’t care about putting everyone else at risk.

        • LeStori

          The less you have to do when “driving” your car, the more likely you are to lose attention and/or fall asleep. How often have you seen your passengers nod off when they are sitting there doing nothing? I suspect very often. It is irrelevant that you have to acknowledge that you are still in control. That is soon forgotten as drowsiness and inattention set in.

          I use standard cruise control a lot. It reduces stress in a state where speed limit tolerances are 3 kph/1.6 mph, a tolerance you cannot even determine accurately on your speedo. However, it still requires you to drive your car with full attention. On one occasion I was traveling at 70 mph/110 kph on cruise control and a tyre carcass appeared on the road. I had a little time to determine whether I could avoid it or would hit it. I went one way. The car behind went the other; the car behind hit it. You have to be aware to react. The driver of this Tesla would have been less aware. Hence, when the car did something “wrong”, he would have spent valuable time assessing the situation before… bam. That’s it, folks.

        • Moveon Libtards

          Doesn’t excuse the fact that Tesla has a MAJOR problem on its hands with regard to safety, bleeding millions every day, stock dropping like a rock, and sales virtually nothing… And then a completely failed Model 3 launch.

          No amount of excuse-making or deflection will change the fact that Tesla is collapsing and their excuses (and excuse makers) are looking dumber by the day.

        • Smith

          The acknowledgement requirement is simply a US liability-avoidance clause for Tesla, but the perception and the name of the system imply that a Tesla can pilot itself when Autopilot is engaged. See how that happens? Perception vs. reality, or rather, perception IS reality!

    • salamOOn

      So it seems it’s true: in ’Murica you need stickers for everything and everybody.

      Do not dry your cat in the microwave oven.
      Objects in mirror are closer than they appear.
      Do not make coffee in your motorhome while driving; stay in the driver’s seat.
      Autopilot is just a commercial term for our more advanced cruise control system; please stay alert while driving your deadly weapon.

      • Dredd2

        Then it should be referred to everywhere as a “cruise control system” instead of that AUTOPILOT bullsht.

    • LWOAP

      Pretty much this. People seem to think it can do everything itself.

      • Jay

        That’s the problem. It gives people a false sense of security. People get comfortable thinking the system will never fail.

    • Six_Tymes

      Doubtful. Although I agree the naming scheme came too soon, this guy was an engineer; he was no idiot. He knew the limitations of the system.

      poor guy.

    • Jay

      It’s really a shame. Computers seemingly make better drivers, but they can also mistake a crack in the road for a lane line, and that overrides the fact that the car is headed toward an object.

  • HG504

    Tesla really aren’t the best at marketing Autopilot. If it’s a work in progress, why have it in the first place? Why not wait until as many glitches and bugs as possible are wiped out? Because it’s clear that this system isn’t fully ready. It’s a great system, but I think it needs refining before rolling it out.

    • Jay

      It’s really the only way of testing. We are the testers, unfortunately. The only way it can get real-world perfect is by being out in the real world.

      • dawyer

        It’s easy to saying because some people thinking he is smart than others to haven’t been unfortunate the one because all others share and fair distribution the risk of accidents. Thinking about if you put this naive idea on Drug Test!

        • Jay

          I’m sorry what?

          • dawyer

            I’m sorry you sorry what what?

          • Jay

            I didn’t understand what you were trying to say..

          • Matt

            He must be using some sort of online translator…

      • dawyer

        Lucky you aren’t trying to persuade somebody and just gossip. Do you hear? a lot of laugh comes from a science lab by your irresponsible speaks.

        • Jay

          My job gives us random drug tests, but I still don’t understand what you’re trying to say.

    • smartacus

      Agreed, this software is very much still in beta, and this death is proof positive. Are they trying to scare people off the idea of autonomy altogether?

      • Jay

        How else do you suppose they beta test?

        • smartacus

          Beta testing software doesn’t result in death.

          • Jay

            Beta testing resulted in this guy’s death.

          • smartacus


          • Jay

            Thanks for agreeing.

          • smartacus



    “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken,”

    That’s what bugs me. Why didn’t the driver do anything or take back control of the vehicle when the warning systems told him to?

    • Jason Miller

      Yep. Both of those items make it 100% the driver’s fault, in my opinion.

    • Nordschleife

      Because the driver probably assumed that the system would correct itself. Tesla has put out warning after warning from previous incidents regarding Autopilot, in both their marketing and the car itself, and sadly some people still think the name “Autopilot” means it can do everything on its own. I don’t blame the name or the technology. I blame people for not seeing the signs and reacting. May he rest in peace.

  • nastinupe

    He was on his iPhone

  • Six_Tymes

    Maybe he fell asleep?

  • Bill Nguyen

    Man… I would never turn on this self driving shit. Why drive a car and not drive it? Just take a damn taxi!

    • LWOAP

      Taxis are kind of losing out to Uber, Lyft, etc….

      • Bill Nguyen

        Yeah, I know my man. I drive Uber hehe. When I said taxi I meant taxis, ride-sharing, etc. Seems dumb to just turn on autopilot and go to sleep.

  • jaykit

    Ironically, the same can be said for TSLA stock.

  • Nick099

    Uh oh…better get the Kleenex for all the Tesla fanboys and girls.

    Let the whining and the excuses begin!

  • Moveon Libtards

    A little pain and suffering for only $150k.

    Thanks, Tesla!

  • Astonman

    The problem I see with Autopilot now is that if I’m not as involved in driving, I would fall asleep. I tend to do that on long trips of an hour or more when I’m the passenger. I’m wondering whether the Tesla’s system tried to inform the driver and nothing happened because he was asleep.

  • LJ

    People barely pay attention when they’re driving. They’ll never pay attention when the car does the driving.

  • AnklaX

    Anyone who has any idea how computers work, or has played racing games for long enough, knows that an AI driver can’t be relied upon.

    • Jay

      One reason why I think they should give this crap up. Sure, it will get better, but only after all of these deaths.

      • Status

        I’m sure the exact same words were uttered when the first car accident fatalities occurred. If engineers can invent innumerable safety technologies found in cars today, then they can properly code the software for an autonomous car.

        • Jay

          Won’t happen. All tech is babysat by humans.

  • Vassilis

    Tesla needs to start accepting responsibility for what they’ve done with Autopilot. They have released an autonomous driving system, named it inappropriately, and slapped on a bunch of disclaimers and conditions for it to work correctly.

  • Jay

    Accident isn’t funny but that made me lol.

  • Beelzebubba

    Very easy to doze off when the car is doing the thinking for you. No question in my mind that that’s what happened here. Five seconds just wasn’t enough time for the driver to wake up and assess the situation. These systems should not be marketed without some sort of nodding-off sensor installed as well. You can monitor the driver’s eye movements pretty accurately with a camera and a computer.

  • Smith

    Nice of Tesla to blame the driver for the fact that the Tesla Autopilot did not detect the concrete barrier and stop the car. That seems to be the driver’s fault… for trusting that the piece of sh1t Tesla would save his life or at least try to avoid the barrier. Stupid driver, right Tesla? Yeah, right!

  • alexxx

    Very much…

  • Jay

    That’s true. So it wouldn’t make sense for us to use it without keeping an eye out and staying alert. Even Elon Musk knows that.
