Despite what the name might suggest, Tesla’s controversial Autopilot tech offers driver assistance, not fully autonomous driving. But apparently nobody told that to this guy.

The video, hotly debated in the online corridors of Reddit, shows a man named Eric Dogen relying heavily on Autopilot in a new Model 3.

The technology’s limitations are laid bare when it fails to recognize a traffic signal and runs straight through a red light. Rather than intervening with the brake pedal, the driver, who clearly sees himself more as a passenger in a self-driving car, simply lets it sail through the intersection, blaming the vehicle rather than himself (or counting his lucky stars that he wasn’t hit by cross-traffic).

It’s not the first time we’ve seen a semi-autonomous vehicle run a red light with no human intervention. Intel’s Israeli self-driving acquisition Mobileye recently drew a measure of unfortunate attention when a prototype Ford Fusion/Mondeo equipped with its sensors did the same while cruising the streets of Jerusalem during a press demonstration, cameras rolling. But that was a prototype in (or, as the case may be, out of) the hands of the company’s own engineers. This, on the other hand, is a production vehicle operated by a customer in real traffic.

So who do you think is to blame in this instance: the vehicle’s manufacturer, or the driver who posted the video? Whichever side of the debate you fall on, one thing’s for sure: this won’t be the last controversy of its kind surrounding semi- and fully autonomous tech in the years to come.