U.S. safety regulators have upgraded their investigation into Tesla’s advanced driver assistance system (ADAS), Autopilot. The National Highway Traffic Safety Administration (NHTSA) has now opened an Engineering Analysis into the safety of the feature.

The agency first started down this track in August 2021, following a series of crashes in which Teslas struck emergency vehicles stopped at the side of the road. At that time, it opened a Preliminary Evaluation (PE) into the system. During a PE, the agency gathers information from drivers, automakers, and other sources to determine whether a recall is warranted, whether the case should be upgraded to an Engineering Analysis, or whether it should simply be closed.

In this case, the agency found that “the crash review identified patterns in system performance and associated driver behavior across different sets of circumstances that enable the agency to identify areas of engineering inquiry that warrant an upgrade of this Preliminary Evaluation to an Engineering Analysis (EA),” according to NHTSA documents.

Read More: U.S. Safety Regulators Open Formal Investigation Into Tesla Autopilot After Crashes

In addition to the 16 Tesla crashes involving first responders that launched the investigation, NHTSA examined another 191 collisions involving Autopilot. Of these, 85 were excluded because they involved external factors, such as the actions of other vehicles. In roughly half of the remaining 106 crashes, the agency found that the driver’s operation of the system (responding too late to prompts from the vehicle, or responding incorrectly) was a primary factor in the accident.

Another quarter of the accidents involved drivers attempting to use Autopilot in an environment where, according to Tesla’s owner’s manual, system limitations might exist. Notably, in every accident for which detailed information was available, the driver’s hands were found to be on the wheel in the final seconds before the crash.

Similarly, in the 16 accidents involving first responders, only two of the drivers received driver engagement prompts within five minutes of the collision. That suggests, NHTSA says, that drivers largely are not deliberately misusing Autopilot or sleeping at the wheel.

“A driver’s use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect,” writes NHTSA. “For systems labeled as SAE Level 2 ADAS, important design considerations include the ways in which a driver may interact with the system or the foreseeable ranges of driver behavior, whether intended or unintended, while such a system is in operation.”

The regulator adds that finding effective ways to keep drivers engaged, so that they can properly perform their supervisory driving task, is an “important safety consideration.”

From here, NHTSA will gather more data, perform vehicle evaluations, and explore the “degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”

The Engineering Analysis, which NHTSA typically aims to complete within a year, ends either when the regulator determines that no safety concern exists or when the automaker initiates a recall.