- The Department of Transportation plans to use AI to write rules.
- Google Gemini could draft regulations in just a few minutes.
- Human-led rules take months or years, raising safety concerns.
Artificial intelligence is far from perfect, so alarm bells started ringing as soon as reports surfaced that the Trump administration was planning to use AI to write federal regulations. That concern appears warranted, as a government lawyer reportedly argued the agency doesn’t need perfect rules, just ones that are “good enough.”
ProPublica broke the story, reporting that the Department of Transportation is looking to use artificial intelligence to “revolutionize the way we draft rulemakings.” The technology was pitched as a huge benefit, one that would enable government employees to do their jobs better and faster.
While artificial intelligence can do a lot of things, the publication reported the DOT’s general counsel, Gregory Zerzan, seemed to care more about quantity than quality. According to the publication, he said “We don’t need the perfect rule on XYZ. We don’t even need a very good rule on XYZ. We want good enough. We’re flooding the zone.”
That sounds like the government is planning to churn out AI slop, only in the form of federal regulations. That seems like a terrible idea, especially for a federal agency so heavily involved in safety.
*A Gemini-created draft rulemaking on hover cars*
However, the chief draw appears to be speed. The publication noted that writing and revising federal regulations can take months or even years, while a version of Google’s Gemini can reportedly cut that time down to seconds or a matter of minutes.
Zerzan reportedly said that, with the technology, “it shouldn’t take you more than 20 minutes to get a draft rule out of Gemini.” This is said to be part of a larger effort to take proposals from an idea to a “complete draft ready for review by the Office of Information and Regulatory Affairs in just 30 days.”
The publication also mentioned a presentation suggesting that Gemini could be responsible for writing roughly 80% to 90% of regulations. Humans would handle the rest and, presumably, be left to check for hallucinations and errors.
Needless to say, some employees and former officials have concerns about the use of AI. Key among them was Mike Horton, who told the publication that using Gemini to write regulations is like “having a high school intern that’s doing your rulemaking.”
The whole story is worth a read, but only time will tell whether using AI to create regulations is a good idea or a bad one. In the end, it’ll likely come down to how involved people remain in the process to make sure the rules make sense, protect safety, and follow established practices. That being said, aiming for “good enough” doesn’t sound very reassuring.

