(TNS) – Connecticut prosecutors and police chiefs have put the brakes on the rapid expansion of artificial intelligence-powered policing tools until the emerging technology is tested and rules for its use are established.
At the top of the list of concerns that led to the self-imposed delay is the increasingly aggressive marketing, to police agencies in states such as Connecticut, of software that creates police reports from audio recordings collected by body cameras worn by officers.
Supporters of the technology, many of whom work in law enforcement, predict it will improve police efficiency and, in turn, public safety by allowing officers to spend more time on patrol and less time at their desks writing reports that form the basis of prosecutions.
Some have expressed skepticism about applying evolving, arguably flawed, and still relatively untested AI technology to criminal justice, where errors could have far-reaching societal implications. They point to highly publicized AI failures, including one in which a police officer’s body camera during a traffic stop in Utah recorded the movie “The Princess and the Frog” playing in the background, producing a report that said the officer “turned into a frog,” among other things.
Chief State’s Attorney Patrick Griffin, with support from the Connecticut Chiefs of Police Association and state police, suspended the use of AI programs “to draft, create, and/or narrate crime reports,” allowing users to test the software, identify flaws, and establish rules for its use.
“There is little doubt that this technology will lead to increased operational efficiencies for police departments and ultimately provide cost-saving benefits to our communities,” Griffin said. “Nonetheless, the use of AI must be implemented in a way that promotes public confidence in our nation’s criminal justice system. Before adopting policies regarding the use of AI or training police officers on this topic, it is important to fully understand both the benefits and drawbacks of using AI in law enforcement.”
The moratorium was imposed, at least in part, in response to concerns from hundreds of defense attorneys in the state Public Defender’s Office, many of whom question whether computer programs can accurately capture police officers’ impressions of hectic, chaotic and emotionally charged crime scenes.
Earlier this year, the public defender’s office proposed a bill that would have sharply regulated the use of AI report-writing technology, but the bill died after the moratorium was imposed. Among other things, it would have required police departments to clearly label AI-generated reports and required the officer credited with a report to review, sign, and certify each page for accuracy.
The proposed bill would also have required law enforcement agencies to keep all drafts of AI-generated reports, allowing reviewers to track corrections of computer-generated errors. Additionally, the proposal contained language limiting software developers’ ability to sell computer-generated police reports in Connecticut.
Senior state law enforcement officials have generally been reluctant to discuss AI technology, saying any plans will be put on hold pending the outcome of investigations conducted during the moratorium.
Groton Police Chief Louis J. Fusaro, president of the Connecticut Association of Chiefs of Police, said he believes fewer than five departments in Connecticut are “considering the use of AI to generate reports.”
One of those is the New Haven Police Department, which said in a written response to questions that it purchased a report-generating AI program called Draft One from Axon Enterprises, an Arizona-based industry leader in law enforcement software. Department spokesman Christian Brookhart said New Haven is testing the software on low-priority calls in which no arrests are made.
“I haven’t personally used it, but in general, Draft One uses audio from an officer’s body camera to generate a draft that the officer can use as a template for their final report,” Brookhart said. “We’re not ready to adopt AI reporting yet. This technology is still very new to law enforcement. Several statewide meetings will be held to discuss the issue, with input from state’s attorneys.”
Meriden police reportedly also purchased AI report-generating software, but the department did not respond to inquiries.
State police are an outlier among law enforcement agencies when it comes to actively discussing AI. The department has signed a 10-year, $120 million contract with Axon that gives it options to purchase an array of high-tech equipment, from drones to non-lethal Taser weapons that activate body cameras when pulled from their holsters.
The Draft One AI reporting software represents a relatively small portion of the state police contract.
Capt. Ryan Maynard, who reviewed the department’s various purchasing options, said Axon’s AI report-writing software reduces or eliminates computer-generated inaccuracies by requiring or allowing officer input at multiple points in the report-writing process. He said it generates reports only from audio collected by body cameras, preventing the kind of fictional “hallucinations” that sometimes appear in popular consumer AI programs with access to unlimited data.
Earlier this month, the state Supreme Court heard a case in which commercially available AI legal-research software produced appellate briefs containing “hallucinated,” nonexistent citations.
“That’s the really important part,” Maynard said of the Axon reporting software. “It doesn’t fill in the gaps. If it wasn’t there, it wouldn’t be documented in the report.”
Before uploading completed reports to the state police evidence system, which was also developed by Axon and purchased by the state police under contract, officers must certify that they have reviewed the reports.
Axon is the largest provider of body-worn cameras to U.S. law enforcement, with 30% year-over-year growth, according to analysts. Company officials would not respond to requests for interviews.
As artificial intelligence permeates every aspect of life, many believe it is only a matter of time before it becomes part of the law enforcement toolkit.
“I think there is some degree of inevitability,” said Michael Lawlor, former co-chair of the state legislature’s Judiciary Committee and now an associate dean at the Henry C. Lee College of Criminal Justice and Forensic Sciences at the University of New Haven. “It’s being used in every aspect of commerce right now. So I think it would be foolish not to see how it can be incorporated into criminal justice… It’s good to be cautious. But I can’t imagine anyone saying having an AI write something in a police report is completely unacceptable.”
As the state’s top prosecutor and head of the state’s Division of Criminal Justice, Griffin could have the final say on whether to adopt AI-generated police reports. And last week, he wasn’t persuaded.
“I would say my concerns are pretty specific,” he said. “An officer’s police report isn’t just based on what they heard. It’s based on what they heard, saw, and smelled. It can be based on their fears, their interpretations, and many other things.”
Report-writing technology that simply picks up sound and creates a report, he suggested, is incomplete and “potentially problematic.”
“Proponents of this type of technology will say that it improves or facilitates police efficiency because entering police reports takes time and takes officers off the road and away from their duties.
“My answer is, we’re not just looking for efficiency, we’re looking for accuracy. And under current technology, officers will effectively be swearing under oath what they believe the computer has heard.
“That’s my concern. I took it to the head of the Police Association and they agreed with me,” he said.
©2026 Hartford Courant. Distributed by Tribune Content Agency, LLC.
