Police reports written with advanced tech could help cops but come with host of challenges: expert

Axon's Draft One, an AI-powered software tool, can analyze audio from police officers' body-worn cameras and generate police reports. Fox News contributor Paul Mauro weighs in on the new technology.

Sep 24, 2024 - 08:00

Several police departments nationwide are debuting artificial intelligence that writes officers' incident reports for them, and although the software could cause issues in court, an expert says, the technology could be a boon for law enforcement.

Oklahoma City's police department was among the first to experiment with Draft One, an AI-powered software that analyzes police body-worn camera audio and radio transmissions to write police reports that can later be used to justify criminal charges and as evidence in court.

Since The Associated Press detailed the software and its use by the department in a late August article, the department told Fox News Digital that it has put the program on hold. 

"The use of the AI report writing has been put on hold, so we will pass on speaking about it at this time," Capt. Valerie Littlejohn wrote via email. "It was paused to work through all the details with the DA’s Office."


According to Politico, at least seven police departments nationwide are using Draft One, which police technology company Axon built to work with its widely used body-worn cameras.

Paul Mauro, a former NYPD inspector turned attorney, said he "never met a cop who liked paperwork" and that each report takes at least a half-hour to write, depending on the officer.

"You have to do multiple reports sometimes, a complaint report … the arrest report … then vouchers for the property you intake," he explained. "Then there could be other reports, request for narcotics analysis, intel reports, juvenile reports, etc." 

"That's why reports can be so onerous," he continued. "There's a report for everything." 

Depending on what department an officer works for, they could write anywhere between dozens and hundreds of reports per year, Mauro said.


But with AI technology providing a template for those reports, officers may have more time for policing. It is important, however, that they review the generated results for mistakes or "AI hallucinations," the term for the incorrect or misleading output that AI software occasionally produces.

"If cops get lazy and are not in a position to adopt what's written in the report, even if they check the box at the bottom [indicating that they reviewed the report]. I absolutely could see in court situations where the cop says, ‘Look, I was really busy. I scanned the report and checked the box. I missed the fact that the report said we were looking for a Hispanic male when we arrested a White male,’" Mauro said. "But to put it in perspective, you get that with people, too."

However, he said, every police department requires that reports be reviewed by the officer's supervisor, and that extra layer of assurance isn't going away.

AI-written reports, Mauro said, could be more consistent, and the software could likely be used for analysis that would take a person months to complete.


"When with drop-down menus and things like that, every police report has some free-form to it," Mauro said. "If you have a chatbot standardizing this, you're going to be able to use that same chatbot, because they use this kind of stuff now, to analyze patterns.

"The AI can look for commonalities. It may take a person six months or a year poring through mountains of data scattered disparately throughout the United States. [But] a chatbot crawling through the NCIC (National Crime Information Center) database that has access to reports from all these different states could identify commonalities between these different cases very quickly, save a lot of time and be accurate."

However, Mauro said, the software should initially be used for smaller property crimes and misdemeanors while departments work through the kinks and "throw their lawyers at it right from the start."

"You want to make sure it's implemented fairly, legally, et cetera, but you also need to look for the tactics that will be used to gum it up," Mauro said.

He compared the onset of AI technology in police work to the adoption of Axon's body-worn cameras, which are now a norm in policing.

"The bodycam program in New York City, when they put bodycams on the cops in New York, they really resisted it. The anti-police activists wanted it, so they did it. [But] it ends up supporting the police officer's version of events more than 90% of the time and has become something that the unions very much endorse," Mauro said.

Likewise, the standardization of automated reports could support officers.

Another potential issue is the "CSI Effect," where improvements in DNA analysis and the prevalence of police shows have made juries wary of convicting offenders without DNA evidence, even "on the most obvious crimes," Mauro said.

"I could see there being an AI effect, where if the AI doesn't fully support what you're saying despite the fact that, as a cop, what you're saying is accurate, especially in the realm of a false positive. You have this bias toward saying, 'Well, the AI doesn't get it wrong, it's a computer.'"

Through a survey, Politico also found that police departments have no way of differentiating AI-generated police reports from those written by human officers once they are entered into their systems. However, Axon told the outlet that it has access to this information and will share it with law enforcement upon request.

Axon could not immediately be reached for comment.