
Cellebrite Dumps AI Into Its Cell Phone-Scraping Tool So Cops Can Hallucinate Evidence

from the why-does-anyone-need-this? dept

I honestly don’t understand this compulsion to break things that are already working fine. Axon makes body cameras (and Tasers!), but it simply wasn’t enough to equip cops with cameras and cop shops with expensive service contracts. No, the company insisted the way forward was dropping AI into existing tech so robots could start doing the boring cop paperwork.

Cellebrite makes tools that crack seized phones and scrape everything out of them for perusal by investigators and prosecutors. But just being able to do that wasn’t good enough. Now, as Joseph Cox reports for the always-essential 404 Media, Cellebrite is going to make its existing product more chaotic by adding “intelligence” better known for what it gets wrong than for what it gets right.

Cellebrite has introduced AI specifically into Guardian, which is a software-as-a-service “evidence management solution,” the company says. In practical terms, Guardian is a piece of software for analyzing evidence already in a police officer’s possession.

According to Cellebrite’s February 6 announcement, the company’s generative AI capabilities can summarize chat threads “to help prioritize which threads may be most relevant,” contextualize someone’s browsing history to show what was searched for, and build “relationship insight.”

Well, that’s no good. The first problem is that AI isn’t exactly great at contextualizing data or conversations. Other problems can develop based on the prompts investigators feed it. At some point, someone’s going to get hallucinated right into a lengthy prison sentence they haven’t earned. It’s not a matter of “if.” It’s a matter of “when.”

But the most immediate problem is this: cops are already looking at this AI as a way to sniff out criminal activity they’re not even investigating. The press release from Cellebrite contains a quote from a police official who heads a force overseeing a town with [squints at Wikipedia page] 1,365 residents.

“It is impossible to calculate the hours it would have taken to link a series of porch package thefts to an international organized crime ring,” said Detective Sergeant Aaron Osman with Susquehanna Township, Pennsylvania Police Department, who recently piloted the solution. “The GenAI capabilities within Guardian helped us translate and summarize the chats between suspects, which gave us immediate insights into the large criminal network we were dealing with.”

It is impossible to calculate. It’s also apparently impossible to report. There doesn’t seem to be any information about this major break (in what initially appeared to be a minor case) anywhere on the PD’s press page, much less anywhere else on the internet. I’m not saying the detective is lying. Lying is usually done to serve the person doing the lying, not some third party that probably assumes no one but other PR people is reading its press releases. But I can’t find anything that supports the assertion.

If anything, Sergeant Osman is probably overstating the results of the GenAI-assisted phone search, if only because he’s flattered someone from Cellebrite thought a small Pennsylvania town would be the best place to do a trial run of its new tech.

The largest problem isn’t the AI itself, though. It’s what the AI does, which has the potential to generate constitutional collateral damage. Performing a targeted search via human interaction is one thing. Allowing software to just go blundering around in the scraped contents of a seized phone is quite another, as ACLU lawyer Jennifer Granick stated to 404 Media:

“The Fourth Amendment does not permit law enforcement to rummage through data, but only to review information for which there is probable cause. To use an example from the press release, if you have some porch robberies, but no reason to suspect that they are part of a criminal ring, you are not allowed to fish through the data on a hunch, in the hopes of finding something, or ‘just in case.’”

That’s a problem courts will need to confront, and chances are that won’t happen any time soon. There’s almost zero chance magistrate judges are being informed AI will be used to search seized phones when cops request search warrants. And there’s even less chance defendants will be informed the search of their phone was half-algorithm, especially when doing so might give them the ability to challenge the evidence being used against them.

When it does finally bubble to the judicial surface, will courts consider AI-assisted searches just another version of “inevitable discovery”? Or will they see this for what it is: something clearly not predicted by the creators of the Fourth Amendment, nor covered by current court precedent?

Filed Under: 4th amendment, 5th amendment, ai, cellebrite, hallucinations, phone searches

Companies: cellebrite

