Comment by efitz
A lot of people worry about a Terminator-style AI apocalypse. I don’t.
I worry that we’ve already created the AI apocalypse and that this is what it looks like, along with the amplification of extremism on social media.
I trust AI to be what it is: essentially a lot of math that classifies and predicts stuff, usually words. The predictions can be used generatively, and the classifications can be used to identify things in various media.
What I don’t trust is that people will use it responsibly. Hell, I don’t, when I’m vibe coding, but that’s on me.
People are venal and self-absorbed and busy and lazy and all the other traits that lead to not using AI responsibly. And businesses are amoral (not immoral) and want the shortest path to revenue, with the least friction.
So of course police officers, who want to be on patrol and did not sign up to spend countless hours on reports, are going to let the AI write them and call it good without proofreading.
We could pass a lot of laws trying to specify product requirements that force police to behave responsibly. Or we could just pass a law saying AI cannot be used to write police reports, but that clearly labeled, AI-generated transcriptions and summaries may be attached unedited to police reports, if and only if the original recordings are also preserved as evidence.
And police departments that keep body camera and car camera footage might ease up on report writing entirely, only requiring officers to annotate the footage with their impressions, and otherwise letting the record speak for itself.