AI pattern matching is a black box. You can’t know how a decision is made until it’s too late to unmake it. A private surveillance firm plugging AI into policing doesn’t democratize safety or create objectivity; it accelerates suspicion based on existing grievances.
Except when it’s designed to suspect nothing. Flock’s response to controversies about privacy has included supposed “transparency” features, as well as tools that it claims will enable “public audits” of searches and results. And if your small police department that’s turned to Flock as a “force multiplier” doesn’t have the staff to run audits? No worries: “To support agencies with limited resources for audit monitoring, we are developing a new AI-based tool.… This tool will help agencies maintain transparency and accountability at scale.” Using an AI to monitor an AI is a level of absurdity Philip K. Dick never quite got to. Maybe someone can write a ChatGPT prompt for a novel in his style.
I think Dick would recognize another irony: AIs surveilling AIs surveilling us sounds like a dispassionate threat from without, but the ghost in the machine is that we cannot scrub away the passions and resentments that incite the obsession to begin with. The paternalism that launches the drone for our good doesn’t curb the risk that something will go wrong. When you use sophisticated technology to pursue vengeance, you are not elevating the action to a cause. Involving an AI doesn’t make violence an abstraction. An automated vigilante isn’t impersonal, just efficient.
Your tax dollars pay for them; the data they collect should be public. When given the choice between making the data public and shutting the system down, they seem to opt to shut it down. This is what seems to be happening in Washington state after courts ruled in favor of public access.
Police get caught running searches on their ex-girlfriends. Police are barely better than gangs, and they’re using sophisticated tech to fuck with their local communities. Brilliant. Add the CEO of Flock to the list of personae non gratae.
Beyond controversy around the Texas self-managed abortion case, Flock has had to respond to evidence that local law enforcement agencies have used their data to assist Immigration and Customs Enforcement. It has now offered assurances that jurisdictions proactively banning data sharing related to immigration status or abortion seeking will be excluded from national searches, as long as the local yahoo in tactical undershorts is dumb enough to put “ICE” or “abortion” in the required reason field.
But it turns out that once you’ve built a massive distributed surveillance network, it’s hard to rein in its use. The state of Washington explicitly bans sharing data or equipment with federal officers for the purpose of immigration enforcement, yet the University of Washington found dozens of examples of exactly that. Some local departments explicitly opened up their Flock data to the feds despite the state law; others had their information siphoned off without their knowledge via an unspecified technological error.
The university study and an investigation by 404 Media found another category of information sharing that also subverted state attempts to fend off immigration overreach: federal officers just asking really nice if the local guy could run a search on their behalf and the local guy happened to use “ICE” or “ICE warrant” or “illegal immigration” in the local search (tactical undies recognizes tactical undies, you know?). Worth noting: A local officer well informed about jurisdictional data-sharing limitations would just not enter “ICE” as the reason for the search, and we have no idea how many of those cannier cops there are.
We have this built-in safety net that makes every user list the reason they accessed the data.
Reason for search: Not ICE
Checks out.
Already terrified? It gets worse: Flock is turning over more and more of its monitoring to AI, a feature that Flock (and the entire technology-media industrial complex) sells as a neutral efficiency. But the problem with AI is how deeply human it really is—trained on biased data, it can only replicate and amplify what it already knows. Misogyny and white supremacy are built into surveillance DNA, and using it to search for women seeking abortions or any other suspected “criminal” can only make the echo chamber more intense.
This month, an AI-powered security system (not Flock, surprisingly) tossed out an alarm to a school resource officer, and he called the police to the scene of a Black teenager eating chips. The teen described “eight cop cars that came pulling up to us [and] they started walking toward me with guns.” You can fault the resource officer for not clocking the chip bag; at least we know the point of failure.
Using an AI to monitor an AI is a level of absurdity Philip K. Dick never quite got to.
Dr. Seuss did, though!
“The thing that we need is a bee-watcher-watcher!” Well, the bee-watcher-watcher watched the bee-watcher. He didn’t watch well, so another Hawtch-Hawtcher had to come in as a watch-watcher-watcher!
This is just normal “surveillance capitalism.” ALL corporations feed on our data; they ALL collect it, sell it, and infringe on our privacy, ALL the time - nothing new here…