And It Doesn’t Work For Them
You didn’t consent to being surveilled. But your HOA did, on your behalf, at a meeting you weren’t at, for a product whose terms they didn’t fully read. This kind of surveillance doesn’t arrive in a black van or flash a government badge. It arrives as a small, unassuming box bolted to a lamp post at the end of your street, put there by your local Homeowners Association, which thought it might help with the car break-ins.
That box belongs to a company called Flock Safety.
And here’s the thing nobody told your HOA when they signed up: that camera, which your neighbours effectively paid for with their monthly fees, is now part of one of the largest private surveillance networks in American history. It feeds into a national database. It shares your movements with law enforcement agencies across state lines. It may, depending on which version of the hardware your neighbourhood got, be listening.
I’ve been sitting with this story for a while, trying to decide how angry I’m actually allowed to be before it tips into paranoia. The conclusion I’ve reached is: quite angry. Justifiably so. With receipts.
The Idea Isn’t Bad. The Architecture Is.
Let me be fair to Flock Safety for precisely one paragraph, because I think intellectual honesty demands it before I start taking the scaffolding apart.
The original pitch is not sinister. Automated licence plate readers that help recover stolen vehicles and solve property crimes. Small towns with limited police resources getting access to tools that actually work. Communities feeling a little safer. None of that is inherently dystopian.
The problem isn’t the idea. The problem is what happens when you build a “give-to-get” network… where sharing your local camera data with the national system earns you the right to query cameras in every other participating city across the country… and then you discover that nobody at the local level actually controls who can walk through the door.
That’s the architecture of Flock Safety. The company now operates in over 5,000 communities, scanning roughly 20 billion licence plates every single month. It’s not a startup doing good-faith experiments with community policing. It hit $300 million in annual recurring revenue in 2025… a 70% jump from the year before… and raised $275 million at a $7.5 billion valuation, led by Andreessen Horowitz.
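That monthly figure is easier to feel at per-second scale. A quick sanity check on the article’s own number, assuming a 30-day month (the 30-day figure is my assumption, not Flock’s):

```python
# Back-of-the-envelope check on the scale reported above:
# ~20 billion licence plate scans per month.
SCANS_PER_MONTH = 20_000_000_000
SECONDS_PER_MONTH = 30 * 24 * 60 * 60  # assuming a 30-day month

per_second = SCANS_PER_MONTH / SECONDS_PER_MONTH
per_minute = per_second * 60

print(f"{per_second:,.0f} scans per second")  # ≈ 7,716
print(f"{per_minute:,.0f} scans per minute")  # ≈ 462,963
```

Roughly 7,700 plates a second, around the clock, nationwide. That is the baseline before any of the misuse discussed below.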
This is surveillance infrastructure dressed up as neighbourhood watch. And the gap between the pitch and the reality is where things get genuinely alarming.
What the Audit Logs Actually Show
Here’s where I stop being polite about it.
When the Electronic Frontier Foundation obtained datasets representing more than 12 million searches logged by more than 3,900 agencies between December 2024 and October 2025, the patterns were unmistakable.
The first thing that should stop you cold: more than 80 law enforcement agencies used search queries containing racist and stereotypical language targeting Roma people. Searches such as “Roma traveller” or openly derogatory terms were executed without any link to criminal activity.
Let that land for a second. This isn’t one rogue officer having a terrible day. This is a documented, auditable pattern across dozens of agencies, using a feature of the system called “Convoy”… which identifies vehicles travelling together… to target ethnic minority communities based purely on identity, without the inconvenience of any actual suspicion of crime.
Agencies also logged hundreds of searches related to political demonstrations… the 50501 protests in February 2025, Hands Off protests in April, and No Kings protests in June and October.
Now, Flock’s own blog pushes back on the “mass surveillance” framing. The company notes that only 19 agencies out of approximately 3,900 Flock customers conducted searches that explicitly referenced protest activity… less than half of one percent. And technically, they’re not wrong about the maths. But I’d argue this is a bit like a hotel chain saying “most of our rooms don’t have mould” as though that’s a ringing endorsement. The question isn’t whether most of the system is fine. The question is: when it goes wrong, how badly does it go wrong, and for whom?
The answer is: catastrophically, and for the most vulnerable people in the room.
The Abortion Search That Should Disturb Everyone
If the racial profiling findings made you uncomfortable, this next section may require you to put the kettle on first.
In Texas, law enforcement used the Flock network under the pretext of a “missing persons investigation,” which in reality targeted a woman who had a self-managed abortion. A single query unlocked access to more than 83,000 cameras nationwide.
Eighty-three thousand cameras. To track one woman. For a medical procedure.
The deeper structural issue here is what privacy advocates call the “shield law loophole.” Several states have passed laws prohibiting local police from sharing data in abortion-related investigations. The problem is that Flock’s architecture doesn’t require your local police to share anything. Out-of-state officers can simply query the national network directly. The shield goes up in one state; the hand reaches through the wall from another.
Flock has since introduced keyword filters that can block attempts to search for terms related to civil immigration or reproductive healthcare where state law forbids it. They say they’re unaware of any credible case of the technology being used to prosecute a woman for reproductive healthcare.
But the Texas case suggests the filters arrive after the fact. The audits reveal that officers frequently enter vague justifications like “investigation” or “suspicious” to sidestep automated checks. You can build a filter. You cannot build away human intent.
The Federal Access Nobody Authorised
One of the more revealing patterns to emerge from the 2025–2026 investigations is the sheer brazenness of what federal agencies were doing with local data… and how little local authorities knew about it.
The Oxnard Police Department suspended all Flock Safety cameras after discovering that, despite setting security to “California only” access, a “vendor-based issue” had enabled a nationwide query feature. Federal agencies could search their data without Oxnard’s knowledge or approval.
The Ventura County Sheriff’s Office conducted an audit and found over 364,000 unauthorised accesses between February and March 2025 alone. The “National Lookup” feature had been disabled in June 2023 to comply with California law… and yet it had continued to function.
San Francisco Police Department cameras were searched by out-of-state agencies over 1.6 million times in seven months.
Flock’s response to all of this is consistent and, if I’m being uncharitable, almost comedically evasive: customers control their own data, the company doesn’t independently share law enforcement information, sharing relationships can be revoked at any time. California Highway Patrol Commissioner Sean Duryee sent a formal letter to Flock’s CEO in November 2025, reaffirming contractual prohibitions against sharing data with the federal government… and months later, audits revealed the exact sharing he’d warned against had continued regardless.
Flock says customers control the data. The data says otherwise.
Now They’re Adding Ears
If you thought the licence plate surveillance was the ceiling, allow me to introduce you to Flock’s “Raven” product.
Flock Safety is rolling out a new gunshot detection system that will also listen to human voices. The system is being advertised with the slogan “Safety you can see and now hear,” with Flock’s marketing materials showing a police alert for “screaming.”
These devices capture sounds in public places and use machine learning to try to identify gunshots and then alert police… but the EFF has long warned that they are also high-powered microphones parked above densely populated city streets.
Flock and the police departments deploying Raven will tell you it’s event-triggered, not continuous recording, and cannot monitor conversations. In Roanoke, Virginia, where 75 Raven devices have recently been approved, the police department insists the technology “activates only when it detects an acoustic signature of interest.”
Which is reassuring, right up until you ask: who audits what counts as an “acoustic signature of interest”? Who reviews the machine learning model that decides whether a raised voice outside a pub is a “distress sound”? What happens to that audio clip before it’s deleted?
There is no robust legal framework for this. The cameras arrived before the laws did. The microphones are following the same playbook.
One Illinois village trustee explained his vote to cancel the village’s Flock contract by noting that “according to our own Civilian Police Oversight Commission, over 99% of Flock alerts do not result in any police action.” That statistic is doing a lot of quiet work. It means that for every genuine alert, dozens of people are being flagged, checked, possibly visited… for nothing. And each one of those nothing-incidents is a data point that now exists about a real person.
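The base-rate arithmetic behind that statistic is worth spelling out. A minimal sketch: only the “over 99% no action” rate comes from the trustee’s quote; the 1,000-alert volume below is a hypothetical round number for illustration, not a figure from the oversight commission:

```python
# Illustrative base-rate arithmetic for the trustee's statistic.
# Only the >99% no-action rate comes from the quote above; the
# alert volume is a hypothetical round number.
alerts = 1_000          # hypothetical alerts over some period
no_action_rate = 0.99   # "over 99% ... do not result in any police action"

actionable = alerts * (1 - no_action_rate)
dead_ends = alerts * no_action_rate

print(f"actionable alerts: {actionable:.0f}")                   # 10
print(f"no-action flags: {dead_ends:.0f}")                      # 990
print(f"dead ends per actionable alert: {dead_ends/actionable:.0f}")  # 99
```

At a 99% no-action rate, every alert that leads anywhere is purchased with roughly ninety-nine that lead nowhere… each one a record about a real person.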
The Pushback That’s Starting to Work
I don’t want to write a piece that ends in pure despair, because I think that’s as intellectually dishonest as uncritical cheerleading. Something genuinely hopeful is happening.
Mountain View, Austin, Eugene, Springfield, Flagstaff, Cambridge, Evanston, Staunton, and multiple Washington state cities have all cancelled their Flock contracts in the past year. The reasons are consistent everywhere: local authorities discover their “local” cameras are feeding a national network accessible to federal agencies, and they pull the plug.
Following the EFF’s reporting, a lawsuit was filed in California over millions of warrantless searches, and federal and state authorities launched investigations of their own.
One resident from Sedona, Arizona put it with an economy of language I find genuinely moving: “It was like we were building our own prisons, and we were paying for it.”
That’s the crux of it. The genius… if you can call it that… of the Flock model is that it outsources the cost of surveillance to the communities being surveilled. Your HOA fees. Your local police budget. Your small-town council voting to share camera data in exchange for network access. Everyone pays in. Nobody quite realises what they’ve built until the audit comes back.
What We’re Actually Talking About
The word “panopticon” gets thrown around so often in surveillance discussions that it’s started to lose its teeth. So let me try a different frame.
This isn’t Big Brother in a government tower. That version of authoritarianism is actually easier to resist, because it has a face and an address. What Flock has built is something stranger and in some ways more insidious: thousands of little brothers. Your neighbour’s HOA camera. The local police cruiser with a reader on the roof. The small-town department that signed up for the free trial and didn’t read the terms of service. All of them linked, in real time, by a single company’s algorithm that can reconstruct a person’s movements across state lines in seconds.
The woman in Texas didn’t need to be followed by a government agent. She just had to drive. The protestors didn’t need to be photographed at a rally. They just had to attend one. The Roma communities didn’t need to be individually targeted. They just had to exist in a vehicle that fit a profile.
The constructive criticism I keep coming back to… for Flock, for the cities that signed these contracts, for us as a society that let this infrastructure grow largely unnoticed… is this: the problem was never whether the technology could be misused. Every audit, every cancelled contract, every emergency suspension of cameras that were supposed to be protecting people tells us the same thing. The question was always whether we’d built enough accountability into the system before we found out.
We hadn’t. We still haven’t.
The good news is we’re starting to notice. The better news is that communities are doing something about it. The uncomfortable news is that, at twenty billion scans a month, Flock’s system has logged several million more licence plates in the time it’s taken you to read this.
Your neighbour’s camera is still watching.
Sources: Electronic Frontier Foundation investigations (2025); State of Surveillance; The Record from Recorded Future News; Flock Safety public statements.
Until Next Time
