AI-Powered Police Surveillance Is Spreading—Here’s What It Means for Your Privacy
In early May 2026, the Sarasota County Sheriff’s Office expanded its use of artificial intelligence–driven surveillance tools. The move, covered by the Sarasota Herald-Tribune, added more cameras and analytical capability to a system that already included facial recognition and automated license plate readers. Sarasota is not an outlier: law enforcement agencies across the United States are quietly building similar networks, often with limited public debate. The trend raises real privacy risks for anyone living, working, or simply passing through areas where these systems are deployed.
What Happened
According to the Herald-Tribune report, the sheriff’s office upgraded its surveillance infrastructure to incorporate additional AI software that can analyze video feeds in real time. The system is designed to flag suspicious behavior, recognize faces, and track vehicles across a network of cameras. The office described the expansion as a public safety measure intended to prevent crime and speed up investigations. Privacy advocates, including the ACLU and local digital rights groups, have raised concerns about the lack of clear oversight and the potential for misuse. They point out that the technology can collect data on people who have done nothing wrong—not just on suspects.
Why It Matters
AI-powered surveillance differs from older closed-circuit TV systems in two critical ways: scale and automation. Where a human operator might watch a handful of screens, an AI can monitor hundreds of feeds simultaneously, compare faces against databases, and build profiles of people’s movements over time. This creates several privacy problems.
First, consent is essentially absent. You don’t opt into being recorded when you walk down a public sidewalk, but you also have no easy way to avoid it if cameras are everywhere. Second, data retention policies vary widely. Some departments keep footage for weeks; others hold onto it for years. Third, facial recognition is known to have higher error rates for people with darker skin tones and women, raising the risk of false identifications. A mistaken match could lead to an unwanted encounter with police.
Beyond individual errors, there’s a broader “chilling effect.” When people know they are being watched, they may avoid lawful activities—attending a protest, visiting a political meeting, or even just loitering in a public square. That changes the character of public life in ways that are hard to measure but deeply consequential.
What Readers Can Do
You can take steps to protect your privacy, even if you live in a city that has adopted AI surveillance.
- Learn what your local police are using. Start by searching for “police surveillance technology [your city/county]” or checking the website of your local sheriff’s office. Some jurisdictions post annual surveillance reports. If nothing is available, file a public records request. It’s your right.
- Limit exposure to facial recognition. States such as Illinois, Texas, and Washington have laws that restrict how private companies collect and use biometric data; Illinois's Biometric Information Privacy Act is the strictest, allowing individuals to sue over violations. You can ask some commercial face-search companies to delete your data where those laws apply, but police databases are much harder to avoid. One small step: wear a mask in public when it's appropriate (e.g., during flu season) or use a face-obscuring item like a scarf or sunglasses. This isn't foolproof, but it makes it harder for automated systems to get a clean match.
- Use privacy tools for your car. Many police networks use automated license plate readers that record where your car goes. You can reduce exposure by parking in garages or varying your routes. Some states also run address confidentiality programs (often aimed at domestic violence survivors) that keep your home address out of vehicle registration records. A car cover can obscure your plate while parked on private property, but be aware that covering a plate on a public street is illegal in some jurisdictions.
- Advocate for local policy. Many cities have adopted “surveillance ordinances” that require police to get city council approval before buying or expanding surveillance tools. If your city doesn’t have one, contact your local council member and ask what rules govern police use of AI. Join or follow groups like the Electronic Frontier Foundation or the ACLU for model legislation.
- Encrypt your communications. While this doesn't stop street cameras, it protects your digital trail. Use end-to-end encrypted messaging apps such as Signal, or WhatsApp (keeping in mind that WhatsApp shares metadata with Meta), and a VPN when on public Wi-Fi. These won't stop police from watching you physically, but they limit the data companies can hand over.
Sources
- “Sarasota sheriff expands AI-powered surveillance amid privacy concerns,” Sarasota Herald-Tribune, May 3, 2026 — the report on the new camera and software deployment.
- American Civil Liberties Union (ACLU) and Electronic Frontier Foundation (EFF) — general background on facial recognition and police surveillance concerns.