What to Know About Police AI Surveillance—and How to Protect Your Privacy
If you live in Sarasota County, your local sheriff’s office has recently added more AI-powered surveillance tools. A story in the Sarasota Herald-Tribune from early May 2026 details how the sheriff’s office is expanding its use of cameras and analytics that can automatically detect and track people and vehicles. The move has drawn criticism from privacy advocates who say the public was not given a meaningful chance to weigh in before the technology was deployed.
This isn’t just a local Florida story. Police departments across the country are buying similar systems—facial recognition, automated license plate readers, predictive analytics software—often without clear public debate. If you’re concerned about how your data is collected and used, it helps to understand what these tools can and can’t do, and what you can actually do about it.
What happened in Sarasota
According to the Herald-Tribune report, the Sarasota sheriff’s office quietly expanded its use of AI surveillance software that integrates with existing camera networks. The system can flag suspicious activity in real time, match faces against watchlists, and log every plate that passes under a camera. The sheriff’s office said the goal is to deter crime and speed up investigations. Critics, including the ACLU of Florida, countered that the expansion happened without a public hearing or clear policies on how long data is kept, who has access, and whether the algorithms have been tested for bias.
This pattern is common. Departments often justify deploying AI surveillance by citing crime reduction, while details about the technology’s accuracy and oversight remain scarce. Facial recognition in particular has been shown in some studies to misidentify people of color at higher rates, raising civil rights concerns.
Why it matters for your privacy
Even if you’ve never been charged with a crime, AI surveillance can affect you. These systems collect data on everyone in public spaces—not just suspects. That includes your face, your car’s location history, and inferences about your behavior. In many cases, there is no warrant requirement for using these tools in public, and little to no federal regulation.
The main risks are:
- Data permanence. Images and plates can be stored for months or years. Combined with other databases, they can create detailed profiles of where you go, who you meet, and what you do.
- Lack of transparency. Most police contracts with surveillance vendors contain nondisclosure clauses, so the public never sees audits or performance reviews.
- Potential misuse. Data collected for public safety can be repurposed for immigration enforcement or for monitoring reporters and activists, or accessed by private companies that purchase data from law enforcement.
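To make the data-permanence risk concrete, here is a toy sketch, not any vendor’s actual system, showing how individually innocuous license plate reads turn into a location history once they are grouped by plate. All plates, times, and locations below are invented for illustration:

```python
from collections import defaultdict
from datetime import datetime

# Simulated plate-reader log entries: (plate, timestamp, camera location).
reads = [
    ("ABC123", "2026-05-01 08:02", "Main St & 1st Ave"),
    ("XYZ789", "2026-05-01 08:05", "Main St & 1st Ave"),
    ("ABC123", "2026-05-01 17:31", "Clinic Pkwy"),
    ("ABC123", "2026-05-02 09:14", "Union Hall Rd"),
]

def build_profiles(reads):
    """Group raw reads by plate to reconstruct each car's movements."""
    profiles = defaultdict(list)
    for plate, ts, location in reads:
        profiles[plate].append((datetime.strptime(ts, "%Y-%m-%d %H:%M"), location))
    # Sort each car's history chronologically.
    for history in profiles.values():
        history.sort()
    return dict(profiles)

profiles = build_profiles(reads)
for when, where in profiles["ABC123"]:
    print(when.isoformat(), where)
```

Each read is a single timestamped sighting; the privacy concern is the join. A few lines of grouping code are enough to show where one car was each morning and evening, which is why retention limits matter.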
That’s not to say all AI surveillance is inherently bad. In some cases, it has helped solve violent crimes. But the trade-off between safety and privacy is rarely debated in the open, and the burden often falls on individuals to protect themselves.
What you can do
While you can’t opt out of being in public, you can take steps to reduce your exposure and advocate for change.
- Limit what you share online. Avoid posting real-time locations or detailed daily routines. Photos you upload can be scraped and used to build facial recognition databases.
- Check your car’s privacy settings. Many modern cars automatically share location data with the manufacturer. You can usually disable this in the settings menu.
- Ask your local police department about surveillance. Send a public records request about your department’s use of facial recognition or license plate readers. Tools like MuckRock can help.
- Support state-level oversight bills. Several states are considering laws that require departments to get approval before buying surveillance tech. Let your state representatives know you support transparency.
- Use privacy tools for digital anonymity. A VPN and temporary browser profiles won’t protect you from street cameras, but they can limit the connection between your online activity and your physical identity.
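On the photo advice above: many phones embed EXIF metadata, which can include GPS coordinates, in JPEG files. This minimal sketch, using only the Python standard library, checks whether a JPEG still carries an EXIF segment before you upload it. It only detects the metadata (removal is a separate step), and it assumes a well-formed JPEG:

```python
def has_exif(jpeg_bytes):
    """Return True if JPEG data contains an APP1 Exif segment.

    Walks the JPEG marker segments from the start of the data and
    stops at the start-of-scan marker (0xDA), after which compressed
    image data begins. Assumes a well-formed JPEG.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":  # missing SOI marker: not a JPEG
        return False
    i = 2
    while i + 4 <= len(jpeg_bytes) and jpeg_bytes[i] == 0xFF:
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:  # start of scan: no more metadata segments
            break
        # Segment length field covers itself plus the payload.
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        # APP1 (0xE1) segments holding EXIF start with "Exif\x00\x00".
        if marker == 0xE1 and jpeg_bytes[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        i += 2 + length
    return False
```

Usage would look like `has_exif(open("photo.jpg", "rb").read())`. Most photo apps and editors also offer a built-in “remove location data” option, which is the simpler route if you just want the metadata gone.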
Sources
- Sarasota Herald-Tribune: “Sarasota sheriff expands AI-powered surveillance amid privacy concerns” (May 2026).
- ACLU of Florida: statements on the lack of public input in the Sarasota deployment.
This article is for informational purposes and does not constitute legal advice. Surveillance practices vary by jurisdiction; check local laws for specifics.