AI Surveillance by Local Police Is Growing – What It Means for Your Privacy

In May 2026, the Sarasota County Sheriff’s Office announced an expansion of its AI-powered surveillance tools, including new facial recognition capabilities, automated license plate readers, and predictive analytics software. The move, reported by the Sarasota Herald-Tribune, is part of a broader trend of local law enforcement agencies across the country adopting artificial intelligence to monitor public spaces. While officials tout these tools as aids for crime prevention and response, privacy advocates warn that the expansion carries significant risks to civil liberties, often with little public oversight.

Here’s what the Sarasota case illustrates about how AI surveillance works, why it matters for your privacy, and what you can do if you’re concerned.

What happened in Sarasota

The sheriff’s office did not release full details of the expanded system, but according to the Herald-Tribune, the new capabilities include:

  • Integration of facial recognition with existing camera networks (including private cameras enrolled in the sheriff’s database).
  • An increase in the number of automated license plate readers (ALPRs) placed on major roads.
  • Upgraded software for drone surveillance that can automatically track objects and people.
  • Predictive policing tools that generate “hotspot” maps for officers to patrol.

A sheriff’s spokesperson stated the AI tools would be used “to solve crimes faster and keep deputies safe,” and that privacy protections were built in. However, no detailed privacy impact assessment was made public at the time of the announcement.

Why it matters for your privacy

Local police AI surveillance systems raise several concerns that directly affect everyday people, not just those suspected of crimes.

Data retention and sharing. Many agencies store data from ALPRs and cameras for months or years, even when no crime has occurred. That data can be shared with federal agencies or private companies, sometimes without a warrant. Sarasota’s policy on data retention is unclear from public reports, which is itself a red flag.

Bias and accuracy issues. Facial recognition systems have been shown to misidentify people with darker skin tones at higher rates. A 2019 study from the National Institute of Standards and Technology found that many algorithms had higher false positive rates for Black and Asian faces. Errors can lead to wrongful suspicion or even arrest.

Chilling effect on public life. Knowing you are being recorded and analyzed by AI can discourage people from attending protests, speaking in public, or simply going about their daily routines. The American Civil Liberties Union and other groups have argued that this undermines the freedom of assembly and expression.

Lack of oversight. Many local police departments adopt AI surveillance tools through grants or vendor contracts without seeking public input or city council approval. In Sarasota, the expansion was announced via a news release, not a public hearing. Residents had no chance to question how the technology would be used or what safeguards would be in place.

What you can do if you’re concerned

You don’t need to be a privacy expert to take practical steps. Here are things that can help, whether you live in Sarasota or anywhere else.

Check your local police department’s surveillance policies. Start with the website of your city or county sheriff. Look for “data retention policies,” “surveillance technology,” or “privacy impact assessments.” If you can’t find anything, submit a public records request asking for a list of surveillance tools in use, their data retention periods, and any agreements with third parties.

Attend town halls and city council meetings. When a new surveillance contract is up for approval, public comments can make a difference. Even if you can’t attend, send an email to your council members or sheriff. Ask what privacy protections are in place and whether an independent audit will be conducted.

Reduce your exposure to public surveillance. While you can’t avoid cameras in public entirely, you can limit the data trail you leave behind:

  • Cover your laptop webcam when not in use, and be cautious on public Wi-Fi networks (such as those in coffee shops), where your traffic may be monitored.
  • Use end-to-end encrypted messaging apps like Signal instead of SMS. Unlike text messages, end-to-end encrypted messages cannot be read in transit — not by your carrier, and in most cases not even by the app’s provider.
  • Disable facial recognition features on social media platforms that still offer them. Check your account’s privacy settings for automatic photo tagging and opt out where possible.
  • You can’t easily remove your license plate from ALPR databases, and avoiding camera-equipped roads is neither practical nor a guarantee. The more effective remedy here is policy: pushing for limits on how long plate data is retained and who it is shared with.

Support local privacy advocacy groups. Organizations like the Electronic Frontier Foundation and ACLU often have local chapters that track surveillance proposals. They can alert you when your town is considering a new AI tool and how to push back.

Sources

  • Sarasota Herald-Tribune: “Sarasota sheriff expands AI-powered surveillance amid privacy concerns” (May 3, 2026)
  • National Institute of Standards and Technology: “Face Recognition Vendor Test” (2019)
  • American Civil Liberties Union: “AI and Policing” resources

This article is based on public reporting as of May 2026. Some details about Sarasota’s specific policies were not publicly available at the time of writing. If you have more current information, check your local government’s official website.