AI Surveillance Is Coming to a Town Near You: What to Do About It

In May 2026, the Sarasota County Sheriff’s Office confirmed it had deployed a new AI-powered surveillance system capable of real-time video analytics and facial recognition. The program, provided by a third‑party vendor, marks one of the most aggressive local expansions of automated policing technology in Florida. Privacy advocates immediately raised concerns about the lack of public debate, the risk of data misuse, and the potential for mission creep.

What happened in Sarasota is not an isolated incident. Similar programs are already active in at least a dozen other states. As these tools become more common, it’s worth understanding how they work, what the real privacy risks are, and what steps you can take now—regardless of where you live.

What Happened

According to the Sarasota Herald‑Tribune, the sheriff’s office is using a system that integrates hundreds of existing cameras with AI software capable of detecting suspicious behavior and matching faces against watchlists. The system can also read license plates automatically. Officials say the goal is faster response times and more efficient use of limited law enforcement resources.

What is less clear is how long data is retained, who outside the sheriff’s office can access it, and whether any independent oversight exists. Local privacy groups have called for a moratorium and for public hearings before any further expansion. The sheriff’s office has said it follows state law on data retention, but Florida’s rules on surveillance data are often vague or permissive.

Why It Matters

AI surveillance raises several concrete privacy concerns, not hypothetical ones.

Data permanence. Once your image or license plate is captured and analyzed, that record may be retained indefinitely. If a vendor goes bankrupt or changes ownership, the data could pass to new hands with different policies. And without clear expungement rules, old data might be used for investigations unrelated to the purpose for which it was originally collected.

Mission creep. Systems justified for “violent crime response” often end up used for minor traffic offenses, protest monitoring, or tracking people who have not been accused of anything. That expansion of purpose tends to happen gradually and quietly.

Lack of transparency. Many contracts with surveillance vendors contain non‑disclosure clauses or are treated as confidential. That means the public often cannot even see the terms—let alone evaluate the accuracy of the algorithms or challenge a mistaken identification.

Chilling effects. When people know they are under constant watch, they may change their behavior in ways that harm free expression and assembly. This is especially concerning for vulnerable communities who already face disproportionate scrutiny.

What Readers Can Do

You do not have to wait for your local sheriff to announce a similar program. Here are practical steps you can take now.

1. Learn your state’s surveillance laws. Some states require public notice or a warrant for real‑time tracking; others do not. The Electronic Frontier Foundation (EFF) and the ACLU maintain state‑by‑state guides to facial recognition and license plate reader laws. A little research can tell you what your local police are allowed to do.

2. Cover your license plate when parked on private property. Many plate readers capture images of parked cars in parking lots or driveways. If your vehicle is legally parked on private property (e.g., your driveway or a private garage), covering the plate is generally legal. Note that driving with a covered plate is not.

3. Opt out of commercial face databases where possible. Some states give you the right to request deletion of a faceprint collected by a private company. This is a developing area of law, but it is worth checking whether the vendor used by your local agency offers any opt‑out mechanism.

4. Attend local government meetings and ask questions. Most surveillance expansions go through city councils, county commissions, or sheriff’s budget approval processes. Public comments can pressure officials to demand transparency reports, require warrants, and negotiate data retention limits. You can also ask for a “surveillance impact assessment” before new technology is deployed.

5. Support or propose a surveillance oversight ordinance. A growing number of cities now require that any new surveillance technology be approved by the elected body, that regular audits be published, and that data not be shared with immigration enforcement or out‑of‑state agencies without a court order. Model ordinances are available from organizations like the ACLU and Fight for the Future.

Sources

  • Sarasota Herald‑Tribune: “Sarasota sheriff expands AI‑powered surveillance amid privacy concerns” (May 3, 2026)
  • Electronic Frontier Foundation: “Street Level Surveillance: Automatic License Plate Readers”
  • ACLU: “Facial Recognition and Privacy” state legislation tracker

The Sarasota case is a useful lens for a national conversation. AI surveillance is not science fiction; it is already being paid for with local tax dollars. The question is not whether it will arrive in your town, but whether residents will get a say in how it is used—and whether you will know how to protect your privacy once it is here.