AI Surveillance by Police Is Expanding: What It Means for Your Privacy

You may not have noticed, but in Sarasota, Florida, the sheriff’s office recently expanded its use of AI-powered surveillance cameras. The move, reported by the Sarasota Herald-Tribune, adds more automated license plate readers and facial recognition capabilities to a network that already monitors public spaces. Sarasota is not an outlier. Over the past few years, police departments in cities like Detroit, New York, and Los Angeles have quietly adopted similar tools. The trend raises a basic question: how much of your daily life is being tracked by artificial intelligence, and what can you do about it?

What happened

The Sarasota sheriff’s office added new AI-powered cameras that can scan license plates and, in some cases, identify faces in real time. According to the Herald-Tribune, the agency says the technology helps solve crimes faster and deters offenders. But the expansion happened with limited public discussion. Privacy advocates point out that there was no formal vote by county commissioners or a public hearing before the system went live. The sheriff’s office did not release details on how long data is stored, who has access, or what rules govern its use.

This pattern is common. Many police departments acquire surveillance tools through grants or federal programs without local oversight. The Sarasota case is one more example of a technology being deployed before the community has a chance to weigh the trade-offs.

Why it matters

For the average person, the risks are not abstract. AI surveillance systems collect massive amounts of data on everyone who passes through a monitored area, not just suspected criminals. That includes you driving to work, walking your dog, or picking up groceries. The data can be kept indefinitely, shared with other agencies, and used for purposes beyond the original justification—a problem known as “function creep.”

Facial recognition in particular has a documented accuracy problem. Studies, including testing by the National Institute of Standards and Technology, have found that false positive rates are higher for people of color, women, and older adults. A mistaken match can lead to wrongful detention or worse. In Detroit, a man was arrested and held for hours after facial recognition software incorrectly flagged his driver’s license photo as matching a shoplifting suspect.

There is also the question of transparency. Many police surveillance systems are operated under nondisclosure agreements with vendors, meaning the public rarely sees the underlying algorithms or independent performance audits. You have no way of knowing whether the system works as promised.

What readers can do

You cannot opt out of being recorded in public, but you can take steps to reduce your exposure and push for accountability.

Stay informed about local surveillance. Search for your city’s police technology contracts; many are available through public records requests. Groups like the ACLU track surveillance ordinances and can point you to local advocacy efforts.

Attend city council or county commission meetings. When police departments ask for budget approval for new cameras, those meetings are often lightly attended. A few people speaking up can shift the conversation.

Support legislation that requires oversight. Some cities now require an ordinance before police can buy or use facial recognition. Others mandate annual audits of accuracy and usage. Public pressure can get these rules passed.

Limit your digital footprint where possible. While you cannot avoid cameras, you can reduce the amount of identifiable data you generate. Use cash when you can. Don’t post your exact location in real time. Consider a VPN for online activity, though it won’t affect physical surveillance.

Speak to your local representatives. A short email or phone call to your city council member or county commissioner asking for a surveillance transparency policy can make a difference. Even a handful of constituents can get attention.

Sources

  • “Sarasota sheriff expands AI-powered surveillance amid privacy concerns,” Sarasota Herald-Tribune, May 2026.
  • Reports of similar deployments and challenges in Detroit and New York, as documented by the ACLU and local news outlets.