What the Fireflies.AI Lawsuit Means for Your Privacy When Using Meeting Assistants
AI meeting assistants like Fireflies.ai, Otter.ai, and Rev have become common in remote work. They transcribe conversations, summarize action items, and claim to boost productivity. But a recent lawsuit against Fireflies.ai has drawn attention to a less visible cost: the collection of biometric data such as voice prints and facial expressions—often without explicit consent. The case, filed under Illinois’ Biometric Information Privacy Act (BIPA), offers a practical warning for anyone using these tools.
What Happened
In early 2026, Fireflies.ai was sued by a user who alleged that the company collected and stored voice prints and facial scans (from video recordings) without proper notice or consent. The complaint further claimed that Fireflies shared this biometric data with third parties, including for analytics and emotional state analysis. The lawsuit relies on BIPA, which requires companies to obtain written consent before collecting biometric identifiers and to disclose how long they will be kept. (See: AI Meeting Assistants and Biometric Privacy: Governance Lessons from the Fireflies.AI Lawsuit, The National Law Review; also AI Meeting Assistants and Biometric Privacy: Lessons from the Fireflies.AI Lawsuit, Law.com.)
Fireflies is not the only platform facing scrutiny. Many meeting assistants record audio and video, and some analyze tone, pace of speech, and facial expressions. Even if a tool only transcribes spoken words, the underlying voice recording can be used to create a voiceprint—a biometric identifier similar to a fingerprint. The lack of clear disclosure in many apps means users may be unaware their unique vocal characteristics are being captured.
Why It Matters
Biometric data is considered sensitive because it is permanent. Unlike a password, you cannot change your voiceprint if it is leaked. Laws in Illinois, Texas, Washington, and increasingly other states impose strict requirements on its collection and use. BIPA alone has led to billions of dollars in settlements from companies that mishandled fingerprints and face scans.
For an individual, the risk is not purely legal. A meeting assistant vendor that collects biometric data may store it on servers you do not control, potentially exposing it in a breach. For a small business or remote team, using such a tool without checking its privacy policy means you might be exposing your colleagues’ biometric data without their knowledge. If a lawsuit or regulatory action arises, the business could share legal liability.
Because the Fireflies case is ongoing, we do not yet know whether the court will rule that the platform violated BIPA. What is already clear is that the complaint has prompted users and vendors to look more carefully at how these tools handle voice and video data. Newer privacy policies now often mention biometric data explicitly, but many older policies remain vague.
What You Can Do
If you or your team rely on an AI meeting assistant, here are several practical steps to reduce risk:
- Check the tool’s privacy policy for biometric data. Look for terms like “voiceprint,” “facial geometry,” “biometric identifier,” or “emotion analysis.” If the policy does not mention these, contact the vendor and ask.
- Disable video recording by default. If you do not need the assistant to analyze facial expressions, turn off the camera feed; transcription works from the audio alone.
- Review app permissions on your device. On your phone or computer, see whether the meeting assistant has permission to access the camera or microphone when it is not actively being used. If so, revoke that permission.
- Consider using a pseudonym or an avatar. Some tools allow you to replace your video with an animated avatar that does not capture your actual face. Your voice may still be recorded, but removing the visual component eliminates one category of biometric data.
- For business accounts, obtain written consent from meeting participants. A brief notice such as "This meeting will be transcribed by [Tool Name]; no voiceprints or facial data will be stored or shared" can help satisfy consent requirements under laws like BIPA and build trust, provided you have verified that claim against the vendor's actual policy.
- Limit data retention in the tool settings. Many services allow you to auto-delete recordings after a set period. Choose the shortest duration that still meets your needs.
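As a starting point for the first step above, a simple script can flag biometric-related terms in a vendor's privacy policy. This is a minimal sketch with an illustrative keyword list, not an official or exhaustive checklist; a hit (or a miss) is a prompt to read the policy closely and ask the vendor, not a legal determination.

```python
# Scan privacy-policy text for terms that commonly signal biometric
# data collection. The keyword list below is an assumption for
# illustration only; real policies may use different wording.
BIOMETRIC_TERMS = [
    "voiceprint",
    "voice print",
    "facial geometry",
    "face scan",
    "biometric identifier",
    "biometric information",
    "emotion analysis",
]

def find_biometric_terms(policy_text: str) -> list[str]:
    """Return the biometric-related terms present in the policy text."""
    text = policy_text.lower()
    return [term for term in BIOMETRIC_TERMS if term in text]

# Hypothetical excerpt from a vendor's privacy policy.
sample_policy = (
    "We may collect audio recordings and derive a voiceprint for "
    "speaker identification. Facial geometry may be analyzed to "
    "improve our services."
)
print(find_biometric_terms(sample_policy))  # ['voiceprint', 'facial geometry']
```

If the scan returns an empty list, that does not mean the tool collects no biometric data; the policy may simply describe it in other words, which is why contacting the vendor directly remains the reliable check.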
Businesses that deploy these tools should also audit their current privacy policies and contracts with vendors. If you are a small business owner, consider whether the productivity gain justifies the potential liability. In some cases, a simpler note-taking tool that does not record or analyze audio may be sufficient.
The Fireflies.ai lawsuit is a reminder that convenient technology often carries hidden privacy implications. As state laws tighten, both consumers and businesses will benefit from actively managing biometric data rather than assuming it is safe.
Sources
- “AI Meeting Assistants and Biometric Privacy: Governance Lessons from the Fireflies.AI Lawsuit,” The National Law Review (May 4, 2026).
- “AI Meeting Assistants and Biometric Privacy: Lessons from the Fireflies.AI Lawsuit,” Law.com (February 11, 2026).
- Illinois Biometric Information Privacy Act (740 ILCS 14), as cited in the above articles.