What the Fireflies.AI Lawsuit Means for Your Privacy When Using AI Meeting Assistants
AI meeting assistants like Otter.ai, Fireflies, and Zoom’s AI Companion have become indispensable for many professionals. They transcribe conversations, summarize action items, and even generate follow-up emails. But a recent lawsuit against Fireflies.AI has raised a question that most users haven’t considered: what happens to your voiceprint after the meeting ends?
The suit, filed under Illinois’ Biometric Information Privacy Act (BIPA), alleges that Fireflies.AI recorded and stored the voiceprints of meeting participants without obtaining proper consent. Voiceprints are considered biometric data under BIPA, and companies must follow strict rules when collecting them. This case is not just a legal wrinkle—it’s a practical warning for anyone who joins a meeting that’s being transcribed by an AI tool.
What Happened
The core of the complaint is straightforward: Fireflies.AI, a popular service that joins meetings as a bot and transcribes audio, is accused of capturing the unique vocal characteristics of participants who never agreed to have their voice data collected. Under BIPA, companies must inform individuals in writing that biometric data is being collected, explain the purpose and duration of collection, and obtain a written release. Simply dropping a recording bot into a meeting without clear consent can violate the law.
BIPA allows individuals to sue for $1,000 per negligent violation and $5,000 per intentional or reckless violation. For a tool used across hundreds of meetings, the potential damages add up quickly. The lawsuit is ongoing, so the final outcome is uncertain. But it has already sparked discussion about how AI meeting assistants handle biometric data—and what users should be watching for.
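To see how quickly statutory damages compound, here is a back-of-the-envelope calculation. The $1,000 and $5,000 figures come from BIPA; the meeting and participant counts are hypothetical numbers chosen purely for illustration.

```python
# BIPA statutory damages per violation (from the statute)
NEGLIGENT = 1_000   # per negligent violation
RECKLESS = 5_000    # per intentional or reckless violation

# Hypothetical usage figures, invented for this example
meetings = 300      # meetings transcribed by the tool
participants = 5    # average attendees per meeting

# If each non-consenting participant in each meeting counts as
# one violation, exposure scales multiplicatively:
violations = meetings * participants

print(f"violations:          {violations:,}")
print(f"negligent exposure:  ${violations * NEGLIGENT:,}")
print(f"reckless exposure:   ${violations * RECKLESS:,}")
```

Under these assumptions, 1,500 violations translate to $1.5 million in negligent-violation exposure and $7.5 million if the conduct is found reckless, which is why even a mid-sized deployment can face substantial liability.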
Why It Matters to You
If you’ve ever been in a meeting where an AI assistant was present, there’s a good chance your voice was processed and stored. Many tools claim to collect only “speech-to-text” data rather than biometric identifiers, but the line can be blurry. A voiceprint is a mathematical model of your unique vocal patterns. Once stored, it can in theory be used to identify you across recordings—even if the company says they don’t intend to.
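The identification risk can be made concrete with a toy simulation. This is not a real speaker-recognition system; it just models the core idea that a voiceprint is a numeric vector of stable vocal traits, so two embeddings from the same person stay close across recordings while embeddings from different people do not. All names and numbers below are invented for illustration.

```python
import math
import random

random.seed(0)
DIM = 128  # dimensionality of the toy "voiceprint" vector

def voiceprint(traits, noise=0.3):
    """Simulate an embedding from one recording: the speaker's
    stable vocal traits plus recording-specific noise."""
    return [t + random.gauss(0, noise) for t in traits]

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction, ~0 unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Two distinct speakers, each with fixed underlying vocal traits
alice = [random.gauss(0, 1) for _ in range(DIM)]
bob = [random.gauss(0, 1) for _ in range(DIM)]

# Embeddings extracted from separate meetings
alice_mon = voiceprint(alice)
alice_fri = voiceprint(alice)
bob_fri = voiceprint(bob)

same = cosine(alice_mon, alice_fri)  # same person, different recordings
diff = cosine(alice_mon, bob_fri)    # different people
print(f"same speaker: {same:.2f}, different speakers: {diff:.2f}")
```

Even in this crude model, the same-speaker similarity is high while the cross-speaker similarity hovers near zero, which is exactly what makes a stored voiceprint usable to link you across recordings regardless of what the transcript says.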
Beyond the legal risk for companies, this case highlights a gap in consumer awareness. Most meeting assistants’ privacy policies are long documents that users rarely read. Even when you install one of these tools yourself, the default settings may share audio recordings with third parties for training, or retain data indefinitely. Participants who are added to a meeting by someone else often have no say at all.
The Fireflies lawsuit shows that the law in some states, especially Illinois, gives consumers a direct right to sue. Other states have weaker biometric laws or none at all. So your level of protection depends largely on where you live—and on where the company stores and processes data.
What You Can Do Right Now
You don’t need to abandon AI meeting tools to protect your voiceprint. A few practical steps can reduce your exposure:
Check your own tool’s privacy settings. If you are the account holder for a tool like Fireflies, Otter, or Zoom AI Companion, look for settings related to audio retention, biometric data collection, and third-party sharing. In Zoom, for example, account administrators can disable AI Companion features for their organization.
Review the privacy policy before using a new tool. Pay special attention to sections on biometric data, consent requirements, and data retention periods. If a policy says voice data may be used for “improving services” or “training models,” ask the vendor what exactly that means.
Ask about recording when you’re a guest. If someone invites you to a meeting and you see a recording bot, you can ask the host to disable audio recording or to confirm that the tool does not store voiceprints. In many cases, the host can configure the bot to retain only the text transcript with generic speaker labels (e.g., “Person 1”) rather than keeping the full audio.
Be aware of state laws. Illinois’s BIPA offers the strongest protection because it gives individuals a direct right to sue. Texas and Washington also regulate biometric data, but their laws are enforced by the state attorney general rather than through private lawsuits, and California’s CCPA treats biometric data as sensitive personal information. Wherever you live, you can ask companies whether they collect biometric data from you; if you live in Illinois, you may be entitled to sue under BIPA for violations.
Use a separate meeting ID for sensitive calls. For conversations involving personal or financial topics, consider using a meeting platform without AI features, or ask participants to agree in advance to any transcription.
The Broader Landscape
The Fireflies lawsuit is not an isolated event. Several class actions have been filed against tech companies under BIPA for voice and facial recognition practices. As AI tools become more embedded in daily work, the pressure on companies to obtain meaningful consent—and to be transparent about data use—will only increase.
In the meantime, users should assume that any AI meeting assistant they use may be collecting more than just text. Whether that data is treated as a biometric identifier depends on the company’s technical implementation and legal compliance. Until regulators provide clearer rules, the safest approach is to treat your voiceprint like a password: something you don’t want floating around in a data center without your permission.
Sources: The National Law Review, “AI Meeting Assistants and Biometric Privacy: Governance Lessons from the Fireflies.AI Lawsuit”; Law.com, “AI Meeting Assistants and Biometric Privacy: Lessons from the Fireflies.AI Lawsuit.”