Your AI Assistant Is Watching: How to Protect Your Privacy Now
Open up ChatGPT, ask Google Assistant about the weather, or let an email app draft a reply for you. These tools feel harmless, even helpful. But behind the convenience, a growing amount of your personal information is being collected, stored, and used in ways you may not have agreed to.
AI’s erosion of privacy is not a distant concern. It’s happening now, every time you talk to a smart speaker or accept a chatbot’s terms of service without reading them. This article explains what is being collected, why it matters, and what practical steps you can take to limit exposure.
What Happened
Over the past year, a number of reports have highlighted how AI tools quietly gather data. Heather Parry’s recent Substack article on “AI’s erosion of privacy” is one of several that have drawn attention to the trend. The concern is not new, but it is accelerating as AI becomes embedded in more everyday applications.
Most major AI services—ChatGPT, Google Assistant, Amazon Alexa, Microsoft Copilot—collect conversation logs by default. These logs contain not just your queries but often metadata like time, location, device type, and sometimes audio recordings. The data is used to train and improve the models, but it can also be shared with third parties or retained for years.
In some cases, companies have admitted that human reviewers may read your chats to check for quality or safety violations. Even if your name is not directly attached, the content of your conversations can reveal deeply personal information.
Why It Matters
The erosion of privacy here is gradual. Each individual interaction may seem trivial, but the aggregate creates a detailed profile of your habits, opinions, relationships, and movements.
Consider a few concrete risks:
- Training data re-identification. Even after data is anonymized, researchers have repeatedly shown that personal details can be re-identified or reconstructed from training datasets, and large language models have been caught regurgitating verbatim snippets of the text they were trained on. Nothing sent to an AI model can be guaranteed to stay anonymous forever.
- Unwanted retention. Many AI platforms let you delete your chat history, but deletion is not always thorough. Copies may remain in backups for months or years.
- Third-party access. If you use an AI tool through a third-party app (e.g., a note-taking app with an integrated AI assistant), that app’s privacy policy may allow additional sharing.
- Legal requests. Law enforcement or civil litigants can subpoena chat logs just like any other digital record.
The danger is that many consumers assume AI tools are as private as a search bar. They are not. The data you share can outlive your original intent.
What Readers Can Do
You do not have to stop using AI tools altogether. But you can take several steps to regain control:
1. Turn off chat history saving.
- ChatGPT: Go to Settings → Data Controls and turn off the setting that lets your conversations be used for model training (it has been labeled “Chat history & training” and, more recently, “Improve the model for everyone”; the exact wording changes between versions).
- Google Assistant: Go to Assistant settings → Your data → Delete activity or turn off Voice & Audio activity.
- Amazon Alexa: In the Alexa app, go to Settings → Alexa Privacy → Manage Your Data and disable voice recordings retention.
2. Use incognito or ephemeral modes when possible.
- Some tools offer a temporary mode that does not save conversations to your account (ChatGPT, for example, calls this a Temporary Chat). Use it for sensitive questions.
- If no such mode exists, consider copying your output and manually deleting the session.
3. Check third-party app permissions.
- If you use AI features within other apps (e.g., Grammarly, Notion AI), review what data those apps can access. Look for settings like “Allow use of content to train models” and disable them.
4. Limit what you type.
- Avoid sharing personally identifiable information (full name, home address, phone numbers, financial details) in AI prompts unless absolutely necessary. If you must, assume it could be stored. A small prompt-scrubbing sketch appears at the end of this section.
5. Delete old data periodically.
- Build a habit of going into your account’s privacy dashboard every few months and deleting stored chat logs, voice recordings, and search histories.
6. Consider privacy-focused alternatives.
- Tools like DuckDuckGo’s AI Chat (which does not store conversations) or local-only models (e.g., running a small LLM on your device) can reduce cloud exposure, as shown in the sketch just below.
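To make step 6 concrete, here is a minimal sketch of a local-only setup. It assumes you have installed Ollama, a free tool for running open-weight models on your own machine, and pulled a small model with `ollama pull llama3.2`; the model name and timeout are illustrative, not requirements.

```python
# A minimal local-only chat sketch, assuming Ollama (https://ollama.com) is
# installed and a small model has been pulled, e.g. `ollama pull llama3.2`.
# Ollama serves a local HTTP API on port 11434 by default, so the prompt
# below never leaves your machine.
import requests

def ask_local(prompt: str, model: str = "llama3.2") -> str:
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,  # local models can be slow on laptop hardware
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local("In two sentences, why keep sensitive questions off cloud chatbots?"))
```

Because the request goes to localhost, neither the prompt nor the reply touches a cloud server, though a small local model will be noticeably less capable than the big hosted services.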
No single step is perfect, but together they significantly reduce the amount of data that leaves your control.
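Finally, for readers who send prompts to AI services from their own scripts, step 4 can be partly automated. The sketch below is purely illustrative: a few regular expressions cannot catch every kind of personal detail, and the pattern names and placeholders are invented for this example rather than drawn from any particular library or service.

```python
# Illustrative only: strip obvious identifiers from a prompt before it is sent
# to any cloud AI service. Real PII detection is much harder than a handful of
# regexes; treat this as a habit-former, not a guarantee.
import re

PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "phone": re.compile(r"\b(?:\+?\d{1,2}[\s.-]?)?(?:\(\d{3}\)|\d{3})[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace each match with a labeled placeholder so the question still reads."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label} removed]", prompt)
    return prompt

print(scrub("Email jane.doe@example.com or call 555-867-5309 about my refund."))
# -> Email [email removed] or call [phone removed] about my refund.
```

Dedicated redaction tools exist and do a more thorough job, but even this much keeps the most obviously identifying strings out of a provider's logs.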
Sources
- Heather Parry, “AI’s erosion of privacy,” Substack, April 2026.
- OpenAI Privacy Policy and Data Controls (accessed 2026).
- Google Assistant privacy settings documentation.
- Amazon Alexa Privacy Hub.