Your AI Assistant Is Collecting More Data Than You Think—Here’s How to Protect It
If you’ve used a chatbot like ChatGPT, Google Gemini, or an image generator like Midjourney, you’ve probably noticed how quickly it can answer questions or create content. What’s less visible is how much of your personal data these tools collect in the process. A recent article in Computing UK highlights a growing problem: AI adoption is racing ahead, while the data governance needed to keep it safe is lagging behind. For everyday users, that gap creates real privacy risks.
This article explains what’s happening, why it matters, and—most importantly—what you can do today to protect your data.
What’s happening
AI companies rely on large volumes of user data to train and improve their models. When you type a prompt, upload a document, or share a photo, that information is often stored, analysed, and used to refine future responses. The Computing UK report notes that this data collection happens with minimal transparency, and regulators are struggling to keep pace.
Major providers often enable data sharing by default. OpenAI's default settings, for example, may allow ChatGPT conversations to be reviewed by human trainers, and Google's Gemini (formerly Bard) has similar data-use policies. Many users don't realise that their interactions are not fully private: they become part of the training dataset unless users explicitly opt out.
Why it matters to you
The risks go beyond abstract privacy concerns. Data breaches involving AI platforms have already occurred, and the potential for profiling is real. Companies can build detailed profiles based on your conversations, including health questions, work projects, and personal preferences. If that data is compromised, it can lead to identity theft or unwanted targeting.
Moreover, because governance rules like the EU's General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) were written before generative AI took off, they don't always cover the nuances of how AI systems collect, store, and use data. Consumers are left with rights that exist on paper but are hard to exercise in practice.
What you can do right now
You don’t need to stop using these tools, but you should take control of your settings. Here are practical steps that work with most major AI platforms:
Check your data-sharing settings. In ChatGPT, go to Settings > Data Controls and turn off "Improve the model for everyone." For Gemini, open "Activity & data" in your Google account and turn off "Gemini Apps Activity," or set it to auto-delete after a chosen period. Midjourney users should review their account settings to control whether their generated images are publicly visible.
Use temporary or incognito modes where available. Some tools now offer temporary chats that aren't saved to your history. Enable this for sensitive queries.
Avoid sharing personally identifiable information. Treat AI assistants like public chat rooms. Don’t paste your full name, address, phone numbers, or financial details into prompts.
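If you often paste longer text (emails, documents) into prompts, a simple scrubbing pass can catch obvious identifiers before they leave your machine. The sketch below is purely illustrative and uses only Python's standard library; the regex patterns are rough assumptions that catch common formats for emails, phone numbers, and card-like digit runs, and real PII detection needs far more than a few regexes.

```python
import re

# Illustrative sketch only: mask a few common PII patterns before text
# is pasted into an AI prompt. The patterns below are rough heuristics,
# not a complete or reliable PII detector.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "PHONE": re.compile(r"(?<!\w)\+?\d[\d\s().-]{7,}\d\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scrub(text: str) -> str:
    """Replace each matched pattern with a [REDACTED-<label>] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text

print(scrub("Contact me at jane.doe@example.com or +44 20 7946 0958."))
# prints: Contact me at [REDACTED-EMAIL] or [REDACTED-PHONE].
```

The point isn't the specific patterns, but the habit: treat anything you paste into a prompt as leaving your control, and strip identifiers first.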
Understand your rights. Under GDPR, you have the right to access and delete your data. Under CCPA, you can opt out of the sale of your information. Not all AI companies make this easy, but you can submit a data request by email. If you are in the EU or UK, you can also complain to your data protection authority if your request is ignored.
Delete old conversations regularly. Most platforms let you clear your history. Make it a habit every few weeks.
If your data is exposed in a breach, act quickly: change passwords, enable two-factor authentication, and monitor accounts for suspicious activity. Contact the company’s privacy office and consider filing a complaint with your local regulator.
Looking ahead
Regulators are beginning to catch up. The EU’s AI Act and proposed updates to the UK’s data protection framework aim to close some of the gaps. But for now, the burden is largely on consumers. By staying informed and adjusting a few settings, you can reduce your exposure without giving up the benefits of AI.
Sources
- “AI use has outpaced the data discipline that should govern it,” Computing UK (May 2026)
- OpenAI Privacy Policy, as of 2026
- Google Gemini Privacy Help Center, as of 2026
- General Data Protection Regulation (GDPR) – Right to access and erasure
- California Consumer Privacy Act (CCPA) – Opt-out rights