5 things you should never tell your AI chatbot (to protect your money and identity)
AI chatbots like ChatGPT, Google Gemini, and Microsoft Copilot have become fixtures of daily life. They help us write emails, plan vacations, and even draft grocery lists. But as these tools handle more of our personal tasks, a quiet risk grows: we’re telling them things we should keep private.
A recent column in The Washington Post highlighted five categories of information you should never share with an AI chatbot if you want to keep your money and identity safe. The advice is straightforward, but many users still overlook it.
What happened
The column, published in late April, came amid a surge in AI adoption and a parallel rise in scams and data leaks. Researchers have shown that chatbots can be tricked into revealing user conversations (BBC reported one case where a chatbot was hacked in under 20 minutes). Meanwhile, the National Council on Aging reports that financial scams targeting older adults are increasing, and AI chatbots are a new vector for those attacks.
The core problem is that many people treat chatbots like trusted friends. They paste in bank account numbers to get budgeting advice, share their full address to check local weather, or type out passwords to reset an account. The chatbot—and the company behind it—stores that conversation. If it’s ever breached, or if the data is used to train future models, that sensitive information can leak.
Why it matters
Chatbots are not confidential advisors. Most platforms log conversations, and some use them to improve their models. Even if your chat is encrypted in transit, the provider on the other end can still read it. And once data enters a large language model’s training pipeline, it can be difficult or impossible to fully erase.
The five items to keep out of any chatbot conversation are:
Full financial account numbers – bank accounts, credit cards, investment accounts. Use generic descriptions instead (“my checking account”) if you need to ask a hypothetical question.
Social Security number or any government ID number – these are the keys to identity theft. A chatbot has no legitimate reason to store them.
Login credentials – usernames and passwords for any account. No AI assistant can guarantee they won’t be exposed in a breach.
Security questions and answers – things like “What was your first pet’s name?” or “Your mother’s maiden name.” These are used to recover accounts, and once shared, they’re no longer secret.
Sensitive personal details – your full address, date of birth, or other identifying information. Scammers can combine small pieces of data from multiple sources to impersonate you.
These aren’t hypothetical risks. The NCOA report and NerdWallet’s guidance both urge caution, especially for older adults who may be less familiar with how chatbots work.
What readers can do
You can still use AI chatbots safely. The key is to treat them like public tools, not private advisors.
- Never paste raw sensitive data into a chat. If you need help analyzing a bank statement, remove account numbers and names first.
- Use a dedicated password manager for credentials, and never ask a chatbot to generate or store a password for you.
- Check your chatbot’s privacy settings. Many platforms allow you to delete conversation history or opt out of training data collection.
- If you already shared something, delete the conversation (if possible) and, crucially, change any exposed passwords or account numbers. Monitor your accounts for unusual activity.
- Be cautious with third-party chatbot apps that ask for access to your email, contacts, or files. They may have weaker security.
- Never assume a chatbot is confidential. Even if a company promises encryption, the data is still on their servers.
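The first tip above — stripping account numbers and other identifiers before pasting text into a chat — can even be automated. Here is a minimal sketch of a local redaction helper; the regex patterns are illustrative assumptions, not an exhaustive filter, so always review the output yourself before sharing it:

```python
import re

def redact(text: str) -> str:
    """Mask common sensitive patterns before pasting text into a chatbot.

    The patterns below are examples only; they will not catch every
    format of sensitive data.
    """
    # Mask SSN-style numbers (e.g. 123-45-6789)
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN]", text)
    # Mask 13-16 digit card/account numbers, with or without separators
    text = re.sub(r"\b(?:\d[ -]?){13,16}\b", "[ACCOUNT]", text)
    # Mask email addresses
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[EMAIL]", text)
    return text

print(redact("Card 4111 1111 1111 1111, SSN 123-45-6789, jane@example.com"))
# Prints: Card [ACCOUNT], SSN [SSN], [EMAIL]
```

Because this runs entirely on your own machine, nothing sensitive ever leaves it — which is exactly the point.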
The safest approach is simple: do not tell a chatbot anything you would not shout in a crowded coffee shop.
Sources
- The Washington Post column, “Don’t tell your AI chatbot these 5 things to keep your money safe,” April 2026.
- BBC, “I hacked ChatGPT and Google’s AI – and it only took 20 minutes,” February 2026.
- National Council on Aging, “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them,” March 2026.
- NerdWallet, “Should You Use AI for Personal Finance? What to Consider and What to Avoid,” February 2026.
These sources are credible and current. The advice they offer aligns with basic digital hygiene that predates chatbots but becomes even more critical as AI integrates into everyday life.