5 things you should never share with an AI chatbot to protect your money
AI chatbots like ChatGPT, Gemini, and Claude have become everyday tools for answering questions, drafting emails, and even helping with financial planning. But as their use grows, so do the risks. Recent reports from the Washington Post, the BBC, and consumer protection groups highlight a simple truth: these services are not private vaults. Sharing the wrong information can lead to identity theft, drained accounts, or scams.
Here’s what you need to know—and what to avoid.
What happened
In April 2026, the Washington Post ran a column warning users not to share certain personal and financial details with AI chatbots. The article cited growing evidence that data fed into chatbots can be stored, used for training, or leaked. Around the same time, the BBC reported that a researcher was able to extract user conversation data from ChatGPT and Google’s AI in about 20 minutes. The National Council on Aging also reported that older adults are increasingly targeted by scams that use AI to mimic voice or text, making fraudulent requests harder to spot.
These are not isolated incidents. The risks are real and affect everyday users.
Why it matters
When you give a chatbot your bank account number, Social Security number, or detailed investment plans, you are effectively handing that data to a third party. Even if the company behind the chatbot follows strong security practices, vulnerabilities exist. Conversations may be stored on servers, used for model training, or accessed by employees. In the worst case, a hacker who breaches the system could obtain your financial information.
Scammers are also exploiting chatbot features. They may pose as a real person—or even a chatbot—and ask for money, gift cards, or logins. Because AI can now generate convincing text and voice, it’s easier than ever for a fraudster to impersonate a friend, relative, or customer support agent.
What readers can do
You can still use AI chatbots for general financial questions, such as “How do I calculate compound interest?” (a worked example follows this list), but never share the following five things.
1. Your full name, address, and Social Security number
These three pieces of information are the foundation of identity theft. With them, a scammer can apply for credit, file fraudulent tax returns, or open accounts in your name. Even if you trust the chatbot provider, there’s no guarantee the data won’t be exposed later. Keep your SSN and home address out of AI chat windows entirely.
2. Bank account numbers and credit card details
Never type your full account number, routing number, or credit card number into a chatbot. If a hacker gains access to your chat history, they could initiate unauthorized transactions. Use only secure, official banking channels—like your bank’s app or website—for any transaction-related activity.
3. Login credentials and passwords
This should be obvious, but people still do it. A chatbot does not need your password to, say, “help you remember it” or “check for security issues.” If a chatbot asks for your username and password, it is almost certainly a scam. Real services will never request them via a chat interface.
4. Specific investment holdings or strategies
Sharing your portfolio details, trade confirmations, or retirement account numbers can give scammers a blueprint. They may use that information to impersonate you, manipulate you into moving money, or sell the intel to others. Keep investment discussions general—stick to concepts, not personal positions.
5. Money or gift cards (any request for them is likely a scam)
If a chatbot—or someone claiming to be a friend or company—asks you to send money, buy gift cards, or transfer funds, treat it as a red flag. Legitimate organizations do not demand payment via chatbots. Even if the request appears urgent or emotional, stop and verify through another channel (a phone call, not a text).
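To make the distinction concrete, the compound-interest question from the introduction is safe precisely because it contains no personal data. Here is a minimal Python sketch of the standard formula A = P(1 + r/n)^(nt); the function name and dollar figures are invented for illustration and are not tied to any real account.

```python
# Standard compound-interest formula: A = P * (1 + r/n) ** (n * t)
# where P is the principal, r the annual rate, n the number of
# compounding periods per year, and t the time in years.
# All names and figures below are illustrative only.

def compound_interest(principal: float, annual_rate: float,
                      years: float, compounds_per_year: int = 12) -> float:
    """Return the final balance after compounding."""
    return principal * (1 + annual_rate / compounds_per_year) ** (compounds_per_year * years)

# Example: $1,000 at 5% APR, compounded monthly for 10 years
print(f"${compound_interest(1_000, 0.05, 10):,.2f}")  # about $1,647.01
```

The contrast is the point: a formula and made-up numbers are harmless to share with a chatbot, while your real balances, account numbers, and positions are not.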
General precautions
- Use privacy settings on AI platforms: opt out of data training if possible.
- Avoid browser extensions that give chatbots access to your open tabs or email.
- Never paste sensitive documents or PDFs containing personal numbers into a chat window.
- Use a separate, non-personal account (or a burner email) for experimentation with AI tools.
Sources
- Washington Post, “Column | Don’t tell your AI chatbot these 5 things to keep your money safe,” April 2026.
- BBC, “I hacked ChatGPT and Google’s AI – and it only took 20 minutes,” February 2026.
- National Council on Aging, “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them,” March 2026.
- NerdWallet, “Should You Use AI for Personal Finance? What to Consider and What to Avoid,” February 2026.