5 things you should never tell an AI chatbot — they could drain your bank account
Chatbots like ChatGPT, Google Gemini, and Microsoft Copilot are useful for drafting emails, summarizing documents, or answering general questions. But as their use grows, so do the risks of oversharing personal information. A recent column in The Washington Post warned consumers about five specific types of data that should never be typed into an AI chatbot. The advice matters because these conversations are stored, analyzed, and, in some cases, accessible to third parties.
Here is what you need to know about the risks and how to protect your money.
What happened: The growing concern over chatbot data safety
AI chatbots can use the text you type to train future models, and many platforms retain chat logs for model improvement or security reviews. In February 2026, a BBC reporter demonstrated how ChatGPT and Google’s AI could be hacked in about 20 minutes, exposing how easily prompts and responses can be extracted. Meanwhile, the National Council on Aging highlighted that older adults are especially vulnerable to financial scams built on social engineering from personal details. NerdWallet likewise cautioned consumers against relying on chatbots for personal finance decisions.
These incidents show that even if a chatbot’s responses seem harmless, the information you provide can be misused.
Why it matters: Your money and identity at risk
The most obvious risk is financial loss. Scammers can use your bank account numbers, Social Security number, or login credentials to drain accounts or open new credit lines in your name. But there are subtler risks too. Chatbot systems can be compromised, and even encrypted chats may not be fully private. If a service suffers a breach, your conversation history becomes part of the leaked data. And because many people treat chatbots like a search engine, they often forget that the conversations are not confidential.
What readers can do: 5 things to never tell a chatbot
Follow these five rules to keep your finances safe while still using chatbots productively.
1. Full bank account or credit card numbers
Even if a chatbot asks for your card number to help with a transaction, never type the digits. No legitimate financial service will ask for sensitive data via a chatbot. If you need help with a purchase, share the item name and price, not your payment information.
2. Your Social Security number or tax ID
This is the master key to your financial life. Scammers who obtain your SSN can file false tax returns, open accounts, or apply for loans. The Washington Post column emphasizes that no AI chatbot needs your SSN to answer a question.
3. Login credentials for financial accounts
Never paste your username, password, or security question answers into a chatbot. If you’re trying to reset a password, use the official website or app—not an AI assistant. Some chatbots have been shown to inadvertently repeat credential-like strings when trained on leaked data.
4. Detailed investment portfolios or trade secrets
If you discuss your holdings, trades, or upcoming mergers, you are handing over information that could be scraped for market manipulation or expose you to accusations of insider trading. The NerdWallet article notes that chatbots tend to give generic financial advice anyway, so sharing specifics adds risk without adding value.
5. Real-time location or travel plans
Telling a chatbot that you’ll be away from home next week can alert thieves to an empty house. Combined with other details, scammers can also use location data to impersonate you or target you with physical scams.
Beyond these five, a general practice is to treat every chatbot conversation as if it might be read by a stranger. Use private or incognito sessions when possible, delete chat histories regularly, and adjust the service’s privacy settings to limit data retention.
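For readers comfortable with a little scripting, the habit of scrubbing sensitive details before pasting text into a chatbot can be sketched as a small filter. This is an illustrative sketch under simple assumptions, not a complete solution: the patterns, function name, and placeholders below are hypothetical, and real-world formats for card numbers and IDs vary widely.

```python
import re

# Illustrative patterns only -- they will miss many real-world formats.
# Card-like runs: 13-16 digits, optionally separated by spaces or dashes.
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")
# US Social Security number in the common 123-45-6789 layout.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact(text: str) -> str:
    """Replace card-like numbers and SSN-like strings with placeholders."""
    text = CARD_RE.sub("[CARD REDACTED]", text)
    text = SSN_RE.sub("[SSN REDACTED]", text)
    return text

# Run the filter on a draft before pasting it into a chat window.
print(redact("My card is 4111 1111 1111 1111 and my SSN is 123-45-6789."))
```

A filter like this is a safety net, not a substitute for the five rules above; the safest move is still to leave sensitive numbers out of the draft entirely.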
Sources
- The Washington Post: “Don’t tell your AI chatbot these 5 things to keep your money safe” (April 2026)
- BBC: “I hacked ChatGPT and Google’s AI - and it only took 20 minutes” (February 2026)
- NerdWallet: “Should You Use AI for Personal Finance?” (February 2026)
- National Council on Aging: “Top 5 Financial Scams Targeting Older Adults” (March 2026)