Never Tell Your AI Chatbot These 5 Financial Details
A growing number of people use AI chatbots like ChatGPT, Google Gemini, or Microsoft Copilot for everyday tasks—drafting emails, planning meals, or answering quick questions. But that convenience comes with a real risk: oversharing sensitive financial information. Recent reporting from The Washington Post and independent security research highlight why you should keep certain details out of your chat history entirely.
What Happened
The Washington Post ran a column on April 25, 2026, outlining five categories of personal financial data that should never be shared with AI chatbots. The piece follows a BBC investigation published in February 2026, in which a security researcher extracted sensitive information from ChatGPT and Google’s AI in about 20 minutes using relatively simple techniques. The BBC test showed that even anonymized conversations might be reconstructable by a determined attacker who gains access to the chat log. Meanwhile, NerdWallet’s February 2026 guide on using AI for personal finance cautioned that many people treat chatbots as private advisors without realizing that the companies behind them can review conversations for training purposes, and that those conversations may not be encrypted end-to-end.
Why It Matters
AI chatbot conversations are stored on company servers, and privacy policies vary. Some platforms allow employees or contractors to review chat logs to improve models. Others may share data with third parties for analytics. If a scammer gains access to your account—or if a data breach occurs—your chat history could become a goldmine. The National Council on Aging (NCOA) lists financial scams targeting older adults as a top threat, and phishing attacks that reference details from your private conversations are becoming more common. In one documented incident, an attacker gained access to a user’s chat history after the user reused a password on a compromised site, then drew on context from past exchanges to craft a convincing fraud attempt.
The Five Things Never to Share
Based on the Washington Post column, security research, and privacy guidelines, here are the types of information you should keep out of chatbot conversations entirely.
1. Full account numbers – Do not paste checking, savings, credit card, or investment account numbers. Even if you think you’re just asking for help categorizing a transaction, the number remains in the log. A stolen account number, combined with other details, can enable unauthorized transfers or fraudulent purchases.
2. Social Security or tax ID numbers – These are the keys to your identity. In the wrong hands, they can be used to open accounts, file fraudulent tax returns, or commit medical identity theft. No legitimate AI tool needs your Social Security number to answer a general question about tax deductions.
3. Login credentials and security answers – Never type your username, password, PIN, or answers to security questions like “What was your first pet’s name?” into a chatbot. If the conversation is ever exposed, an attacker can try those credentials on your bank, email, or other services. Many people reuse passwords, making this even riskier.
4. Real-time transaction details – Avoid sharing specifics about a pending wire transfer, a large purchase, or a confirmation code you just received via text. A scammer who sees a recent transaction could craft a follow-up phishing message that looks legitimate—for example, pretending to be the bank with a “verify this payment” request.
5. Sensitive documents – Avoid uploading scans of tax returns, bank statements, or signed contracts. Even if the chatbot promises not to store the file, metadata and extracted text may persist. The BBC researcher was able to extract text from uploaded PDFs by asking the chatbot to repeat part of the content it had “forgotten.”
What Readers Can Do
If you have already shared any of the above information in a chatbot, take these steps:
- Delete the conversation. Most platforms let you clear chat history. In ChatGPT, go to Settings > Data Controls > Delete all conversations. In Google Gemini, you can delete activity from your account settings. Doing this removes the text from your visible history, though it may not remove it from all backup systems.
- Monitor your financial accounts. Review recent transactions for anything unusual. Set up alerts for large withdrawals or changes to contact information.
- Consider a credit freeze. If you shared a Social Security number or tax ID, freezing your credit at the three major bureaus (Equifax, Experian, TransUnion) prevents new accounts from being opened in your name. It’s free and can be lifted temporarily if needed.
- Change passwords and security questions for any account you may have described in a chat. Use a password manager to generate strong, unique passwords.
For future use, keep your queries generic. Instead of asking “What should I do with the $5,000 in my Chase checking account ending in 1234?” ask “What are some general strategies for investing a lump sum of a few thousand dollars?” If you need personalized financial advice, use a dedicated financial planning tool or talk to a human advisor—not a general-purpose chatbot.
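For readers comfortable with a little scripting, one practical safeguard is to scrub sensitive-looking numbers from text before pasting it anywhere. Below is a minimal, illustrative Python sketch of that idea; the patterns and placeholder labels are the author’s own assumptions, not taken from any of the sources, and a real tool would need far more careful matching.

```python
import re

# Illustrative patterns only: coarse matches for common sensitive numbers.
# Order matters — SSNs are replaced before the broader account pattern runs.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),        # Social Security numbers
    (re.compile(r"\b\d(?:[ -]?\d){12,18}\b"), "[ACCOUNT]"), # 13-19 digit card/account numbers
    (re.compile(r"\b\d{6,8}\b"), "[CODE]"),                 # one-time confirmation codes
]

def redact(text: str) -> str:
    """Replace sensitive-looking number runs with placeholder labels."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

print(redact("My SSN is 123-45-6789 and my card is 4111 1111 1111 1111."))
# Prints: My SSN is [SSN] and my card is [ACCOUNT].
```

Note that patterns this simple will both miss some formats and occasionally flag harmless numbers, so the safest habit remains the one above: leave the specifics out of the query in the first place.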
Sources
- Washington Post. “Column: Don’t tell your AI chatbot these 5 things to keep your money safe.” April 25, 2026.
- BBC. “I hacked ChatGPT and Google’s AI – and it only took 20 minutes.” February 18, 2026.
- NerdWallet. “Should You Use AI for Personal Finance? What to Consider and What to Avoid.” February 27, 2026.
- National Council on Aging. “Top 5 Financial Scams Targeting Older Adults.” March 17, 2026.