5 Things You Should Never Tell an AI Chatbot (To Protect Your Money)
AI chatbots have become a regular part of daily life. People use them to draft emails, summarize articles, brainstorm recipes, and even get financial advice. But as these tools spread, so do the risks of sharing too much.
A recent column in The Washington Post (April 2026) warned that many users are unknowingly handing over sensitive financial data to chatbots. The White Coat Investor also investigated the finance risks of talking to ChatGPT about money. And in September 2025, a U.S. Senate hearing examined real harm caused by AI chatbots, including cases that led to financial loss. The National Council on Aging has flagged that scams targeting older adults are becoming more sophisticated with AI.
None of this means you should stop using chatbots. It does mean you need to be careful about what you type into them.
Here are five categories of information you should never share with any AI chatbot.
1. Account Numbers and Passwords
This includes bank account numbers, credit card numbers, debit card details, and any login passwords or PINs. It might seem obvious, but people sometimes paste account numbers into chatbots when asking for help budgeting or disputing a transaction.
Chatbot conversations are stored by the companies that run them, and data breaches have exposed user chat logs in the past. Even if the company promises encryption, you have no control over how that data is handled later. Once your account number is out there, it can be used for unauthorized transactions or identity theft.
2. Your Full Social Security Number or Tax ID
Never type your Social Security number, national insurance number, or taxpayer ID into a chatbot. This number is the key to your financial identity. In the wrong hands, it can be used to open loans, file fake tax returns, or take over existing accounts.
Some chatbots now offer document analysis features, and users have tried uploading W-2s or tax returns. That is extremely risky. Even if the chatbot service is legitimate, your data might be used to train future models or leaked in a breach.
3. Answers to Security Questions
Security questions like “What was your mother’s maiden name?” or “What was the name of your first pet?” are common verification methods for banks and credit cards. If you share that information with a chatbot, you have essentially handed the answer to anyone who later gains access to the chat.
It is easy to forget that chatbots retain conversation context. You might chat casually about your childhood dog, and a scammer who later gains access to that history could use the detail to reset your password. It is safer to treat all such personal history as confidential.
4. Personal Contact Details
Your full home address, phone number, and primary email address are pieces of data that, when combined with other leaks, can enable phishing attacks or physical theft. Scammers can use your address and phone number to impersonate your bank or send fake alerts.
Some people ask chatbots to draft a letter that includes their return address. That is usually fine for a single use, but avoid storing that information in the chat history. If a chatbot service is compromised, those details can be scraped.
5. Scans of IDs, Passports, or Financial Documents
Many chatbots now accept image uploads. Users have tried uploading driver’s licenses, passports, mortgage statements, and bank statements. This is extremely dangerous. A scanned ID contains all the information needed for identity fraud: full name, date of birth, photo, address, and often a signature.
There is no practical reason to upload such a document to a general-purpose chatbot. If you need help understanding a bank statement, obscure the account numbers and personal details first. Better yet, use a dedicated offline tool or talk to a human.
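If you do want a chatbot's help with a document, one option is to scrub obvious identifiers locally before pasting anything. Below is a minimal illustrative sketch in Python; the patterns and labels are assumptions for demonstration, not a complete redaction solution, and real documents need more careful review than any regex can provide.

```python
import re

# Illustrative patterns only -- these catch common formats, not every variant.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # e.g. 123-45-6789
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),        # 13-16 digit card numbers
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # simple email addresses
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),  # e.g. 555-867-5309
}

def redact(text: str) -> str:
    """Replace sensitive-looking values with [LABEL] placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

sample = "Payment of $50 from account 4111 1111 1111 1111 failed; call 555-867-5309."
print(redact(sample))
# Payment of $50 from account [CARD] failed; call [PHONE].
```

Even with a scrubber like this, review the text yourself before pasting it anywhere: automated patterns miss names, addresses, and unusual number formats.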
What to Do If You Have Already Shared Something Sensitive
If you realize you have typed a bank account number, Social Security number, or other sensitive data into a chatbot, take these steps:
- Change any passwords or PINs associated with that information.
- Contact your bank or credit card company and ask them to place a fraud alert on your account.
- Consider freezing your credit if your Social Security number was exposed.
- If you used a free chatbot service, check whether you can delete your chat history. Many services allow you to clear conversation logs, but that does not guarantee the data is gone from their servers.
- Monitor your accounts closely for unusual activity.
Use Chatbots Cautiously
AI chatbots are useful tools, but they are not private vaults. Treat any chat the way you would treat a conversation in a public coffee shop—except that this one gets recorded and stored. Keep your financial and personal details to yourself. When in doubt, don’t type it.
Sources
- The Washington Post, “Don’t tell your AI chatbot these 5 things to keep your money safe,” April 2026.
- The White Coat Investor, “Is Talking to ChatGPT About Finance Ever a Good Idea?” June 2025.
- U.S. Senate hearing transcript, “Examining the Harm of AI Chatbots,” Tech Policy Press, September 2025.
- National Council on Aging, “Top 5 Financial Scams Targeting Older Adults,” March 2026.