5 things you should never tell a chatbot if you want to keep your money safe
AI chatbots like ChatGPT, Google Gemini, and Meta AI have become everyday tools for drafting emails, planning meals, and even managing personal finances. But as their popularity has grown, so have concerns about what happens to the information you share. Recent reports, including a Washington Post column on chatbot safety and a BBC investigation that showed how easily a researcher could extract data from these systems, make one thing clear: chatbots are not private vaults.
Here’s what you should avoid telling them—and what to do instead.
What happened
In early 2026, security researchers demonstrated that they could trick chatbots into revealing sensitive information they had absorbed during training. The BBC reported that a researcher managed to extract personal data from both ChatGPT and Google’s AI in just 20 minutes. Meanwhile, multiple consumer outlets have warned that users are casually sharing bank account numbers, Social Security numbers, and login credentials with chatbots—often without realizing that those conversations may be stored, reviewed, or used to improve the models.
Many chatbots also retain chat history by default, and some companies have been transparent about using anonymized user data for training. But “anonymized” does not always mean completely safe, especially when enough details can be pieced together.
Why it matters
A single careless prompt could leak information that puts your finances at risk. If a bad actor gains access to your chat history—via a data breach, phishing attack, or even a subpoena—they could see your account numbers, family details, or answers to security questions. Some chatbots are also vulnerable to prompt injection attacks, where malicious instructions are hidden in third-party content that the chatbot processes.
The National Council on Aging (NCOA) recently highlighted that older adults are increasingly targeted by scams that involve fake AI assistants or chatbot impersonations. If you have already shared sensitive data with a legitimate chatbot, you may be more likely to fall for a scam that refers to those details.
What you can do instead
Never share these five things with a chatbot
1. Full bank account or credit card numbers. Even when asking for budgeting help, write in general terms. For example, say “I have a credit card with a $5,000 limit” rather than typing out the 16-digit number.
2. Your Social Security number or government ID. No chatbot needs these to answer financial questions. If a chatbot asks for them to “verify your identity,” you are likely being phished.
3. Login credentials for any financial service. Never type a username or password into a chatbot, even when asking about a problem with an account. Use the official website or app’s support channels instead.
4. Investment strategies or inside information. Aside from privacy risks, chatbots can generate inaccurate financial advice. More importantly, sharing non-public information about a company could inadvertently create legal exposure.
5. Personal data about family members. Details like a spouse’s income, a child’s Social Security number, or a parent’s bank account do not belong in any conversation, even a general one. Scammers can stitch together fragments from multiple chats.
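For readers comfortable with a little code, the “write in general terms” rule can be partly automated: run any text through a scrubber before pasting it into a chatbot. The sketch below is a hypothetical helper, and its patterns and placeholder labels are illustrative assumptions, not a complete safeguard.

```python
import re

# Illustrative patterns for common financial identifiers. These are
# assumptions for demonstration -- they will not catch every format,
# so treat this as a first line of defense, not a guarantee.
PATTERNS = {
    r"\b\d(?:[ -]?\d){12,15}\b": "[CARD NUMBER]",   # 13-16 digit card-like runs
    r"\b\d{3}-\d{2}-\d{4}\b": "[SSN]",              # US Social Security format
    r"\b\d{9,12}\b": "[ACCOUNT NUMBER]",            # plain account-number-like runs
}

def redact(text: str) -> str:
    """Replace identifier-like digit runs with neutral placeholders."""
    for pattern, placeholder in PATTERNS.items():
        text = re.sub(pattern, placeholder, text)
    return text
```

Pattern order matters: the longest, most specific patterns run first so that, for example, a 16-digit card number is not partially consumed by the shorter account-number rule.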
Safer alternatives
- Use a dedicated password manager to store login credentials.
- For budgeting and financial planning, try offline tools like a spreadsheet or a locally installed program that does not send data to the cloud.
- If you need AI help with numbers, use a calculator feature that does not log history (or disable chat saving in the settings).
- Read the privacy policy of any chatbot you use. Look for options to delete past conversations and opt out of training data usage.
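The “offline tools” suggestion above does not have to mean a full budgeting app. A few lines of local code can total spending without anything leaving your machine. This sketch assumes a CSV exported from your bank with `date`, `category`, and `amount` columns, which is a hypothetical layout; real exports vary.

```python
import csv
from collections import defaultdict

def summarize(csv_path: str) -> dict:
    """Total spending per category from a local CSV file.

    Assumes columns named 'date', 'category', and 'amount'.
    Everything stays on your own disk -- nothing is uploaded.
    """
    totals = defaultdict(float)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["category"]] += float(row["amount"])
    return dict(totals)
```

Because the file is read and summarized locally, there is no chat history to delete and no training-data opt-out to hunt for.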
Sources
- Washington Post: “Don’t tell your AI chatbot these 5 things to keep your money safe”
- BBC: “I hacked ChatGPT and Google’s AI – and it only took 20 minutes”
- NerdWallet: “Should You Use AI for Personal Finance? What to Consider and What to Avoid”
- National Council on Aging: “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them”
Stay cautious. A chatbot is a powerful tool, but it is not a confidant. Treat a chatbot the way you would treat a stranger on the street: share only what is absolutely necessary.