5 things you should never tell your AI chatbot to protect your money

AI chatbots can be convenient for quick answers on budgeting, shopping, or investment questions. But the more personal details you share, the greater the risk that your financial information could end up in the wrong hands — whether through a data breach, a scammer manipulating the chatbot, or simply because the platform stores your conversations indefinitely.

Recent reporting by The Washington Post and investigations by outlets like the BBC have highlighted how easily chatbots can be tricked into revealing information, and how user data can be exposed. Below are five categories of information you should avoid sharing with any AI chatbot, along with safer alternatives.

What happened

In a recent column, The Washington Post warned that sharing certain details with AI assistants could put users at risk of financial fraud and identity theft. The advice is not hypothetical: researchers have shown that chatbots can be prompted to output sensitive data they were trained on, and security flaws have allowed hackers to extract user conversations. Meanwhile, the National Council on Aging has identified chatbot-assisted scams as a growing threat, particularly for older adults.

Why it matters

AI chatbots are not designed to be secure vaults for your personal information. Most major services store and analyze conversations to improve their models. If a service suffers a breach — or if you interact with a malicious chatbot posing as a legitimate one — the data you entered can be used to impersonate you, answer security questions, or target you with personalized scams. Unlike a password manager or encrypted notes app, a chatbot does not offer end-to-end encryption for your chat history.

What you can do

Follow these five rules to minimize the financial risk of using AI chatbots.

1. Social Security numbers and government IDs

Never enter your full Social Security number, tax ID, driver’s license number, or passport details. These are the keys to identity theft. Even if you trust the chatbot’s privacy policy, data can be leaked or subpoenaed. Instead, keep such numbers stored in a password manager or an encrypted document offline.

2. Bank account and credit card numbers

Avoid typing full account numbers, routing numbers, or credit card details. A chatbot might ask for them if you’re seeking personalized financial advice, but legitimate financial tools should never require you to paste sensitive numbers into a chat interface. Use your bank’s official app or website for transactions and account inquiries.

3. Login credentials and security answers

Do not share your passwords, PINs, or answers to common security questions (e.g., your mother’s maiden name, the street you grew up on, or your first pet’s name). These are exactly the pieces of information scammers use to reset your accounts. A chatbot has no way to encrypt or protect this data the way a dedicated password manager does.

4. Personal identity verification details

Your full birth date, home address, and phone number may seem harmless, but in combination they are enough to open credit lines or file fraudulent tax returns. If you need location-specific advice, you can give a general region (“Southeast U.S.”) instead of your exact address. For date calculations, provide only the month and year if necessary.

5. Detailed financial situation and assets

Describing your complete income, savings, investments, and debts in detail can be useful for planning, but it also creates a valuable profile for scammers. They can use that information to craft convincing phishing emails that reference your exact holdings or loan amounts. Instead, ask generic questions (“What are strategies for paying off debt?”) without attaching your personal numbers.
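If you do use a chatbot for money questions, you can make the redaction habit described in the rules above mechanical: scrub obvious sensitive patterns from your text before pasting it into a chat window. Here is a minimal Python sketch; the regex patterns are simple illustrations for common U.S. formats, not a complete PII scrubber, and the pattern names are our own.

```python
import re

# Illustrative patterns only (assumptions, not exhaustive):
# SSN   -> 123-45-6789 style
# CARD  -> 13-16 digits, optionally separated by spaces or hyphens
# PHONE -> 555-123-4567 style
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace anything matching a sensitive pattern with a placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

prompt = "My SSN is 123-45-6789 and my card is 4111 1111 1111 1111."
print(redact(prompt))
# My SSN is [SSN REDACTED] and my card is [CARD REDACTED].
```

A filter like this catches only well-formed numbers, so it complements — rather than replaces — the habit of leaving sensitive details out of prompts in the first place.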

Sources

  • The Washington Post, “Don’t tell your AI chatbot these 5 things to keep your money safe” (April 2026)
  • BBC, “I hacked ChatGPT and Google’s AI – and it only took 20 minutes” (February 2026)
  • NerdWallet, “Should You Use AI for Personal Finance? What to Consider and What to Avoid” (February 2026)
  • National Council on Aging, “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them” (March 2026)