Stop Telling Your AI Chatbot These 5 Things – Your Bank Account Will Thank You

AI chatbots have become everyday tools for quick answers, writing help, and even financial advice. But as their use grows, so does the risk of oversharing sensitive details that can be turned against you. A recent Washington Post column highlighted five categories of information you should never disclose to a chatbot—advice that’s worth taking seriously.

What Happened

The column, published in late April 2026, drew on reports of data leaks and social engineering attacks that exploit chatbot conversations. It pointed to a BBC investigation in which a reporter hacked ChatGPT and Google’s AI in just 20 minutes, demonstrating that even the largest platforms have real vulnerabilities. Meanwhile, NerdWallet and the National Council on Aging have warned that scammers are using AI to craft personalized phishing messages based on information users share with chatbots. The takeaway is clear: what you tell a chatbot can end up in the wrong hands.

Why It Matters

Your chatbot conversations may be stored, analyzed, or even accessed by third parties. Even if the platform promises encryption, no system is perfectly secure. If a hacker gets access to your chat history, they can gather enough personal data to answer security questions, impersonate you to your bank, or craft targeted scams. Older adults are especially at risk—financial scams targeting them have surged, and AI makes those scams more convincing. Protecting your money starts with knowing what not to say.

What Readers Can Do

Here are the five types of information you should never share with any AI chatbot, no matter how helpful it seems.

1. Sensitive identification numbers. Never type your Social Security number, tax ID, driver’s license number, passport number, or any PIN or password. These are the keys to your identity. A chatbot doesn’t need them, and if they’re compromised, recovery is a nightmare.

2. Bank and credit card details. Don’t tell a chatbot your account numbers, credit card numbers, or the CVV security code (three digits on the back of most cards, four on the front of American Express cards). Even if you’re asking for help budgeting or tracking spending, stick to general categories and round figures. The same goes for login credentials to your bank or investment accounts.

3. Answers to security questions. Avoid mentioning your mother’s maiden name, the street you grew up on, your first pet’s name, or any other common security question answer. Many banks and services use these for account recovery. If those answers sit in a chatbot’s logs, anyone who breaches those logs has them too.

4. Full name, home address, and birth date. Legitimate services sometimes need these details, but a chatbot does not. If you’re using a chatbot to draft a letter or plan an event, use placeholder text instead of your real address or birthday.

5. Details about recent financial transactions or assets. Don’t tell a chatbot that you just received a large inheritance, sold a house, or transferred a big sum. Scammers monitor for such clues and can use them to devise targeted schemes—like pretending to be your bank calling about that transaction.
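For readers comfortable with a little scripting, the placeholder advice above can even be automated: scrub sensitive-looking patterns out of text before pasting it into any chatbot. The sketch below is a minimal illustration only; the regular expressions, the placeholder labels, and the `redact` helper are assumptions made for this example, not a complete PII filter.

```python
import re

# Illustrative patterns only (an assumption for this sketch, not exhaustive):
# a US Social Security number, a 13-16 digit card number, and a short date.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[CARD_NUMBER]"),
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"), "[DATE]"),
]

def redact(text: str) -> str:
    """Replace sensitive-looking substrings with placeholder labels."""
    for pattern, label in PATTERNS:
        text = pattern.sub(label, text)
    return text

print(redact("My SSN is 123-45-6789 and my card is 4111 1111 1111 1111."))
# → My SSN is [SSN] and my card is [CARD_NUMBER].
```

A filter like this catches obvious formats, but it is no substitute for judgment: names, addresses, and "I just sold my house" have no regex, so the five rules above still apply.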

Sources

  • Washington Post, “Column | Don’t tell your AI chatbot these 5 things to keep your money safe” (April 25, 2026)
  • NerdWallet, “Should You Use AI for Personal Finance? What to Consider and What to Avoid” (February 27, 2026)
  • BBC, “I hacked ChatGPT and Google’s AI - and it only took 20 minutes” (February 18, 2026)
  • National Council on Aging, “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them” (March 17, 2026)