Keep Your Money Safe: Five Things Never to Tell Your AI Chatbot

If you use ChatGPT, Gemini, Copilot, or any other AI chatbot for daily tasks, you’re not alone. These tools are incredibly convenient for drafting emails, summarizing documents, or brainstorming ideas. But that convenience comes with a privacy catch: anything you type into a chatbot can be stored, analyzed, and, in some cases, exposed to third parties. As more people turn to AI for help with budgeting, shopping, or even tax prep, the risk of oversharing sensitive financial information has grown sharply.

Recent reporting in The Washington Post and investigations by the BBC show that AI chatbots can be hacked or manipulated to leak user data. A BBC journalist demonstrated in early 2026 that it took only 20 minutes to compromise both ChatGPT and Google’s AI. Meanwhile, NerdWallet’s finance experts warn that using AI for personal finance without careful boundaries can lead to identity theft and fraud. The National Council on Aging also lists AI-related scams as a growing threat to older adults.

Why It Matters

The core problem is simple: chatbots are not your private confidants. Most services log your conversations for model training, and even “encrypted” chats may be stored on company servers. If a chatbot is compromised, your conversation history can become a goldmine for identity thieves. Scammers are already using AI to craft convincing phishing messages, and if they obtain real answers to security questions or your account numbers, they can drain your bank account or open credit in your name before you notice.

What You Can Do – Five Rules for Safer Chatting

1. Never share passwords, PINs, or two-factor authentication codes.
This might seem obvious, but many users paste a temporary verification code into a chatbot while asking for help with a login problem. Once that code leaves your phone, it can be stored or intercepted. Treat any verification code as you would a key to your front door – never hand it over to a machine you don’t fully control.

2. Keep bank account and credit card numbers out of conversations.
Even if a chatbot offers to help you categorize spending or set a budget, never type your full account number. If the service is ever breached, a chat history containing account numbers gives thieves everything they need to make fraudulent transactions. Use a dedicated budgeting app that stores sensitive data locally or with strong encryption – not a general-purpose chatbot.

3. Do not disclose your Social Security number or tax ID.
Your Social Security number is the crown jewel for identity thieves. No legitimate chatbot needs it to help you with a task. If a chatbot asks for it – for example, pretending to be a tax assistant – stop immediately. Report the interaction to the service provider.

4. Avoid providing answers to common security questions.
Questions like “What is your mother’s maiden name?” or “What was your first pet’s name?” are used by banks and email providers to reset passwords. If you tell a chatbot those answers, you’ve effectively handed over the keys to your account recovery. If you need to remember that information, store it in a password manager, not in a chat log.

5. Don’t share your home address or real-time location.
While it might seem harmless to ask for restaurant recommendations near you, revealing your precise address can enable targeted scams. Scammers can combine location data with other leaked information to make calls or send letters that appear legitimate. Keep location sharing turned off in chatbot apps unless absolutely necessary, and never paste your full address into a conversation.
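One practical habit that supports all five rules: scrub text before you paste it into any chatbot. As a rough illustration, the short Python sketch below masks a few common patterns – a Social Security number, a long card-style digit run, and a six-digit verification code. The patterns and the `redact` helper are hypothetical examples for this article, not a complete redaction tool; real financial data comes in many more formats than these.

```python
import re

# Illustrative patterns only -- a real redaction tool needs far broader coverage.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # e.g. 123-45-6789
    "CARD": re.compile(r"\b(?:\d[ -]?){13,16}\b"),      # 13-16 digit card-style runs
    "CODE": re.compile(r"\b\d{6}\b"),                   # typical 6-digit 2FA code
}

def redact(text: str) -> str:
    """Replace anything matching a sensitive pattern with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

print(redact("My SSN is 123-45-6789 and my login code is 482913."))
# prints: My SSN is [SSN REDACTED] and my login code is [CODE REDACTED].
```

Even with a scrubbing step like this, the safest default is the one the rules above describe: if a piece of information could unlock an account, don’t paste it at all.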

The Bottom Line

Chatbots are useful tools, but they are not secure vaults. Treat every conversation as if it could be read by a stranger tomorrow. If you’re unsure whether a piece of information is safe to share, err on the side of caution – keep it offline. Your financial security is worth a little extra inconvenience.

Sources

  • The Washington Post, “Don’t tell your AI chatbot these 5 things to keep your money safe” (April 2026)
  • BBC, “I hacked ChatGPT and Google’s AI – and it only took 20 minutes” (February 2026)
  • NerdWallet, “Should You Use AI for Personal Finance? What to Consider and What to Avoid” (February 2026)
  • National Council on Aging, “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them” (March 2026)