Don’t Tell Your AI Chatbot These 5 Things to Keep Your Money Safe

AI chatbots like ChatGPT, Google Gemini, and Microsoft Copilot have become part of daily life for millions of people. They help draft emails, summarize articles, and answer questions. But their convenience hides a real financial risk: every piece of information you type into a chatbot could be stored, shared, or used against you. A recent column in The Washington Post highlighted five specific things you should never share with an AI chatbot if you want to keep your money safe. Here’s what you need to know.

What Happened

The April 2026 Washington Post column (reported by Reuters and other outlets) laid out a growing problem: AI chatbots are not designed to handle sensitive financial data the way your bank’s app or encrypted messaging service is. Unlike a password manager or a dedicated banking platform, most consumer chatbots store your conversations, use them to train future models, and can be vulnerable to “prompt injection” attacks—where an outsider tricks the bot into revealing what you typed earlier.

Separately, the BBC reported in February 2026 that a security researcher hacked ChatGPT and Google’s AI in just 20 minutes using a simple prompt injection technique. The National Council on Aging (NCOA) has also warned about AI-driven financial scams targeting older adults, and NerdWallet published guidance in February 2026 questioning whether consumers should use AI for personal finance tasks at all.

Why It Matters

The core problem is trust without understanding. Most users assume an AI chatbot is as private as a search engine. It’s not. Your conversation history may be retained by the company, and in some cases, it can be recalled or leaked through bugs or attacks.

Scammers are also getting smarter. A chatbot that knows your bank name, your financial concerns, or details from a tax document can be a goldmine for targeted phishing messages. A fraudster could send you an email that references something you asked the chatbot about, making the scam feel personal and credible. The risk isn’t just that the bot itself will steal from you; it’s that your data could be used to manipulate you later.

What Readers Can Do

Based on the Washington Post column and supporting research, here are the five things you should never tell a chatbot:

1. Your bank account numbers, credit card details, or Social Security number.
No legitimate financial task requires you to paste your account number into a chat interface. If a chatbot asks for it, stop. Your bank’s official app or website is the only place to handle such data.

2. Passwords or security answers.
Never type a password, PIN, or the answer to a security question (like your mother’s maiden name) into a chatbot. Even if the bot is just helping you “organize your passwords,” the information can be stored indefinitely.

3. Access to your financial accounts.
Do not give a chatbot “read-only” access to your bank or investment accounts via plugins or integrations. It’s unclear how those permissions are protected. A better approach: use your bank’s own tools or a dedicated, audited budgeting app.

4. Uploaded documents like tax returns, bank statements, or pay stubs.
Many chatbots now accept file uploads. A tax return contains enough personal data (name, address, employer, income, dependents) to enable identity theft. Upload only if you are using a tool specifically designed for secure document processing, and even then, verify its privacy policy.

5. Personal identifiers that can be used to impersonate you.
Full name combined with date of birth, home address, or driver’s license number is enough for fraudsters to open accounts or reset passwords. If you need help with a document that contains such info, redact the sensitive parts first.

Beyond the five items, consider a simple rule: if you would not post the information on a public forum, do not type it into a chatbot. Use encrypted notes apps or password managers for sensitive data, and stick to your bank’s official channels for financial tasks. For general advice, treat the chatbot as a helpful but leaky companion—useful for brainstorming, not for storing secrets.

Sources

  • Washington Post column (April 2026) on five things not to tell an AI chatbot.
  • BBC article: “I hacked ChatGPT and Google’s AI - and it only took 20 minutes” (February 2026).
  • NCOA: “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them” (March 2026).
  • NerdWallet: “Should You Use AI for Personal Finance?” (February 2026).
  • Wall Street Journal: “We Let AI Run Our Office Vending Machine. It Lost Hundreds of Dollars” (December 2025).