Don’t Tell Your AI Chatbot These 5 Things to Keep Your Money Safe

AI chatbots like ChatGPT, Gemini, and Copilot are convenient tools for quick answers, drafting messages, or brainstorming ideas. More people are also turning to them for financial questions: budgeting tips, investment basics, or even help with tax forms. But that convenience comes with real privacy and security risks. Here’s what you should never share with an AI chatbot if you want to keep your money safe.

What happened

Over the past year, cybersecurity researchers have demonstrated that chatbots can be tricked into revealing information they were trained not to share. In one widely reported test, a researcher extracted personal data from ChatGPT and Google’s AI in about 20 minutes using prompt injection, a technique in which carefully crafted input overrides a model’s built-in instructions. The BBC documented that experiment in early 2026.

More broadly, a recent column in The Washington Post highlighted five specific categories of financial information that users should never type into a chatbot. The concern isn’t just about the current interaction—chat histories are often stored by providers, and if that data is breached, your sensitive details could end up in the wrong hands.

Why it matters

Financial account numbers, Social Security numbers, passwords, PINs, and detailed transaction information are the raw materials for identity theft and fraud. Scammers don’t need to break into a bank’s server if they can find your login credentials in a leaked chat log or trick you into entering them into a fake interface.

Even when chatbots are secure, the companies behind them may use conversation data to train their models or share it with third parties under certain policies. Privacy settings may not be as strict as you assume. And because chatbots are designed to be helpful, they may answer follow-up questions that piece together fragments of your financial life.

What readers can do

The advice is straightforward: treat AI chatbots like a stranger on the internet. You wouldn’t hand your bank card to a random person in a coffee shop, so don’t hand over the equivalent information to a chatbot. Here are the five things to avoid sharing:

  1. Full financial account numbers – Bank accounts, credit cards, investment accounts. A chatbot doesn’t need these to give you general advice, and if a service asks for them, it’s likely a scam.

  2. Social Security or taxpayer ID numbers – Never type these into a general-purpose chatbot. Even if you’re asking about tax forms, use official IRS or government tools designed for that purpose.

  3. Passwords and security credentials – Obvious, but repeated often because people still do it. A chatbot cannot securely store or rotate your passwords.

  4. Specific transaction details – Don’t describe a recent purchase, transfer, or payment in a way that includes amounts, dates, or merchant names. That information can be used to impersonate you or to make a fraudulent request look legitimate.

  5. Personal identification numbers (PINs) – Whether for your debit card, phone, or safe, PINs should remain in your memory, not in a chat log.

Beyond these five, a good rule of thumb is to keep any conversation about your personal finances generic. Ask “What factors affect credit scores?” rather than “Can I get approved for a loan with my $45,000 income and 680 score?” Use official banking apps or websites for actual transactions and for viewing account balances. Several personal finance experts at NerdWallet have also advised caution when using AI for financial planning, noting that chatbots are not regulated advisors and may provide incorrect or misleading information.
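For readers comfortable with a little scripting, the “keep it generic” habit can even be partly automated. The sketch below is purely illustrative (the `scrub` helper and its patterns are an assumption of this article, not from any cited source): it blanks out SSN-style numbers, long card or account digit runs, and dollar amounts before text is pasted into a chatbot. A real redaction tool would need far broader coverage.

```python
import re

# Illustrative patterns only -- not an exhaustive redaction list.
PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),            # SSN-style numbers
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[CARD/ACCOUNT]"),  # long digit runs
    (re.compile(r"\$\s?\d[\d,]*(?:\.\d{2})?"), "[AMOUNT]"),     # dollar amounts
]

def scrub(text: str) -> str:
    """Replace obvious financial identifiers with placeholders."""
    for pattern, placeholder in PATTERNS:
        text = pattern.sub(placeholder, text)
    return text

print(scrub("My SSN is 123-45-6789 and I spent $1,250.00 on card 4111 1111 1111 1111."))
```

Running a question through a scrubber like this before sending it keeps the conversation useful while leaving out the details a scammer could exploit.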

Finally, review your chatbot settings. Most platforms allow you to delete past conversations or turn off chat history. Enabling those options reduces the risk of your data lingering on servers.

Sources

  • The Washington Post, “Column: Don’t tell your AI chatbot these 5 things to keep your money safe” (April 2026)
  • BBC, “I hacked ChatGPT and Google’s AI - and it only took 20 minutes” (February 2026)
  • NerdWallet, “Should You Use AI for Personal Finance? What to Consider and What to Avoid” (February 2026)
  • The National Council on Aging, “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them” (March 2026)