Don’t Tell Your AI Chatbot These 5 Things to Protect Your Money

Introduction

Chatbots based on large language models—like ChatGPT, Google Gemini, or Microsoft Copilot—have become a go‑to for quick answers, creative writing, and even casual conversation. But as their use has soared, so have concerns about what users are unwittingly handing over. A recent column in The Washington Post warns that revealing certain types of information to an AI assistant can put your finances at risk. This is not about the chatbots malevolently stealing data; it’s about how shared information can be stored, accessed by others, or used to impersonate you. Below is a practical look at what you should avoid telling any AI chatbot, and why.

What happened

In April 2026, The Washington Post published a column titled “Don’t tell your AI chatbot these 5 things to keep your money safe.” The piece draws attention to a growing blind spot among consumers: many people treat chatbots like a private diary or a financial advisor, when in reality the conversation logs may be retained by the provider, exposed in data breaches, or even used to train future models. Around the same time, a BBC reporter demonstrated how, with some clever prompting, they could trick ChatGPT and Google’s AI into revealing information that should have been protected—all within 20 minutes. That kind of vulnerability isn’t hypothetical; it points to real risks when you share sensitive numbers or documents.

Why it matters

The average person is not a security researcher. You might ask a chatbot to draft a budget, make sense of a bank statement, or suggest investment moves. But chatbots are not built to handle confidential data with the same safeguards as a bank or tax preparer. Many AI services store your conversations to improve performance, and those logs could be subpoenaed or leaked. Even if the provider has strong protections, there is always the risk that an attacker can trick the chatbot into revealing what you typed, as the BBC experiment showed.

Sharing your Social Security number, bank account details, or full tax documents with a chatbot could lead to identity theft, account takeover, or financial scams. The National Council on Aging notes that older adults are especially targeted, but younger users are not immune. And because chatbots can confidently produce wrong advice, relying on them for specific financial decisions can cost you money.

What readers can do

The core advice comes down to five “don’ts,” each backed by the reasoning that any data you share with an AI assistant is data that could eventually leave your control. Here they are, with a little more context.

1. Never share your Social Security number or tax ID.
Even if the chatbot is “helping” you with a tax question or a benefit application, don’t type in that nine‑digit number. The same goes for your driver’s license number or passport number. These are the keys to your identity, and once they are in a chatbot’s logs, you lose control.

2. Avoid entering bank account numbers, credit card details, or full passwords.
You might be tempted to ask a chatbot to “remind” you of your routing number or to store a credit card for future reference. Resist. If the chatbot’s service is breached, that data could be exposed. Even partial numbers can be dangerous. Passwords, especially, should never be shared with any third party.

3. Do not rely on the chatbot for specific, personalized financial advice.
Chatbots can give general tips, but they are not certified financial planners. They can make arithmetic errors, operate from outdated information, or misunderstand your situation. NerdWallet’s analysis of AI for personal finance cautions that the advice is often too generic and sometimes flat‑out wrong. Use the chatbot for education, but verify with a human professional before making moves.

4. Don’t upload sensitive documents like bank statements, tax returns, or pay stubs.
Some chatbots let you upload PDFs for analysis. That convenience is tempting, but you are effectively handing over a copy of your most private financial records. Even if the service promises encryption, the risk of a breach or misuse remains. Keep those files in a secure drive or with a trusted tax preparer.
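For readers comfortable with a little scripting, one practical middle ground is to scrub identifiers from text before pasting it into any chatbot. The short Python sketch below is a hypothetical helper, not something from the column: it masks Social-Security-style numbers and long digit runs (which cover most account and card numbers) so they never leave your machine. The patterns are illustrative, not exhaustive.

```python
import re

# Illustrative patterns for common sensitive identifiers:
# SSN-style numbers (e.g. 123-45-6789) and any standalone run of
# nine or more digits, which covers most bank account and card numbers.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
LONG_DIGITS = re.compile(r"\b\d{9,}\b")

def redact(text: str) -> str:
    """Mask SSN-like and long numeric identifiers before sharing text."""
    text = SSN_PATTERN.sub("[SSN REDACTED]", text)
    text = LONG_DIGITS.sub("[NUMBER REDACTED]", text)
    return text

sample = "My SSN is 123-45-6789 and my account number is 000123456789."
print(redact(sample))
# -> My SSN is [SSN REDACTED] and my account number is [NUMBER REDACTED].
```

Even a simple filter like this will not catch everything (names, addresses, and partial numbers slip through), so it complements rather than replaces the basic rule: keep the original documents off the chatbot entirely.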

5. Beware of fake chatbots designed to steal information.
Scammers are now creating “customer support” chatbots that impersonate banks, utilities, or government agencies. You might encounter one on a random website or in a phishing email. Always verify the URL and the service before typing anything. If a chatbot pops up and asks for your account number out of the blue, it is almost certainly a trap.

Sources

  • The Washington Post (April 2026): “Don’t tell your AI chatbot these 5 things to keep your money safe.”
  • BBC (February 2026): “I hacked ChatGPT and Google’s AI – and it only took 20 minutes.”
  • The National Council on Aging (March 2026): “Top 5 Financial Scams Targeting Older Adults and How to Avoid Them.”
  • NerdWallet (February 2026): “Should You Use AI for Personal Finance? What to Consider and What to Avoid.”