Don’t Tell Your AI Chatbot These 5 Things If You Want to Keep Your Money Safe
AI chatbots like ChatGPT, Gemini, and Claude have become everyday tools for drafting emails, summarizing documents, and even helping with financial questions. But as their popularity surges, so does the risk of inadvertently handing over information that can be used to steal your money or identity. A recent column in The Washington Post (April 25, 2026) highlighted five categories of data you should never share with a chatbot. Here’s what you need to know—and what to do instead.
What happened
Security researchers and consumer advocates have been sounding alarms about the risks of oversharing with AI assistants. A BBC investigation in February 2026 showed that a researcher could trick ChatGPT and Google’s Gemini into revealing sensitive information in under 20 minutes. Meanwhile, NerdWallet’s guide on using AI for personal finance warns that chatbots are not designed to handle confidential data securely—prompts are stored and can be reviewed by the companies or accessed in a breach.
The Washington Post column distilled expert guidance into a clear list: do not share your Social Security number, bank details, answers to security questions, scans of identification documents, or a combination of your full name, address, and birth date. Add to that the risk of sharing investment goals or portfolio details, which scammers can use for targeted fraud.
Why it matters
Most users assume their chat history is private. But many AI platforms log conversations to improve models, and some allow human reviewers to read them. If a data leak occurs—or if a scammer gains access to your account—the collected information becomes a goldmine.
For example, if you’ve told a chatbot your mother’s maiden name or the name of your first pet, that information can be used to reset passwords on bank or email accounts. Similarly, uploading a photo of your driver’s license or a tax return means you’ve handed over a complete identity kit. The National Council on Aging lists these exact tactics in its report on scams targeting older adults, noting that fraudsters increasingly incorporate data from AI conversations into their schemes.
Perhaps most troubling: chatbots can be tricked into revealing things they “remember” from past conversations, as the BBC demonstrated. Even if you think you’ve deleted a conversation, the company may still have logs.
What you can do: the 5 things not to share
1. Sensitive personal identifiers – Never type your Social Security number, bank account number, or full credit card details into a chatbot. If you need help tracking spending or organizing accounts, use a password manager or dedicated budgeting app instead.
2. Answers to common security questions – Avoid providing your mother’s maiden name, your pet’s name, your high school mascot, or any fact that banks and email providers use to verify your identity. A chatbot has no business storing these.
3. Scanned documents or photos – Do not upload copies of your passport, driver’s license, tax returns, or bank statements for “analysis” by an AI. The risk of a leak is real, and the data can be used for identity theft or to craft convincing phishing emails.
4. Combined personal details – Your name and address alone may not be dangerous, but adding your birth date and phone number creates a near-complete identity profile. Scammers can piece together enough to open accounts in your name. Keep these details out of your chatbot conversations.
5. Financial goals and investment details – Telling a chatbot about your retirement savings, your income, or your portfolio holdings might seem harmless, but it gives scammers material for highly personalized attacks. For instance, they could pose as a financial advisor and reference your specific goals. Stick to publicly available information if you must discuss hypothetical scenarios.
A quick checklist
- Review your recent chatbot conversations for any of the above.
- If you find sensitive data, delete the chat (but know the company may still retain logs—check their privacy policy).
- Use a dedicated, encrypted service for any financial calculations or document storage.
- Consider using a pseudonym when interacting with AI for non-critical tasks.
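If you want to put the first checklist item into practice at scale, a simple pattern scan can flag the most obvious identifiers before a prompt leaves your machine. The sketch below is purely illustrative: the patterns (SSN format, long card-like digit runs, slash-formatted dates) are assumptions chosen for this example, not a complete PII detector, and real tools catch far more.

```python
import re

# Illustrative PII patterns only -- a real scanner would cover many more
# formats (IBANs, passport numbers, addresses, etc.).
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "date of birth": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def find_pii(text):
    """Return (label, matched_text) pairs for any PII-like strings found."""
    hits = []
    for label, pattern in PII_PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((label, match.group()))
    return hits

prompt = "My SSN is 123-45-6789, can you help me file taxes?"
for label, value in find_pii(prompt):
    # prints: Warning: prompt contains a possible SSN: 123-45-6789
    print(f"Warning: prompt contains a possible {label}: {value}")
```

A check like this could run as a pre-send hook in any tool that forwards text to a chatbot; anything it flags should be removed or replaced with a placeholder before the prompt is sent.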
Sources
- Washington Post column, “Don’t tell your AI chatbot these 5 things to keep your money safe,” April 25, 2026.
- NerdWallet, “Should You Use AI for Personal Finance? What to Consider and What to Avoid,” February 27, 2026.
- BBC, “I hacked ChatGPT and Google’s AI – and it only took 20 minutes,” February 18, 2026.
- National Council on Aging, “Top 5 Financial Scams Targeting Older Adults,” March 17, 2026.
AI chatbots are powerful tools, but they are not vaults. Treat every prompt as something that could become public someday. Your money—and your identity—will thank you.