Privacy Groups Warn: HUD’s AI Tool Could Expose Your Sensitive Data – What to Know

If you receive housing assistance from the U.S. Department of Housing and Urban Development (HUD), an experimental AI system could soon analyze your personal information – without clear privacy safeguards. That’s why two leading digital rights organizations, the Electronic Privacy Information Center (EPIC) and the Center for Democracy & Technology (CDT), are urging HUD to abandon the tool before it goes live.

What Happened

On May 6, 2026, EPIC and CDT sent a joint letter to HUD calling on the agency to halt its proposed AI tool. According to EPIC’s press release, the system would mine data from HUD’s housing assistance programs – including financial records, medical information, and housing history – to predict outcomes or automate decisions. The groups argue that HUD has not provided enough detail about how the AI works, what data it uses, or how it will protect people’s privacy.

The exact scope of the tool is still unclear. HUD has not published a full privacy impact assessment or opened the system for public comment in a meaningful way. EPIC and CDT say this lack of transparency is a red flag, especially when the data involved is already considered sensitive under federal law.

Why It Matters for Consumers

For millions of Americans who rely on HUD programs – such as Section 8 vouchers, public housing, or rental assistance – this isn’t a theoretical issue. The data at risk could include:

  • Income and employment details used to determine eligibility.
  • Medical or disability information submitted as part of housing applications.
  • Eviction history and other records that could affect future housing opportunities.
  • Personally identifiable information like Social Security numbers and addresses.

If an AI system mishandles such data, the consequences could be serious: denial of benefits, increased scrutiny, or discrimination without clear recourse. Privacy advocates also worry about the precedent this sets. Once a federal agency deploys a data-hungry AI tool, it may be difficult to roll back, and other agencies could follow suit with similar systems.

The groups also point out that AI tools in housing decisions have a documented history of bias. Without independent oversight and civil rights testing, a flawed algorithm could harm the very people HUD is supposed to help.

What You Can Do

Even though the decision rests with HUD, consumers have options for making their voices heard:

  1. Stay informed. Follow EPIC, CDT, and other consumer advocacy organizations for updates on AI use in federal agencies. Their websites and newsletters often provide concise summaries of pending proposals.

  2. Contact your representatives. Your U.S. House member and senators can ask HUD to pause the tool or demand a public comment period. A short email or phone call referencing “HUD’s proposed AI data tool” can help raise the issue.

  3. Review your rights. If you are a HUD program participant, you have privacy protections under the Privacy Act of 1974 and anti-discrimination protections under the Fair Housing Act. Report any unusual requests for data or unexpected changes in your benefits to a legal aid provider or HUD’s Office of Fair Housing and Equal Opportunity.

  4. Support organizations pushing for stronger oversight. Nonprofits like EPIC and CDT rely on public support to keep advocating for transparency. Even a small donation or sharing their content helps amplify their message.

Sources

  • EPIC press release, “EPIC, CDT Urge HUD to Abandon Proposed AI Tool That Would Use Sensitive Data,” May 6, 2026. Published on the EPIC website.