Privacy Groups Warn HUD’s AI Tool Would Put Your Sensitive Data at Risk

The U.S. Department of Housing and Urban Development is considering an artificial-intelligence tool that would analyze sensitive personal data—including income, rental history, and benefit records—to detect fraud and determine program eligibility. Two leading digital-rights organizations, the Electronic Privacy Information Center (EPIC) and the Center for Democracy & Technology (CDT), have jointly urged HUD to abandon the project, warning that it lacks basic transparency and accountability measures. If you receive housing assistance or rent in federally subsidized housing, this matters to you.

What Happened

On May 6, 2026, EPIC and CDT sent a public letter to the HUD Secretary opposing the agency’s proposed AI system. According to the letter, the tool would ingest data from multiple government databases—such as income reported to the IRS, Social Security records, and local housing authority files—and use machine-learning algorithms to flag potential fraud or changes in eligibility. HUD has described the tool as a way to modernize oversight and reduce improper payments, but the privacy groups argue that the system is being developed without a meaningful privacy impact assessment, a clear explanation of how decisions are made, or sufficient safeguards against errors and misuse.

The letter specifically criticizes HUD for not publishing a formal privacy impact assessment before moving forward. EPIC and CDT also note that similar AI tools at other agencies, including the Social Security Administration and the IRS, have faced legal challenges over accuracy, bias, and data handling.

Why It Matters

Government use of AI to process sensitive personal data is not new, but the scale and sensitivity of the data HUD intends to combine raise several distinct concerns.

Lack of transparency. It is unclear how HUD’s algorithm would weigh different data points, what thresholds would trigger alerts, or how individuals could challenge an automated decision. If the system incorrectly flags someone as ineligible or as committing fraud, the person may never learn why—or even that an automated system made the call.

Potential for bias. Machine-learning models trained on historical housing data can replicate existing patterns of discrimination, including those based on race, income, or disability status. Without independent testing, there is no guarantee the tool would treat all applicants fairly.

Data security and scope. Aggregating income, rental history, and benefit records into a single system creates a large, attractive target for data breaches. It also raises questions about how long data would be kept, who else might access it, and whether the tool could eventually be used for purposes beyond fraud detection—such as immigration enforcement or law enforcement.

Precedent for other agencies. If HUD proceeds with minimal safeguards, other federal and state agencies may follow suit, using similar AI tools with even broader data sets and fewer protections. The EPIC–CDT letter frames this as a critical moment to establish baseline privacy requirements for administrative AI.

The groups are not arguing that HUD should never use technology to improve efficiency. They are arguing that any such system must be transparent, subject to public comment and independent testing, and include a clear appeals process for people affected by its decisions.

What Readers Can Do

If you are concerned about the use of AI to process your housing data, there are a few practical steps you can take.

  • Submit a public comment. HUD may still be accepting comments on this proposal through the federal rulemaking process. Check HUD’s docket at Regulations.gov by searching for “HUD AI” or the relevant docket number (often listed in press releases from EPIC or CDT). Public comments are an effective way to press for transparency and accountability.

  • Contact your representatives. A short email or phone call to your U.S. Representative and Senators can help elevate the issue. You can mention that privacy groups have raised concerns and ask them to request a formal privacy impact assessment before HUD deploys the tool.

  • Stay informed. Follow organizations like EPIC and CDT for updates. They regularly share opportunities for public input and publish summaries of agency actions.

  • Review your own data. If you receive housing assistance, you have the right to request your records from your local housing authority. Familiarize yourself with what data is held and who it might be shared with. You can also check HUD’s privacy notices to see if the agency has disclosed plans for this AI tool.

Sources

  • EPIC, “EPIC, CDT Urge HUD to Abandon Proposed AI Tool That Would Use Sensitive Data,” May 6, 2026. [Link to EPIC press release]
  • Center for Democracy & Technology, joint letter to HUD Secretary, May 2026.
  • HUD proposed AI tool description (available via Regulations.gov, docket number to be confirmed).

Note: The specific docket number for public comments was not available at the time of writing. Check Regulations.gov or EPIC’s website for the exact link.