The Department of Justice announced today that it has entered into a settlement agreement resolving allegations that Meta Platforms Inc., formerly known as Facebook Inc., engaged in discriminatory advertising in violation of the Fair Housing Act (FHA).
The proposed agreement resolves a lawsuit filed today in the U.S. District Court for the Southern District of New York alleging that Meta’s housing advertising system discriminates against Facebook users based on their race, color, religion, sex, disability, familial status, and national origin. The settlement will not take effect until approved by the court.
Among other things, the complaint alleges that Meta uses algorithms in determining which Facebook users receive housing ads and that those algorithms rely, in part, on characteristics protected under the FHA. This is the department’s first case challenging algorithmic bias under the Fair Housing Act.
Under the settlement, Meta will stop using an advertising tool for housing ads (known as the “Special Ad Audience” tool) that, according to the department’s complaint, relies on a discriminatory algorithm. Meta also will develop a new system to address racial and other disparities caused by its use of personalization algorithms in its ad delivery system for housing ads. That system will be subject to Department of Justice approval and court oversight.
This settlement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.
“As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner,” said Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division. “This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit. The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”
“When a company develops and deploys technology that deprives users of housing opportunities based in whole or in part on protected characteristics, it has violated the Fair Housing Act, just as when companies engage in discriminatory advertising using more traditional advertising methods,” said U.S. Attorney Damian Williams for the Southern District of New York. “Because of this ground-breaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination. But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
“It is not just housing providers who have a duty to abide by fair housing laws,” said Demetria McCain, the Principal Deputy Assistant Secretary for Fair Housing and Equal Opportunity at the Department of Housing and Urban Development (HUD). “Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable. This type of behavior hurts us all. HUD appreciates its continued partnership with the Department of Justice as they seek to uphold our country’s civil rights laws.”
The United States’ complaint challenges three key aspects of Meta’s ad targeting and delivery system. Specifically, the department alleges that:

- Meta enabled and encouraged advertisers to target their housing ads by relying on FHA-protected characteristics to decide which Facebook users would be eligible, and ineligible, to receive housing ads;
- Meta created an ad targeting tool, known as “Lookalike Audience” or “Special Ad Audience,” that uses a machine-learning algorithm to find Facebook users who “look like” an advertiser’s source audience and that relies in part on FHA-protected characteristics; and
- Meta’s ad delivery system uses machine-learning algorithms that rely in part on FHA-protected characteristics to determine which subset of an advertiser’s targeted audience actually receives a housing ad.
The complaint alleges that Meta has used these three aspects of its advertising system to target and deliver housing-related ads to some Facebook users while excluding other users based on FHA-protected characteristics.
The department’s lawsuit alleges both disparate treatment and disparate impact discrimination. The complaint alleges that Meta is liable for disparate treatment because it intentionally classifies users on the basis of FHA-protected characteristics and designs algorithms that rely on users’ FHA-protected characteristics.
The department further alleges that Meta is liable for disparate impact discrimination because the operation of its algorithms affects Facebook users differently on the basis of their membership in protected classes.
These are the key features of the parties’ settlement agreement:

- Meta will stop using the “Special Ad Audience” tool for housing ads.
- Meta will develop a new system to address racial and other disparities caused by its personalization algorithms in its ad delivery system for housing ads, and that system will be subject to Department of Justice approval and court oversight.
- If Meta fails to demonstrate that the new system sufficiently guards against algorithmic bias, the settlement agreement will terminate and the United States will proceed with the litigation.
The Justice Department’s lawsuit is based in part on an investigation and charge of discrimination by HUD, which found that all three aspects of Meta’s ad delivery system violated the Fair Housing Act.
When Facebook elected to have the HUD charge heard in federal court, HUD referred the matter to the Justice Department for litigation.
This case is being handled jointly by the Justice Department’s Civil Rights Division and the U.S. Attorney’s Office for the Southern District of New York.
Assistant Attorney General Kristen Clarke and U.S. Attorney Damian Williams thanked the Department of Housing and Urban Development for its efforts in the investigation.
The Fair Housing Act prohibits discrimination in housing on the basis of race, color, religion, sex, familial status, national origin, and disability. More information about the Civil Rights Division and the laws it enforces is available at www.justice.gov/crt.
More information about the U.S. Attorney’s Office for the Southern District of New York is available at www.justice.gov/usao-sdny.
Individuals who believe they have been victims of housing discrimination may submit a report online at www.civilrights.justice.gov or may contact the Department of Housing and Urban Development at 1-800-669-9777 or through its website at www.hud.gov.