Meta agrees to stop algorithmic discrimination in housing ads

The housing discrimination lawsuit, filed by the government in 2019, accused Meta of enabling and encouraging advertisers to target their housing ads by relying on race, colour, religion, sex, disability, familial status, and national origin to decide which Facebook users would be eligible, or ineligible, to receive housing ads.

Meta (formerly Facebook) has agreed to settle a lawsuit with the US government that accused the social network of engaging in discriminatory advertising for housing on its platform.

Under the settlement, Meta will stop using an advertising tool for housing ads which relies on a discriminatory algorithm to find users who "look like" other users based on Fair Housing Act (FHA)-protected characteristics.

Meta also will develop a new system over the next six months to address racial and other disparities caused by its use of personalisation algorithms in its ad delivery system for housing ads, the US Department of Justice said in a statement late on Tuesday.

If the US concludes that the new system adequately addresses the discriminatory delivery of housing ads, then Meta will implement the system, which will be subject to Department of Justice approval and court oversight.

However, if the government finds that the new system is insufficient to address algorithmic discrimination in the delivery of housing ads, then the settlement agreement will be terminated.

"Because of this ground-breaking lawsuit, Meta will -- for the first time -- change its ad delivery system to address algorithmic discrimination," said US Attorney Damian Williams.

"But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation," Williams added.

Meta's ad delivery system used machine-learning algorithms that rely in part on FHA-protected characteristics -- such as race, national origin, and sex -- to help determine which subset of an advertiser's targeted audience will actually receive a housing ad.

"As technology rapidly evolves, companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner," said Assistant Attorney General Kristen Clarke.

This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit.

"The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalised communities," Clarke added.
