Facebook denies its algorithms helped advertisers carry out housing discrimination but settles with the US govt regardless
Meta, formerly Facebook, has settled a lawsuit with the US government that accused the company of helping advertisers discriminate against specific audiences when showing housing ads. Under the settlement, Meta must stop using the old algorithms that discriminated against protected groups and replace them with an algorithm that aims to “address racial and other disparities” in Meta’s ad delivery system. The company will also pay a fine of US $115,054.
According to the US Department of Justice (DOJ), Meta’s Special Ad Audiences tool and its associated algorithm made it possible for advertisers to filter people based on their race, sex, and national origin, showing housing ads only to certain demographics. The DOJ claims that this ad delivery practice violates the Fair Housing Act.
Per The Verge, although Meta has agreed to a settlement, the company maintains that it did nothing wrong, and the settlement agreement is not an admission of guilt. Meta must develop the new system and prove that it works by December 2022. Once the new system gets the green light from the government, a third party will monitor how the system behaves and whether it serves ads fairly.
Meta issued a statement on Tuesday outlining the steps it is taking to resolve the issue. According to the statement, the new system will use machine learning to ensure that the people seeing the ads are the intended target audience.