
DOJ and Meta settle landmark housing discrimination case


Facebook owner Meta agreed to revamp the social network’s targeted advertising system in a sweeping settlement with the U.S. Department of Justice, after the company was accused of allowing landlords to market their housing ads in a discriminatory way.

The settlement stems from a 2019 Fair Housing Act lawsuit brought by the Trump administration and marks the second time the company has agreed to change its advertising system to prevent discrimination. But Tuesday’s settlement went further than the first, requiring Facebook to overhaul its powerful internal ad-targeting tool, Lookalike Audiences. Government officials said the product facilitated housing discrimination by allowing advertisers to target housing-related ads by race, gender, religion or other sensitive characteristics.

Under the settlement, Facebook will build a new automated advertising system that the company says will help ensure housing-related ads are served to a fairer mix of the population. The social media giant must submit the system to a third party for review, the settlement said. Facebook, which changed its name to Meta last year, also agreed to pay a $115,054 civil penalty, the maximum allowed under the law.

“This settlement is historic and marks the first time Meta has agreed to terminate one of its algorithmic targeting tools and revise its housing ad serving algorithm in response to civil rights litigation,” said Kristen Clarke, Assistant Attorney General for the Justice Department’s Civil Rights Division.

Facebook spokesman Joe Osborne said advertisers will still be able to target their ads to users in specific locations, though not by zip code alone, and based on a limited set of interests.

Roy Austin, Facebook’s vice president of civil rights, said in a statement that the company will use machine learning technology to try to distribute housing-related ads more fairly, considering the age, gender and likely race of the users who see them, regardless of how marketers target their ads.
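
Neither the settlement nor the company’s statement details the mechanics, but the underlying idea is measurable: compare the demographic mix of users who actually see an ad with the mix of the advertiser’s eligible audience, and flag the skew. The Python sketch below is a hypothetical illustration of that comparison, not Meta’s actual system; all names and numbers are invented.

```python
from collections import Counter

def delivery_skew(eligible_audience, actual_viewers, attribute):
    """Compare who actually saw an ad against everyone the advertiser's
    targeting made eligible. Returns, per group, the ratio of its share
    among viewers to its share among the eligible audience (1.0 = no skew).
    """
    eligible = Counter(person[attribute] for person in eligible_audience)
    viewers = Counter(person[attribute] for person in actual_viewers)
    ratios = {}
    for group, count in eligible.items():
        eligible_share = count / len(eligible_audience)
        viewer_share = viewers.get(group, 0) / max(len(actual_viewers), 1)
        ratios[group] = viewer_share / eligible_share
    return ratios

# Hypothetical: the eligible audience is split 50/50, but delivery skewed male.
eligible = [{"sex": "male"}] * 500 + [{"sex": "female"}] * 500
viewers = [{"sex": "male"}] * 80 + [{"sex": "female"}] * 20
print(delivery_skew(eligible, viewers, "sex"))
# {'male': 1.6, 'female': 0.4}: women see the ad at 40% of parity
```

A corrective system would then presumably throttle or boost delivery until those ratios approach 1.0, the “fairer mix of the population” the settlement describes.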

“Discrimination in housing, employment and credit is a long-standing and deep-rooted problem in America, and we are committed to expanding opportunities for marginalized communities in these fields and beyond,” Austin said in the statement. “This is unprecedented in the advertising industry and represents a major technological advance in how machine learning can be used to deliver personalized ads.”

Federal law prohibits housing discrimination based on race, religion, national origin, gender, disability or familial status.

The deal comes after Facebook faced a series of legal complaints from the Justice Department, state attorneys general and civil rights groups, who argued that its algorithm-based marketing tools gave advertisers a unique ability to discriminate against minorities and other vulnerable groups in housing, credit and employment.

In 2019, Facebook agreed to stop allowing advertisers to use gender, age and zip code (often a proxy for race) to market housing, credit and job openings to its users. That change came after an investigation by the Washington state attorney general and a ProPublica report found that Facebook allowed advertisers to use its micro-targeting tools to hide housing ads from African American users and other minorities. Facebook subsequently said it would no longer allow advertisers to use its “ethnic affinity” category for housing, credit and employment ads.
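
The parenthetical about zip codes refers to the proxy problem: in a residentially segregated market, geography alone can encode race even when race is removed as an explicit option. A minimal sketch with invented demographics shows how a zip-code exclusion can function as a racial exclusion:

```python
# Hypothetical demographics: in a segregated market, zip code alone
# predicts the protected attribute, so excluding by zip code can
# quietly exclude by race. Populations are assumed equal per zip.
zip_demographics = {
    "10001": {"white": 0.75, "black": 0.10},
    "10027": {"white": 0.20, "black": 0.60},
}

def excluded_share(excluded_zips, group):
    """Fraction of a group's total presence that a zip exclusion removes."""
    total = sum(z[group] for z in zip_demographics.values())
    hit = sum(zip_demographics[z][group] for z in excluded_zips)
    return hit / total

# Excluding a single zip code removes ~86% of one group's exposure here,
# without the advertiser ever touching a race checkbox.
print(excluded_share(["10027"], "black"))  # 0.60 / 0.70 ≈ 0.857
```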

But since the company agreed to those settlements, researchers have found that Facebook’s systems can continue to discriminate even when advertisers are barred from checking specific gender, race or age boxes. In some cases, its software detected that people of a certain race or gender frequently clicked on specific ads, and then began to reinforce those biases by serving the ads to so-called “lookalike audiences,” said Peter Romer-Friedman, a principal at the law firm Gupta Wessler PLLC.
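
What Romer-Friedman describes is a feedback loop: delivery decisions generate click data, and the click data shapes the next round of delivery. The toy simulation below (invented numbers, not Meta’s algorithm) shows how a tiny difference in click rates can snowball into a heavily skewed audience:

```python
import random

random.seed(0)

# Toy feedback loop: the system shows an ad, observes clicks, then
# weights the next round of delivery toward groups that clicked more.
groups = ["men", "women"]
weights = {"men": 0.5, "women": 0.5}       # start with even delivery
click_rate = {"men": 0.06, "women": 0.05}  # nearly identical true interest

for step in range(20):
    # Deliver 1,000 impressions in proportion to current weights.
    shown = {g: int(1000 * weights[g]) for g in groups}
    clicks = {g: sum(random.random() < click_rate[g] for _ in range(shown[g]))
              for g in groups}
    # Lookalike-style update: chase whichever group clicked more.
    total = sum(clicks.values()) or 1
    for g in groups:
        weights[g] = 0.8 * weights[g] + 0.2 * (clicks[g] / total)

print(weights)  # delivery has drifted well away from 50/50
```

The advertiser never selects a gender, yet small initial differences compound until one group dominates delivery; this is the dynamic the settlement’s algorithm changes are intended to counteract.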

The result could be that a given housing ad is shown only to men, even if the advertiser wasn’t specifically trying to reach men, said Romer-Friedman, who has filed several civil rights lawsuits against the company, including one that led to a 2018 settlement in which the company agreed to limit its ad-targeting categories.

Romer-Friedman said the settlement is a “huge achievement” because it is the first time a platform has been willing to make major changes to its algorithms in response to a civil rights lawsuit.
