SAN FRANCISCO – Meta agreed in a settlement with the Justice Department on Tuesday to change its ad-targeting technology and pay a $115,054 penalty. The department had claimed that the company's advertising systems let advertisers discriminate in housing by restricting who could see housing ads on the platform based on their race, gender and ZIP code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method intended to regularly check whether the audiences that are targeted and eligible to receive housing ads are, in fact, seeing those ads. The new method, referred to as a “variance reduction system,” relies on machine learning to ensure that advertisers are delivering housing-related ads to specific protected classes of people.
Meta also said it would no longer use a feature called “Special Ad Audiences,” a tool designed to help advertisers expand the groups of people their ads would reach. The company said the tool was an early effort to fight bias, and that its new methods would be more effective.
“We’re going to be occasionally taking a snapshot of marketers’ audiences, seeing who they target, and removing as much variance as we can from that audience,” Roy L. Austin, Meta’s vice president of civil rights and a deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”
Facebook, which became a business colossus by collecting its users’ data and letting advertisers target ads based on an audience’s characteristics, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems have let marketers choose who saw their ads by using thousands of different characteristics, which has also let those advertisers exclude people who fall under a number of protected categories.
While Tuesday’s settlement concerns housing ads, Meta said it also planned to apply its new system to check the targeting of employment and credit-related ads. The company has previously been criticized for allowing bias against women in job ads and for excluding certain groups of people from seeing credit card ads.
“Because of this important lawsuit, Meta will, for the first time, change its ad delivery system to address algorithmic discrimination,” Damian Williams, a U.S. attorney, said in a statement. “But if Meta fails to demonstrate that it has sufficiently changed its delivery system to guard against algorithmic bias, this office will proceed with the litigation.”
The question of biased ad targeting has been debated especially in housing ads. In 2018, Ben Carson, who was then the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated” based on categories such as race, religion and disability. A 2016 investigation by ProPublica had also revealed that Facebook’s ad technology made it easy for marketers to exclude specific ethnic groups for advertising purposes.
In 2019, HUD sued Facebook for engaging in housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems did not deliver ads to “a diverse audience,” even when an advertiser wanted the ad to be seen broadly.
“Facebook is discriminating against people based upon who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”
The HUD suit came amid a broader push from civil rights groups claiming that the vast and complicated advertising systems that underpin some of the largest internet platforms have inherent biases built into them, and that tech companies like Meta, Google and others should do more to rein in those biases.
The area of study, known as “algorithmic fairness,” has been a significant topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists like Timnit Gebru and Margaret Mitchell, have sounded the alarm on such biases for years.
In the years since, Facebook has clamped down on the types of categories that marketers can choose from when purchasing housing ads, reducing the number to hundreds and eliminating options to target based on race, age and ZIP code.
Meta’s new system, which is still in development, will occasionally check on who is being served ads for housing, employment and credit, and compare those audiences with the audiences marketers intended to target. If the ads being served skew heavily toward white men in their 20s, for example, the new system will theoretically recognize this and shift the ads to be served more equitably among broader and more varied audiences.
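The periodic check described above can be illustrated with a small sketch. This is not Meta's actual implementation (which has not been published); it is a hypothetical example of the general idea: compare the demographic makeup of the audience an ad was actually served to against the makeup of the eligible, targeted audience, and flag the delivery for rebalancing when any group's share drifts too far. The function names and the 5% tolerance are assumptions made for illustration.

```python
from collections import Counter

def demographic_gaps(eligible, served):
    """Return, per demographic group, the difference between that group's
    share of the eligible (targeted) audience and its share of the
    audience the ad was actually served to. `eligible` and `served` are
    lists of group labels, one per person."""
    def shares(labels):
        total = len(labels)
        return {group: count / total for group, count in Counter(labels).items()}

    e, s = shares(eligible), shares(served)
    # A positive gap means the group is under-served relative to eligibility.
    return {g: e.get(g, 0.0) - s.get(g, 0.0) for g in set(e) | set(s)}

def needs_rebalancing(eligible, served, tolerance=0.05):
    """Flag delivery for adjustment when any group's served share deviates
    from its eligible share by more than `tolerance` (an assumed threshold)."""
    return any(abs(gap) > tolerance
               for gap in demographic_gaps(eligible, served).values())
```

For instance, if the eligible audience is split 50/50 between two groups but 80% of impressions went to one of them, `needs_rebalancing` returns `True`, and a delivery system built this way would then shift subsequent impressions toward the under-served group.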
Meta said it would work with HUD over the coming months to incorporate the technology into its ad-targeting systems, and it agreed to a third-party audit of the new system’s effectiveness.
The Justice Department said the fine Meta is paying in the settlement is the maximum available under the Fair Housing Act.