Facebook & Gender Discrimination in the Distribution of Job Ads

Years after the company first promised to take action beyond the US and Canada, Facebook-parent Meta is now facing four new complaints from human rights organizations in Europe alleging that the algorithm it uses to target users with job advertisements is discriminatory.

The accusations are supported by research from the international nonprofit organization Global Witness, which the group says demonstrates that Facebook’s ad platform frequently targets users with job postings based on traditional gender stereotypes. According to data Global Witness gathered from Facebook’s Ad Manager platform, job advertisements for mechanic positions, for instance, were shown predominantly to male users, while ads for preschool teachers were shown predominantly to female users.

Global Witness says that additional research, shared exclusively with a news network, indicates that this algorithmic bias is a global problem.

According to Naomi Hirst, who oversees Global Witness’ campaign strategy on digital threats to democracy, “Our concern is that Facebook is exacerbating the biases that we live within society and thwarting opportunities for progress and equity in the workplace.”

Based on their investigation in both countries, Global Witness, Bureau Clara Wichmann, and Fondation des Femmes filed complaints against Meta (FB) with the French and Dutch data protection authorities and human rights agencies on Monday. The organizations are asking the agencies to investigate whether Meta’s practices violate the countries’ data protection or human rights laws. If any of the agencies upholds the accusations, Meta could eventually face fines, sanctions, or demands to make further changes to its product.

Global Witness has previously filed complaints over similar discrimination issues with the UK Equality and Human Rights Commission and the Information Commissioner’s Office; those complaints are still being looked into. Global Witness reported that at the time, a representative for Meta, then still known as Facebook, claimed that the company’s “system takes into account different kinds of information to try and serve people ads they will be most interested in,” and that it was “exploring expanding limitations on targeting options for job, housing, and credit ads to other regions beyond the US and Canada.”

The complaints from Europe also echo one that a women’s trucking organization, Real Women in Trucking, made to the US Equal Employment Opportunity Commission in December, alleging that Facebook discriminates against users based on their age and gender when choosing who gets to see job advertisements. Meta declined to comment to news outlets on the Real Women in Trucking complaint.

According to a statement from Meta spokesperson Ashley Settle, Meta places “targeting restrictions on advertisers when setting up campaigns for employment, as well as housing and credit ads, and we offer transparency about these ads in our Ad Library.”

Settle said in the statement, “We do not permit advertisers to target these ads based on gender. We continue to work with stakeholders and experts across academia, human rights groups, and other disciplines on the best approaches to research and address algorithmic fairness.”

Meta did not specifically comment on the recent complaints filed in Europe. The company also did not say in which countries it now restricts targeting options for job, housing, and credit ads.

Not getting hired because of your gender - Facebook

Over the past ten years, Facebook has faced accusations of discrimination in several areas, including how it distributes job listings. In 2019, the platform committed to making changes to prevent biased delivery of housing, credit, and employment ads based on protected characteristics, such as gender and race. This was part of an agreement to settle numerous lawsuits in the United States.

Human rights organizations say that efforts to address those disparities included removing the option for employers to target job ads based on gender, but this most recent research indicates that Facebook’s algorithm may be undermining that change.

The groups assert that as a result, numerous users might not be seeing open positions for which they might be qualified simply because of their gender. They worry that this might make pay disparities and historical workplace inequities worse.

According to Linde Bryk, head of strategic litigation at Bureau Clara Wichmann, “You cannot escape big tech anymore; it is here to stay and we have to see how it impacts women’s rights and the rights of minority groups.” “As a corporation, it’s too simple to just disappear behind the algorithm, but if you put something on the market, you should also have some level of control over it,” Bryk said.

Global Witness carried out additional studies in four different nations, including India, South Africa, and Ireland, and claims the findings demonstrate that the algorithm exacerbated similar biases everywhere.

Facebook can be a valuable resource for users looking for employment because it has more than 2 billion daily active users around the world.

The platform’s business model depends on its algorithm carefully targeting advertisements at the users it believes are most likely to click on them, so that ad buyers see a return on their investment. But according to Global Witness’ research, this leads to users being targeted with job advertisements based on gender stereotypes. In some instances, human rights advocates say, the biases apparently displayed by Facebook’s ad system may even make other disparities worse.

The people most impacted by Facebook’s alleged algorithmic biases may be those already in marginalized positions, according to Caroline Leroy-Blanvillain, a lawyer and member of the legal force steering committee at Fondation des Femmes. In France, for instance, Facebook is frequently used for job searches by people of lower income levels.

The leader of Amnesty International’s big tech accountability team, Pat de Brún, said the findings of Global Witness’ research did not necessarily come as a surprise. According to de Brún, “research repeatedly demonstrates how Facebook’s algorithms produce profoundly unequal outcomes and frequently reinforce marginalization and discrimination. And what we observe is the replication and amplification of some of society’s negative traits.”

The algorithms give the impression of being neutral, he said, but they frequently reproduce biases, hide them, and make them more difficult to challenge.

Gender discrimination

Global Witness placed several job postings in France and the Netherlands for two days at a time between February and April to carry out the experiments mentioned in the complaints. The researchers chose positions, such as preschool teacher, psychologist, pilot, and mechanic, that are customarily associated with gender stereotypes, and the advertisements linked to actual job postings discovered on employment websites.

The ads were directed at adult Facebook users of any gender who were either residents of the selected countries or had recently traveled there. The researchers asked that the ads “maximize the number of link clicks,” but other than that, they deferred to Facebook’s algorithm for deciding who saw the ads in the end.
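
For illustration only, the sketch below represents the campaign settings described above as a plain Python structure. The field names are assumptions chosen for readability, not Meta’s actual Ads API parameters.

```python
# Illustrative sketch (not Meta's real Ads API schema) of the ad-set settings
# the researchers describe: adults of any gender, people living in or recently
# in the target country, delivery optimized for link clicks, run for two days.
experiment_ad_set = {
    "ad_name": "Mechanic vacancy (links to a real posting on a jobs site)",
    "countries": ["NL"],                   # or ["FR"] for the French tests
    "location_scope": ["residents", "recent_visitors"],
    "age_min": 18,                         # adult users only
    "age_max": None,                       # no upper limit
    "genders": "all",                      # no gender filter set by the advertiser
    "optimization_goal": "maximize_link_clicks",
    "duration_days": 2,
}

# Everything beyond these constraints is left to Facebook's delivery algorithm,
# which decides who actually sees the ad.
```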

An analysis of the information provided by Facebook’s ad manager platform revealed that the ads were frequently displayed to users in a highly gendered way.

One of the complaints from the Netherlands reads, “Just because advertisers can’t select it, doesn’t mean that the ‘gender’ [category] doesn’t weigh in the process of showing ads at all.”

According to Facebook’s ad manager platform, women made up just 25% of users who saw a pilot job ad and 6% of users who saw a mechanic job ad in France, while women made up 93% of users who saw a preschool teacher job ad and 86% of users who saw a psychologist job ad.

Similarly, according to Facebook’s data, in the Netherlands, 85% of users who saw teacher job ads and 96% of users who saw receptionist job ads were women, but only 4% of users who saw mechanic job ads were women. Other roles were less skewed: 38% of the users shown a package delivery job ad in the Netherlands, for instance, were women.
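
As a rough sketch of the arithmetic behind these percentages, the snippet below computes the share of an ad’s audience that was female from a per-gender impression breakdown of the kind Ad Manager reports. The counts used are hypothetical, not figures from the study.

```python
# Hypothetical example of computing gender skew from a per-gender impression
# breakdown; the counts below are made up, not data from the Global Witness study.

def female_share(impressions_by_gender: dict) -> float:
    """Return the percentage of an ad's impressions delivered to women."""
    total = sum(impressions_by_gender.values())
    return 100 * impressions_by_gender.get("female", 0) / total if total else 0.0

example = {"female": 960, "male": 40, "unknown": 0}  # hypothetical delivery data
print(f"{female_share(example):.0f}% of the users who saw this ad were women")
```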

The outcomes were consistent with what Global Witness discovered in the UK, where men were overwhelmingly shown job advertisements for mechanic and pilot positions, while women were more frequently shown advertisements for psychologist and nursery teacher positions.

The pursuit of algorithmic transparency

Five discrimination lawsuits and charges were filed against Facebook between November 2016 and September 2018 by US civil rights and labor organizations, employees, and individuals. These lawsuits claimed that the company’s ad systems prevented some people from seeing housing, employment, and credit ads based on their age, gender, or race.

The legal actions came after a flurry of critical coverage of Facebook’s advertising systems, including a 2018 ProPublica investigation that found Facebook was facilitating the spread of discriminatory advertisements by allowing employers to use its platform to target only one sex with job ads. According to the report, some businesses advertised trucking or police jobs exclusively to men, while others advertised nursing or medical assistant jobs exclusively to women. (A Facebook representative responded to the report at the time, stating that discrimination is “strictly prohibited in its policies” and that the company would “defend our practices.”)

Facebook agreed to pay almost $5 million to resolve the lawsuits in March 2019. The company also said it would introduce a separate, less targeted ad portal on Facebook, Instagram, and Messenger for housing, employment, and credit ads.

Peter Romer-Friedman, a civil rights lawyer, participated in the 2019 Facebook settlement talks as a member of the negotiating team. He recalled that at the time, he and others expressed concern that, despite Facebook’s promised changes being a step in the right direction, the platform’s algorithm might still reproduce the same bias issues.

Real Women in Trucking filed an EEOC complaint, but Meta declined to comment on it because agency filings are confidential.

The human rights agencies may decide to take action based on Global Witness’ and its partners’ findings, which could put pressure on Meta to change its algorithm, increase transparency, and prevent further discrimination. If the countries’ data protection authorities decide to look into the matter and ultimately determine that Meta violated the EU’s General Data Protection Regulation, which forbids the discriminatory use of user data, they may impose hefty fines on the company.

About WR News Writer

WR News Writer is an engineer turned professionally trained writer who has a strong voice in her writing. She speaks on issues of migrant workers, human rights, and more.
