Facebook’s Gender Bias
Produced by RTS

VO

00:00:01,772 --> 00:00:26,036

-Finding job offers has never been easier

on the internet. In a few clicks you have hundreds of instant and personalized

ads. Thanks to their algorithms, Facebook and Google are widely used in

the recruitment world. Their belief is that machines are more efficient than humans,

but people are already speaking out.

 

ISABELLE COLLET

00:00:26,061 --> 00:00:37,990

-We're caught in a tech myth: we think that because it's an algorithm, it's objective.

Are ads objective? Not at all! Since it's men and women who develop the software and the algorithms,

they're not objective.

 

VO

00:00:38,373 --> 00:01:46,506

-The proof, in images. Look at this test: a man and a woman

run an identical job search within 5 km. Neither specified their gender, experience, or age,

yet the results are nothing alike. For her: secretary, babysitter,

and part-time work. For him: professor, team leader, mostly full-time. Quite a sexist result, and the surprises

don't end there. Our experiment began in Berlin a few weeks ago, near the tourist hub of Alexanderplatz. Nicolas Kayser-Bril works for an NGO that monitors algorithms. For us, he posed as a recruiter and posted real job ads on Google and Facebook: lawyer, nurse, driver. He wanted to check

who actually gets shown these ads.

 

NICOLAS KAYSER-BRIL

00:01:46,951 --> 00:02:26,746

-Some advertisements were targeted only at men, like the one for the drivers.

Others were aimed specifically at women, like the nursing and teaching jobs. The discrimination takes place

before anyone even clicks on an advertisement. The optimization is done according to unknown criteria,

before the ad is ever shown.

 

VO

00:02:28,629 --> 00:02:42,381

-Facebook's algorithm has never been made public, so it's impossible to understand how

it makes its choices. In Berlin, the team has taken

the experiment one step further.

 

NICOLAS KAYSER-BRIL

00:02:42,997 --> 00:03:09,545

-We changed the photos, and we saw that Facebook uses the pictures in the ads to

discriminate based on stereotypes. If we run an ad for drivers with a picture of cosmetics,

the ad will only be shown to women. If you specify that you want female drivers

and put in a picture of heavy trucks, it will still be shown only to men.

 

VO

00:03:10,864 --> 00:03:42,567

-This raises questions, since hiring discrimination is prohibited by law in Switzerland.

So are algorithms sexist, or neutral? Facebook and Google didn't reply

to our interview requests. Far from Silicon Valley, Isabelle Collet is the bête noire of the GAFAM.

This computer scientist tracks the gender discrimination created by digital technology.

She followed the case of Amazon, where the giant's attempt at recruitment algorithms fell flat.

 

ISABELLE COLLET

00:03:44,379 --> 00:04:20,357

-Amazon wanted to automate its recruitment. The idea was that the top five résumés

would be hired directly. The majority were men. People figured that the algorithm

was sexist, but no: Amazon's HR was sexist. The algorithm replicated

what had always been done: for a promotion or a good salary, it was better to be a man.

It wasn't the algorithm they needed to re-evaluate, but how they managed human resources.
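To make the mechanism concrete, here is a minimal sketch of how a model trained on biased historical hiring decisions simply reproduces that bias. The records and the frequency "model" below are hypothetical illustrations, not Amazon's actual system, whose details were never published.

# Hypothetical historical hiring records: (gender, hired).
# The labels encode past human bias: equally qualified women
# were hired far less often than men.
history = ([("M", True)] * 80 + [("F", True)] * 20 +
           [("M", False)] * 20 + [("F", False)] * 80)

def hire_rate(gender):
    """Naive 'model': estimate P(hired | gender) from the records."""
    outcomes = [hired for g, hired in history if g == gender]
    return sum(outcomes) / len(outcomes)

print(f"learned hire rate, men:   {hire_rate('M'):.0%}")  # 80%
print(f"learned hire rate, women: {hire_rate('F'):.0%}")  # 20%
# All candidates here are equally qualified, yet the model scores men
# four times higher: it has learned the recruiters' bias, not merit.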

 

VO

00:04:21,638 --> 00:04:53,261

-What about Switzerland? We're not yet at the stage of selecting CVs

or recruiting through algorithms. This leading temporary-staffing company

in Switzerland gets 200,000 applications per year.

They're thinking about artificial intelligence, but they are also wary of it.

Gregory Papin brought AI into the process to speed things up for candidates.

 

GREGORY PAPIN

00:04:53,910 --> 00:05:18,293

-When a person applies, we just ask them for a CV, and a robot

performs semantic analysis to extract their contact details. AI helps fill out a form,

but the candidate keeps control of that document. It would be difficult

to trust AI 100% with the processing of these applications.
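As an illustration only, since the company's actual parser is not described, the contact-detail step can be approximated with simple pattern matching over the CV text. The two regular expressions below are hypothetical stand-ins for a real semantic analysis:

import re

# Minimal extractor: pull an email address and a Swiss-style phone
# number out of raw CV text. A production parser would be far more
# robust than these two patterns.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"(?:\+41|0)\s?\d{2}(?:\s?\d{3})(?:\s?\d{2}){2}")

def extract_contact(cv_text):
    email = EMAIL.search(cv_text)
    phone = PHONE.search(cv_text)
    return {"email": email.group(0) if email else None,
            "phone": phone.group(0) if phone else None}

cv = "Jane Doe\nGeneva\njane.doe@example.com\n+41 22 123 45 67"
print(extract_contact(cv))
# {'email': 'jane.doe@example.com', 'phone': '+41 22 123 45 67'}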

 

VO

00:05:20,017 --> 00:05:32,410

-At the group's headquarters, they're already thinking about the next step:

cross-referencing CVs and job offers to suggest targeted positions to candidates. But the ethical issues

are never far away.

 

GREGORY PAPIN

00:05:33,261 --> 00:05:54,998

-We'd have an ethical problem with processing candidates automatically. We need

our consultants' point of view. Our position today is not to exclude new technologies,

but you can't view everything through the lens of artificial intelligence.

We will always need humans in the recruitment process.

 

VO

00:05:56,260 --> 00:06:14,510

A few kilometres away, others are already much further along with AI. In Martigny, at this well-known

research institute, researchers are working on an algorithm designed to analyze job interviews.

These men are the masterminds of the project.

 

JEAN-MARC ODOBEZ

00:06:15,400 --> 00:06:18,951

-We have examples of job interviews from which our algorithms can learn

to recognize how emotions are expressed.

 

VO

00:06:23,858 --> 00:06:33,607

-Fear, joy, surprise: the aim is to help recruiters detect when a candidate

is faking enthusiasm, or when their speech is not consistent with their emotions.

The slightest movement of the facial muscles is detected.

 

JEAN-MARC ODOBEZ

00:06:44,101 --> 00:07:16,287

-For example, what we look for on the face is raised eyebrows, characteristic of surprise.

An open mouth too, but that's rare when we're talking, so the eyebrows are the important cue to watch.

For sadness, the corners of the lips drop. For joy, it's the other way around.

For anger, it's the twitch of the eyebrows.
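The cues Odobez lists map naturally onto a small rule-based sketch. The feature names below (eyebrow_raise, lip_corner_delta, brow_furrow) are hypothetical placeholders for the outputs of a face-landmark tracker, not the institute's actual model:

# Rule-based reading of the facial cues described above. Inputs are
# plain numbers in arbitrary units relative to the person's neutral
# face: positive values mean upward movement.

def read_expression(eyebrow_raise, lip_corner_delta, brow_furrow):
    """Map coarse facial-muscle movements to the emotions named above."""
    if eyebrow_raise > 0.5:       # raised eyebrows -> surprise
        return "surprise"
    if brow_furrow > 0.5:         # knitted, twitching brows -> anger
        return "anger"
    if lip_corner_delta < -0.5:   # lip corners falling -> sadness
        return "sadness"
    if lip_corner_delta > 0.5:    # lip corners rising -> joy
        return "joy"
    return "neutral"

print(read_expression(0.8, 0.0, 0.0))   # surprise
print(read_expression(0.0, -0.7, 0.0))  # sadness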

 

VO

00:07:17,580 --> 00:07:20,780

-The machine is meant to be impartial. But gender or age biases

can still creep in.

 

JEAN-MARC ODOBEZ

00:07:28,199 --> 00:07:59,622

-An elderly person will have wrinkles that could be interpreted as expressions, even though

the person isn't expressing that emotion. Age must be included in the data so the system takes it into account.

Just as it doesn't replace the CV, it won't do away with the interview. It can be a tool, but

it must remain only a tool. Behind the tools, humans make the decisions.

 

VO

00:08:01,002 --> 00:08:05,556

Correcting the algorithms and neutralizing their gender bias: it is possible.

In this high-security biometrics room, that is one of Sébastien Marcel's duties. In this lab,

they train algorithms to tell the real from the fake, notably with the help of

these silicone masks. To correct gender bias, you have to get to the root of it:

the data the algorithms are taught with.

 

SEBASTIEN MARCEL

00:08:34,061 --> 00:08:37,569

-The reason some of these algorithms are biased is that they've been trained

on biased databases. We try to correct the data, by matching the number of men and women,

for example, but that's not always possible. Often the systems have already been built,

and we no longer have access to the original data.
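The correction Marcel describes, equalizing the number of men and women in the training set, can be sketched in a few lines. This is a minimal illustration assuming a labeled dataset; real debiasing work is considerably more involved:

import random
from collections import defaultdict

def balance_by_gender(samples, seed=0):
    """Downsample each gender group to the size of the smallest one,
    so the training set holds as many men as women."""
    groups = defaultdict(list)
    for sample in samples:
        groups[sample["gender"]].append(sample)
    smallest = min(len(group) for group in groups.values())
    rng = random.Random(seed)
    balanced = []
    for group in groups.values():
        balanced.extend(rng.sample(group, smallest))
    rng.shuffle(balanced)
    return balanced

# Hypothetical face dataset: 300 male samples, 100 female samples.
data = ([{"gender": "M", "img": f"m{i}.png"} for i in range(300)] +
        [{"gender": "F", "img": f"f{i}.png"} for i in range(100)])
print(len(balance_by_gender(data)))  # 200 -- 100 of each gender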

 

VO

00:09:04,654 --> 00:09:27,183

-The second option is to break the codes, or rather to change the people who write them. Luana Braga

studies code and algorithms at the University of Applied Sciences

in Geneva. In 2020, she is still an exception in a very male universe.

 

LUANA BRAGA

00:09:28,813 --> 00:09:36,935

-On the first day, I counted: there were four of us out of 65. In my class,

we're two girls out of 14. I'm used to it, but I wish I had more women by my side.

 

VO

00:09:58,179 --> 00:10:01,535

-Tonight's project: creating a game for learning to write command lines. Luana is holding on because she thinks

that more diversity could change the way algorithms govern society.

 

LUANA BRAGA

00:10:14,090 --> 00:10:58,911

-Around 90% are white European men, and they're bound to program

things for people like themselves. I don't think it's deliberate. The ones who are here

do as they see fit. For me, it's a strategy: I looked for the areas where I could stand out. In other fields,

it's very difficult. Here, it's the other way around.

 

VO

00:10:59,978 --> 00:11:04,218

-In the meantime, it's up to recruiters to be extremely careful. Charlène Kurer and her associate

are aware of this. They need to recruit assistants for their wellness centre.

 

CHARLENE KURER

00:11:20,778 --> 00:11:25,538

-There will be seats

where we can lie down.

 

VO

00:11:25,686 --> 00:11:27,726

-To find the perfect assistant, she chose to place her ad only on Facebook.

 

CHARLENE KURER

00:11:31,667 --> 00:11:33,867

-What worked well was the location targeting. I'm looking for someone dynamic,

available 10 to 40% of the time. We received 40 applications.

 

VO

00:11:48,307 --> 00:11:50,827

-A gender-neutral ad and a neutral image: here, they've paid attention to every detail,

hoping that the algorithm does not bias the recruitment.

 

CHARLENE KURER

00:11:57,991 --> 00:12:00,751

-We'd like men as well as women. It would be a problem if

the algorithm chose criteria for us that we didn't set.

 

VO

00:12:16,273 --> 00:12:39,047

-The effort paid off for the recruiters, as their ad attracted

as many men as women. Facebook and Google still haven't answered us,

and have not announced any correction of their algorithms.

Our questions have nonetheless reached them,

since Google has removed the majority of our ads,

even though they were paid for and still valid.

 

© 2024 Journeyman Pictures
