POST PRODUCTION SCRIPT
Foreign Correspondent 2021
American Deepfake
29 mins 28 secs
©2021 ABC
Ultimo Centre, 700 Harris Street, Ultimo NSW 2007 Australia
GPO Box 9994, Sydney NSW 2001 Australia
Phone: 61 419 231 533
Precis
|
A new generation
of "deepfake" videos has got Hollywood excited ... and Washington
worried. They’ve got the potential to change reality
as we know it. Deepfakes are
synthetic media created by Artificial Intelligence - technology that can make
real people say and do things they never actually did. This week,
reporter Hamish Macdonald does a deep dive into an emerging technology with
explosive potential. Until recently,
deepfake videos were mostly a staple of the dark web, their subject matter
mainly porn. Now the
technology is coming out of the shadows and into the mainstream. In Bangkok, we
meet a video effects expert with a reputation as one of the world's best
deepfakers, who recently created a Tom Cruise TikTok video - that wasn’t Tom Cruise. "I see the
creative possibilities," deepfaker Chris tells Foreign Correspondent.
"I think we’re coming to a point where everything is synthetic, especially
in the movie industry." "I think
it's a good thing I created these videos, because now I'm raising awareness
and (people) realise… this is real… it's coming!" TikTok Tom was
so convincing, he beat nearly all the latest deepfake detection technology,
racking up millions of online views and becoming a global AI sensation. But some US
security experts see him as a harbinger of dangers to come, a technology that
can now be weaponised; fake politicians declaring war...fake CEOs triggering
the collapse of financial markets. In Washington,
we meet a former CIA officer, now a lawyer advising corporate America on how
to combat the rising tide of digital disinformation. "I think
that we've only really started to scratch the surface of the bad things that
can happen because of deepfakes," says Matt. "In the national
security context, there’s no end to the nightmare scenario." Recently the FBI
issued an unprecedented warning to business and financial markets, declaring
that it was "…almost certain" that in the next 12 – 18 months,
(they) would be the victims of synthetic media (attacks)." "I think
deepfakes play into the hands of anybody, any state sponsor, any institution
that wants to create confusion or deceive," says Mounir, a former US
diplomat, once based in Syria, where he attempted to visually distil fact
from fiction. So will TikTok
Tom, born of neural networks and machine learning, be harnessed
and regulated to entertain us, or will he usher in a real-world version of
Hollywood's "dystopian future"? Stay tuned. |
|
AV
Montage. Fake Tom Cruise |
FAKE
TOM CRUISE: "My fellow Americans, you deserve the truth and I know you
can handle it! |
00:00 |
'Concession'
excerpt |
I
won’t be your next president. That point has been
made crystal clear." |
00:06 |
Fake
Obama |
FAKE
OBAMA: "We're entering an era in which our enemies can make it look like
anyone is saying anything at any point in time. Like President Trump is a
total and complete dip-shit." |
00:12 |
Title
over Washington GVs: |
Music
|
00:21 |
Fake
Jeremy in Washington/Deepfakes montage |
FAKE
JEREMY FERNANDEZ: Hello, welcome to Foreign Correspondent, I’m
Jeremy Fernandez. Well, it took a global pandemic to force us all indoors,
and to increasingly rely on video screens to connect our lives. But, is seeing still believing? This program delves into
the world of Deepfakes. |
00:30 |
|
NINA
SCHICK: A deepfake is essentially a piece of synthetic or fake media that's either been generated entirely by artificial
intelligence or manipulated by artificial intelligence. |
00:50 |
David
Beckham |
DAVID
BECKHAM: "Malaria isn’t just any disease…" |
01:03 |
Nina
interview |
NINA
SCHICK: Which by the way, includes fake video of real people saying and doing
things that they never did. |
01:08 |
Fake
Tom Cruise in desert |
FAKE
TOM CRUISE: "This is no stunt." |
01:14 |
Fake
Jeremy |
JEREMY
AVATAR: Hollywood is delighted by the movie-making potential of Deepfakes,
but in Washington, the tech is increasingly viewed as dangerous and
subversive. |
01:17 |
Fake
Putin |
FAKE
PUTIN: "You'll blame me for
interfering with your democracy." |
01:26 |
Deepfakes
montage |
MOUNIR
IBRAHIM: Deepfakes plays into the hands of anybody, any state sponsor, any
institution that wants to create confusion or deceive. |
01:29 |
|
MATT
FERRARO: The Deepfakes are a fundamental threat to democracy and to any
civilisation that relies on the truth. Deepfakes could very well undermine
our sense of reality. |
01:40 |
|
FAKE
JEREMY FERNANDEZ: So, are we entering a world where artificial intelligence
will distort our sense of reality? |
01:50 |
Fake
Jeremy to camera |
Perhaps,
as I have a confession to make. I am actually a Deepfake,
and you’re now watching my computer-generated avatar that Hamish Macdonald
and I have created. |
01:56 |
Hamish
at keyboard creating Jeremy avatar |
What’s
extraordinary is that almost anyone can now do this; all you need is a credit
card and internet access. |
02:09 |
Jeremy
joins Hamish at computer and they look at the Jeremy deepfake |
Jeremy:
"Show me what you’ve got." Hamish:
"It’s time to see something that will probably blow your mind a little
bit." Jeremy: "A Jeremy Fernandez
deepfake." Hamish:
"It might also mean the end of your career… |
02:17 |
|
FAKE
JEREMY: "I'm Jeremy Fernandez. Tonight's program delves into the world
of deepfakes, also called synthetic media." |
02:26 |
|
Jeremy:
"What do you think? Hamish:
"It’s pretty good!" Jeremy:
"I have to say, I’m not entirely convinced. |
02:32 |
|
But
I think you put the three elements together of the location, the face, the
voice, it makes a reasonably compelling case. FAKE
JEREMY FERNANDEZ: "Can we still say that seeing is believing?" Jeremy:
"The thing is, I’ve never said those words to camera
before; this has all been generated by the program." |
02:37 |
Synthesia
website video |
Music
|
02:54 |
|
HAMISH
MACDONALD, Presenter: The system was
developed by two companies, Synthesia and Descript. |
02:57 |
Deepfake
Jeremy creation |
We’ve
created a Deepfake Jeremy by providing the program with video and audio
samples, but you can simply pick your avatar, language, type in the words,
wait a few minutes for processing, and there it is. FAKE
JEREMY: Hola… [in Spanish] |
03:02 |
Hamish
to camera. Super: |
HAMISH
MACDONALD, Presenter: This is all
pitched as a corporate training tool, but there are some restrictions. The
developers prohibit you from creating any content that’s
discriminatory, political or sexually offensive. So, there’s
a limit to what we can do with this Jeremy avatar. I think it all looks pretty impressive, but it actually pales in comparison to
what’s coming. In the United States, some security experts say they
are terrified by the next generation of deepfakes. |
03:20 |
AV
montage. Deepfakes |
Music
|
03:49 |
Bangkok.
Night/Digital billboards |
CHRIS
UME: I like living the life of a digital nomad. I never imagined that I would
be a digital nomad, but somehow I became one. And it's pretty cool. |
04:19 |
Chris
walks. Bangkok, night. |
HAMISH
MACDONALD, Presenter: Chris Ume, video
effects designer, is said to be one of the world’s best deepfakers.
His computer servers are in Belgium. He worked for the US-based animation
team that produces South Park. Now he’s riding out
the pandemic here in Bangkok. CHRIS
UME: I did a few months of research, how to do it, and a half year later I
had my first deepfake, |
04:32 |
Chris
100% |
and
then it evolved rather quickly. Because I saw the potential and I just
started creating crazy things. |
04:56 |
Bangkok.
Chris at café on computer |
HAMISH
MACDONALD, Presenter: It was his
latest "crazy thing" that got American national security experts
downloading TikTok. |
05:03 |
Fake
Tom Cruise video/Creation of fake Cruise |
FAKE
TOM CRUISE "What’s up TikTok? You guys cool if I play some sports?" |
05:13 |
|
HAMISH
MACDONALD, Presenter: Teaming up with
Tom Cruise impersonator, Miles Fisher, and some cutting-edge technology,
Chris produced a succession of increasingly sophisticated deepfakes. |
05:20 |
|
FAKE
TOM CRUISE: "Hey, listen up
sports and TikTok fans. If you like what you're
seeing, just wait 'til what's coming next. |
05:32 |
Chris
at computer |
CHRIS
UME: I love how he's
not looking in the camera. He's looking a bit wonky
next to the camera. Miles looks a lot like Tom Cruise. |
05:39 |
Miles
creating Deepfake Cruise |
He
has a lot of similarities. That makes my work easier in a way. It saves me a
lot of time. The Tom Cruise videos are so special because deepfakes shouldn't be able to do such things. HAMISH
MACDONALD, Presenter: The cut and
paste look of early deepfakes made them easy to spot. |
05:52 |
Deepfake
Cruise, TikTok |
Not
anymore. In response, the tech industry’s now investing heavily in software
detection tools. But TikTok Tom reportedly beat nearly all of them. |
06:12 |
Chris
100% |
CHRIS
UME: It's like a cat and mouse game. They can't follow. And of course the
deep Tom videos, they are pretty good. I'm not
saying they're flawless, |
06:24 |
Fake
Cruise, magic trick |
because
I see a lot of mistakes myself. And they cannot detect it. FAKE
TOM CRUISE: "I'm going to show you some magic." HAMISH
MACDONALD, Presenter: Deepfake Tom’s
near-flawless performance went viral. CHRIS
UME: We never exposed ourselves in the beginning. People didn't
know who created these videos. FAKE
TOM CRUISE: "It's all real. Think." |
06:33 |
Chris
100% |
CHRIS
UME: And you had a lot of articles talking about the end of the world, and
this technology is getting out of hand, and anyone can do this right now. So it was really difficult. I had to think about how to
approach this. |
06:51 |
Chris
making Tom Cruise deepfake |
I
see the creative possibilities and I just want to entertain people. I want to
make people smile when they look at my content. But of
course there will be people misusing this technology. HAMISH
MACDONALD, Presenter: So, how did he
do it? |
07:04 |
Fake
Cruise, 'making of' video |
DEEPFAKE
TOM CRUISE: "In this reel, you’re going to see my amigo, Chris Ume. He’s
going to introduce you to the wonderful world of deepfakes, how AI and VFX
are unlocking the future of our imagination." |
07:21 |
Chris
shows making of data |
CHRIS
UME: For Deepfake you start with the source data, and by source data I mean
pictures and videos of the character you want to deepfake. In my case it's Tom Cruise. Around 6,700 stored images of all his
angles, of all his expressions. |
07:33 |
Images flash framing on screen between computers |
HAMISH
MACDONALD, Presenter: The key to
building a deepfake is what’s called a Generative
Adversarial Network. It takes two ‘neural networks’, algorithms that
mimic the way our brains work; one generates fakes while the other tries to
spot them. Each learns from the other and adapts, and the contest between the
two networks builds an ever more convincing deepfake. |
07:55 |
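The adversarial training the narration describes can be sketched in code. What follows is an editorial illustration, not Chris Ume's actual pipeline: a toy one-dimensional GAN in plain NumPy, where the "real" data is just a Gaussian and the two competing networks are each reduced to two parameters, so the back-and-forth gradient updates stay easy to follow.

```python
# Toy GAN: a generator learns to mimic real data while a discriminator
# learns to tell real from fake. Real deepfake systems use deep
# convolutional networks; the training loop has the same shape.
import numpy as np

rng = np.random.default_rng(0)

# Generator G(z) = a*z + b and discriminator D(x) = sigmoid(w*x + c),
# each reduced to two scalar parameters so the gradients are readable.
a, b = 1.0, 0.0          # generator parameters
w, c = 0.1, 0.0          # discriminator parameters
lr, batch = 0.05, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for step in range(2000):
    real = rng.normal(4.0, 1.0, batch)   # samples from the true distribution
    z = rng.normal(0.0, 1.0, batch)      # generator's random input
    fake = a * z + b

    # Discriminator update: push D(real) -> 1 and D(fake) -> 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = np.mean((d_real - 1) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(d_real - 1) + np.mean(d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update: push D(fake) -> 1, i.e. fool the discriminator.
    d_fake = sigmoid(w * fake + c)
    grad_a = np.mean((d_fake - 1) * w * z)
    grad_b = np.mean((d_fake - 1) * w)
    a -= lr * grad_a
    b -= lr * grad_b

samples = a * rng.normal(0.0, 1.0, 10000) + b
print(f"generated mean ~= {samples.mean():.2f} (target 4.0)")
```

Run over a few thousand steps, the generator's output drifts from its starting distribution toward the real one. At vastly larger scale, the same contest is what produces a convincing fake face.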
Chris
at computer |
CHRIS
UME: It took me two and a half months in total just to prepare it, to get it
running, to get where the model is at.
And after that it took two to three days to train on each video
separately, and up to 24 hours of post-production. Going frame by frame,
working on details, working on glitches. If you compare the quality we could achieve for deepfakes a year ago and what
we can achieve now, it's a day and night difference. It's
unbelievable. |
08:14 |
|
I've
been talking with certain governments as in advising. In a good way, right? A
lot of governments are still unknown to what deepfakes is, and I'm here to explain what it is, and which things are
possible. |
08:44 |
Chris
100% |
And
governments realise this could be used as a weapon. |
08:59 |
Capitol
building time-lapse |
Music
|
09:05 |
Washington,
night |
HAMISH
MACDONALD, Presenter: Chris Ume’s
TikTok-Tom startled a Washington still traumatised by the chaos of the Trump
years. |
09:09 |
|
MATT
FERRARO: I think terror is probably not too strong of a word. I think it's because people realise how dangerous they are. It is
a Tom Cruise effect. I mean, in your mind, think about the Tom Cruise video
and then manipulate what he says. Just think about what he might say. |
09:08 |
Trump
Jan 6 speech |
TRUMP:
"We will stop the steal. I know that everyone here will soon be marching
over to the Capitol building." |
09:33 |
Jan
6 Capitol riot |
|
09:42 |
|
HAMISH
MACDONALD, Presenter: The mob that
assaulted the US Capitol on January 6 this year was fuelled by a mix of
anger, fake news, and alternative facts, egged on by a President claiming he’d been robbed of election victory. |
09:47 |
Ferraro
walks, Washington, night |
For
Matt Ferraro, former CIA officer and disinformation specialist, it was all
too close to home. MATT
FERRARO: Yeah. So this is my neighbourhood. |
10:05 |
|
It’s
funny, during the pandemic, I try to make a point of walking every day. I
would come out this way and to see what happened on January 6, it was
terrible. Desecration is really the word that comes to mind. |
10:16 |
Jan
6 Capitol riot |
These
people thought the election was stolen, they were tricked by lies that were as
basic as could be, just written and spoken lies, and pretty
soon you’re going to have tailor-made video to convince you of
anything. |
10:32 |
Ferraro
walks, Washington, night |
HAMISH
MACDONALD, Presenter: He worked for
America’s top spymaster, the Director of National Intelligence, who reports
directly to the President. Matt Ferraro fears deepfakes will open the door to
a dark, dystopian future. MATT
FERRARO: The real danger is that what we
saw on January 6 is just the beginning, that things are going to get so much
worse, the partisanship that we see, the visceral hatred for one another, is
going to just be made so much worse by deepfakes, and the fact that anyone is
going to be able to order up their own synthetic reality, regardless of the
truth. |
10:53 |
Capital
building view through security fence |
Music
|
11:33 |
|
HAMISH
MACDONALD, Presenter: Now a lawyer, he
advises corporate clients on how to counter the rising tide of digital
disinformation. |
11:39 |
Ferraro
100% |
MATT
FERRARO: Now, what if I told you that seeing isn't
believing, that what you saw with your own eyes was not in fact true? What
you heard with your own ears was not true?
Even what you read, thinking it was by a human, was in fact written by
a computer. I think that would have a fundamental effect on how you perceive
the world. |
11:47 |
FBI
building exterior, night. |
Now,
in the national security context, there’s no end to
the nightmare scenarios. The FBI put out this rather extraordinary warning in
March of 2021, in which they said that it was "almost certain" that
in the next 12 to 18 months, private businesses would be the victims of
synthetic media – by that they mean deepfakes – |
12:05 |
Ferraro
100% |
in
both cyber-attacks launched by nation states, and foreign adversaries, and by
criminals. |
12:29 |
Celebrity
images on computer |
HAMISH MACDONALD, Presenter: The focus is now on threats to politics and
business, but ever since the first low-quality deepfakes emerged about five
years ago, they’ve been weaponised – against
women. The figures are disturbing. The number
of professionally made deepfakes is now doubling
every six months, and 93% of them are porn.
Americans account for more than 40% of victims globally, much
of it celebrity porn featuring actors, musicians and social
media stars. |
12:36 |
Sydney.
Night. |
Music
|
13:13 |
|
HAMISH
MACDONALD, Presenter: Humiliation intimidates most deepfake porn victims into
silence. Most, but not all. |
13:20 |
Sydney.
Night. Hamish walks with Noelle |
Noelle
Martin is locked in a constant battle with anonymous deepfakers
on the dark web. |
13:31 |
|
"What
are the consequences of you choosing to speak out about it now?" NOELLE
MARTIN: Well, the consequences of speaking out about the abuse has meant that
they’ve gone from manipulating images of me to creating these deepfakes, so
they’ve essentially weaponised this technology against
me. HAMISH
MACDONALD, Presenter: So they punish you? They go after you? NOELLE
MARTIN: Yes. HAMISH
MACDONALD, Presenter: Because you’ve criticised them? NOELLE
MARTIN: Yes. |
13:38 |
|
Music
|
14:06 |
|
HAMISH MACDONALD, Presenter: A 26-year-old law graduate,
Noelle has now been campaigning for almost a decade. It all started when, as
a teenager, |
14:10 |
Noelle.
Random porn images |
she
was the victim of a random porn attack, her images posted in low-rent,
obvious fakes. |
14:19 |
Noelle
interview |
NOELLE
MARTIN: They had been targeting me
presumably since I was 17. And it was only until about a year later that I actually found out. |
14:28 |
Noelle
social media images |
So
they've gotten images from me from my social media, but they weren't just
collecting it from my social media. |
14:36 |
Noelle
interview |
They
were collecting it from friends' accounts. They were looking at |
14:43 |
Noelle
social media images |
the
bars that I went to and the photos from that event on Facebook, and they
would collect it from there. |
14:48 |
Noelle.
Censored porn images |
|
14:54 |
Hamish
and Noelle look at images on computer |
|
15:00 |
|
HAMISH
MACDONALD, Presenter: Oh wow. NOELLE
MARTIN: That’s
fake me! That’s a complete fabrication! |
15:04 |
|
HAMISH
MACDONALD, Presenter: It’s escalated a lot from the beginning, hasn’t it? NOELLE
MARTIN: Yes, it has. Yes. HAMISH
MACDONALD, Presenter: Like really obvious Photoshopping, to that sort of stuff? |
15:10 |
Noelle.
Censored porn images |
NOELLE
MARTIN: Yes. It's
so dehumanising and degrading and violating. It makes you sick and it's humiliating. |
15:21 |
Noelle
interview |
HAMISH
MACDONALD, Presenter: You said it
almost destroyed you? NOELLE
MARTIN: Yes. HAMISH
MACDONALD, Presenter: How close did it
come? NOELLE
MARTIN: Very, very close. |
15:29 |
|
I
didn’t have the right coping mechanisms, so I really
struggled to cope, and there was a time where I did try and hang myself, and
my father stopped me. |
15:39 |
Hamish
and Noelle look at images |
HAMISH
MACDONALD, Presenter: Noelle’s
advocacy helped drive the introduction of revenge porn laws in Australia. But
she’s not stopping there. |
15:56 |
|
NOELLE
MARTIN: I really think that there needs to be some sort of global response,
because it is not enough for any one country to target this issue |
16:04 |
Noelle
interview |
that
is global in nature and borderless.
And it is not acceptable that in 2021, someone from halfway around the
world, can misrepresent you and there's nothing you
can do. And that impacts your whole life. |
16:14 |
Computer
screen 'deepfake' tag |
NINA
SCHICK: Deepfake porn is still, without a doubt, the most pernicious use case
of malicious deepfakes. But rather than this being a tawdry woman's issue, I
see this as a harbinger of what's to come. |
16:27 |
Nina
CU. Voice over |
My
name is Nina Schick and I focus on how exponential technologies are reshaping
geopolitics and society. |
16:44 |
Nina
100% |
Specifically,
with deepfakes, I have advised global leaders, including Anders Fogh
Rasmussen, the former NATO secretary general, and Joe Biden, who at the time
was VP, now President of the United States. |
16:54 |
|
When
you consider deepfakes, as well as the epidemic of bad information that is
inundating our societies, they absolutely can be seen as an existential
threat to liberal democracies. And we're already at
a place where a single creator, right – you interviewed him for your
documentary – Chris Ume, |
17:06 |
Fake
Tom Cruise videos |
can
create fake video content that's more sophisticated
than anything a Hollywood studio would be able to do, even with a
multi-million-dollar budget or teams and teams of special effects artists. |
17:25 |
Nina
100% |
And
this democratisation is happening so quickly that experts suggest that by the
end of the decade – so 2030 – up to 90% of online video could be
synthetically generated, so, made by AI. |
17:37 |
Police
clash |
Even
before they become ubiquitous, what they're already
doing is undermining trust in all authentic media. This is a phenomenon known
as the Liar's Dividend. And we've already started to
see that at play. |
17:55 |
Nina
100% |
So if
everything can be faked including video, then everything can be denied. |
18:08 |
George
Floyd protests news report |
News
Reporter: "The four officers involved have been fired, but protestors
are demanding criminal charges." |
18:12 |
|
NINA
SCHICK: One of the really interesting examples of
the Liar's Dividend at play is something that I saw around the George Floyd
death video. This video was so visceral that it united millions of people in
protest, right? |
18:17 |
|
News
Reporter: "The violence, fuelled by anger over the death of 46-year-old George Floyd. The fire department reports that
he showed no signs of life." |
18:34 |
Winnie
Heartstrong video |
WINNIE
HEARTSTRONG: "Hi there, I’m Dr. Winnie Heartstrong. I'm running for
Congress to represent Missouri…" NINA
SCHICK: Doctor Winnie Heartstrong, she argued that the entire George Floyd
video was a deepfake hoax. |
18:42 |
Nina
100% |
That
the man you see in the video is a former NBA player. And that AI has been
used to swap George Floyd's face onto this former NBA player's face! |
18:47 |
Winnie
Heartstrong video |
WINNIE
HEARTSTRONG: "There is no way that George Floyd is dead; George Floyd is
alive, America! Prove me wrong!" |
19:02 |
Nina
100% |
NINA
SCHICK: Most people still believe that George Floyd died in 2020. |
19:08 |
George
Floyd news report |
However,
if your influencer of choice suggests that something like the George Floyd
video is a deepfake, then by 2030 |
19:12 |
Nina
100% |
many
people might believe that that is indeed the truth. So, that is the Liar's
Dividend in play. |
19:22 |
Winnie
Heartstrong video |
WINNIE
HEARTSTRONG: "So folks, do not believe anything you read or hear or
watch on TV, okay?" |
19:27 |
Represent.Us
deepfake parody. Putin |
FAKE
PUTIN: "America, you blame me for interfering with your democracy, but I
don't have to. You are doing it to yourselves." HAMISH
MACDONALD, Presenter: These deepfake
parodies were commissioned by the non-profit Represent.Us, |
19:31 |
Represent.Us
deepfake parody. Kim |
in
the lead up to the 2020 election, to highlight the fragility of democracy and
encourage voting. But the group says they were so realistic, American TV
networks refused to run them. |
19:47 |
|
FAKE
KIM JONG-UN: "I don’t have to do anything – you’re doing it to
yourselves." HAMISH
MACDONALD, Presenter: But what happens
if a deepfaked leader not only looks real but mouths far more dangerous
intentions? |
20:00 |
CU
Twitter screen. POTUS |
This
is the scenario that keeps former CIA expert Matt Ferraro awake at night. |
20:13 |
Ferraro
100% |
MATT
FERRARO: The most plausible nightmare scenario involves a deepfake of the
president, say, announcing a missile strike against North Korea. So let's walk
through that. |
20:20 |
US military exercise in South Korea |
Tensions
between the U.S. and North Korea are always high. But let's
say it comes out at a period of particularly difficult relations
between the U.S. and North Korea. Maybe during a military exercise that the
U.S. is doing with the South Koreans on the Korean Peninsula. |
20:29 |
Creation
of deepfake Biden |
Say
that it's a deepfake of President Biden announcing
that because of North Korean perfidy, he is launching a first strike missile
attack, and it's believable, it's him in front of a podium. |
20:47 |
Deepfake
Biden |
DEEPFAKE
JOE BIDEN: Tonight I come to talk about crisis, to
declare war on North Korea. Nuclear program presents serious threats to
American security and the security of the world. We will respond. |
21:08 |
White
House Twitter feed |
MATT
FERRARO: But also imagine that it doesn't just
appear on the internet, maybe it's combined with another kind of attack. Like
a cyber intrusion, maybe for months now, hackers have been able to infiltrate
the White House Twitter handle. |
21:23 |
Fake
Biden video |
They
post this deepfake video and it goes viral instantaneously, in a moment. Tens
of millions of people see it. Now, of course, the White House is unsure what
happened; this totally blindsides them. |
21:37 |
Biden
model in front of green screen |
BIDEN
MODEL: We will respond accordingly. |
21:51 |
Deepfake
Biden |
DEEPFAKE
BIDEN: May God bless you all! May God protect our troops!" |
21:44 |
North
Korea military parade |
MATT
FERRARO: The North Koreans are 80% sure it's him,
maybe only 20% sure it's him. Are they willing to just sit back and take a
missile strike that will end their regime? Probably not. |
22:00 |
North
Korea. Missile strike |
They
launch a retaliatory strike against Seoul, which is only so many miles from
the North Korean border |
22:13 |
Ferraro
100% |
and
millions of people die. And it turns out it's all a
fake. |
22:22 |
Creation
of Deepfake Biden |
Chris
Ume, he said it took him a couple months to train the Tom Cruise algorithms.
But imagine if you're the intelligence service of
the People's Liberation Army, known as the 2PLA. They could put 10,000 man hours against the creation of a deepfake
tomorrow. |
22:26 |
Ferraro
100% |
And
so the fact that it takes 24 hours to create this kind of
video really isn't that surprising at all, not when you have the resources
of a nation state. |
22:45 |
Biden
model in front of green screen |
Music
|
22:52 |
NY
Time-lapse |
|
22:57 |
Mounir
walk past UN building, NY |
MOUNIR
IBRAHIM: My name is Mounir Ibrahim. I was a US diplomat in Syria at the US
Embassy in Damascus, |
23:05 |
|
and
continued to work on the Syrian conflict in a variety of contexts, including
here at the United Nations. I saw user-generated content, images
and videos from a conflict zone, influencing the discussions and the
narratives in the meetings taking place.
Proving that these images and videos were in fact true became very, very relevant to some of the world's largest issues. |
23:14 |
Detecting
Deepfakes video |
HAMISH
MACDONALD, Presenter: Detecting
deepfakes has become big business, with everyone from Facebook to the
Pentagon now working on the problem. Facebook
video narration: "So Facebook is commissioning a first of its kind data
set. It's part of the deepfake detection challenge." HAMISH
MACDONALD, Presenter: But Mounir
Ibrahim argues detecting deepfakes is a lost cause. MOUNIR
IBRAHIM: It is going to be impossible to detect forgery |
23:40 |
Mounir
100% |
in
real time and at scale across the internet, particularly with the
advancements of AI moving so rapidly. So if we can't
prove what's fake, let's try and prove what's real, by authenticating media
and digital content as it's being created. |
27:06 |
Mounir
walks. NY. Night. Looks at app/ o'lay conflict footage |
HAMISH MACDONALD, Presenter: His company, Truepic, is
developing an encrypted, tamper-proof security app for recording vision, a
kind of watermark of authenticity. The system is
already being trialled by the United Nations. Truepic is part of a
growing coalition of tech and media companies now working towards creating a
new global standard – tamper-proof vision that can be trusted in the Deepfake
world. MOUNIR
IBRAHIM: The world's first standards body |
24:23 |
Mounir
100% |
on
digital content provenance was created by the name of the Coalition for
Content Provenance and Authenticity, the C2PA, and Truepic is proud to be a
founding member along with the likes of Adobe and Microsoft and Intel and ARM
and the BBC. |
24:55 |
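The capture-time approach Mounir Ibrahim outlines can be sketched as follows. This is an illustrative toy, not Truepic's product or the real C2PA manifest format (which relies on public-key certificates); it uses Python's standard-library HMAC with a hypothetical device key to show the core idea: hash and sign the pixels and metadata at the moment of capture, so any later edit breaks verification.

```python
# Minimal sketch of capture-time media authentication: sign a hash of the
# pixels when they are recorded, so any later edit breaks verification.
# All names here are illustrative, not a real API.
import hashlib
import hmac
import json

DEVICE_KEY = b"secret-key-held-by-the-camera"  # hypothetical device secret

def sign_capture(image_bytes: bytes, metadata: dict) -> dict:
    """Produce a tamper-evident provenance record for a captured image."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    payload = digest + json.dumps(metadata, sort_keys=True)
    tag = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "metadata": metadata, "signature": tag}

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """Check that neither the pixels nor the metadata changed since capture."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    payload = digest + json.dumps(record["metadata"], sort_keys=True)
    expected = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

original = b"raw sensor data"
record = sign_capture(original, {"device": "cam-01", "time": "2021-07-01T10:00Z"})
print(verify_capture(original, record))              # True: untouched
print(verify_capture(original + b"edited", record))  # False: pixels changed
```

Because verification only proves what was captured, not what is fake, this sidesteps the detection arms race the program describes: anything without a valid provenance record is simply untrusted.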
NY
time-lapse/GVs Times Square |
Music
|
25:10 |
|
HAMISH
MACDONALD, Presenter: Crowds in the
Big Apple still haven’t returned to pre-COVID
levels. Those who venture out share the visual experience with those who
still prefer to stay home. The big tech players – Apple, Google and YouTube – have implemented bans on malicious deepfakes.
But they’ve got no control over the dark web. US federal and state lawmakers are also moving to
combat the deepfake threat. But will it be enough? |
25:35 |
|
MOUNIR
IBRAHIM: I do believe that the US government is taking it very seriously. We've talked about the FBI warnings that have come
out. We've
seen multiple states passing deepfake-related laws. |
26:11 |
Mounir
100% |
So I
do believe there is a level of seriousness. I would also say this is not a
simple solution that a government can fix. They can't
legislate their way out of this. |
26:21 |
Lincoln
memorial. Night. |
|
26:32 |
|
FAKE
OBAMA: "How we move forward in the age of information |
26:38 |
Fake
Obama |
is
going to be the difference between whether we survive, or whether we become some kind of fucked-up dystopia!" |
26:43 |
|
NINA
SCHICK: Actually, I am really optimistic. I believe
in our ability to overcome these challenges. |
26:49 |
Nina
100% |
But
I think realistically, it will probably get worse before it gets better. |
26:58 |
Sydney.
Night. |
NOELLE
MARTIN: Well, I think it's terrifying, |
27:04 |
Noelle
interview |
especially
for women who are disproportionately targeted by fake pornographic material that's out there. It affected me so badly emotionally, in
my wellbeing, and in my health. |
27:09 |
Noelle
nods |
HAMISH
MACDONALD, Presenter: It does sound
like it’s changed your entire life? |
27:23 |
Capitol
building/Biden model |
MATT
FERRARO: I think that we've only really started to
scratch the surface of the bad things that can happen |
27:30 |
Ferraro
100% |
because
of deepfakes. Because we're going to move to a world
in which photographic and video evidence is a bit like paintings. Where you
look at a painting now, an abstract painting, and you and I take our own
meanings from that painting. |
27:41 |
Fake
Tom Cruise. Concession video. |
FAKE
TOM CRUISE: "Some people claim I’m not actually Tom Cruise, and I didn’t actually run for President…
Don’t let my masks fool you!... Take a deep
breath." CHRIS
UME: People are always scared of things they don't understand
or they don't really know. FAKE
TOM CRUISE: "We should be safe down here." |
27:55 |
|
CHRIS
UME: They should be worried, as in we should work on ways to regulate these
things and to detect them as well. I think that makes sense. So that's why I think it's a good thing I created these
videos, |
28:23 |
Chris
100% |
because
now I'm raising awareness and they realise, this is
real; it's coming! |
28:36 |
Fake
Tom Cruise. Concession video. |
Music
|
28:42 |
Credit
[see below] |
|
29:00 |
Fake
Obama |
DEEPFAKE
OBAMA: Thank you, and stay woke, bitches. |
29:23 |
Outpoint
|
|
29:28 |
CREDITS:
Presenter
Hamish Macdonald
Producer/Writer
Mark Corcoran
Editor
Peter O'Donoghue
Research
Jill Colgan
Camera/Production
Mikael Enlund
Nopakorn Theinsiri
Marc Laban
John Biggins
Andy Kuester
Daniel Loh
Marton Dobras-Vincze
Brad McLennan
Archival Research
Michell Boukheris
Audio Engineer
Adrian Sardi
Assistant Editor
Tom Carr
Graphics
Andrés Gómez Isaza
Special Thanks to
Synthesia IO
Descript
The Electric Lens Co.
Brian Pentecost
Additional Vision
Chris Ume/Miles Fisher/Stephen Vitale
Jordan Peele/Monkey Paw Productions
RepresentUs
FaceBook
Ted X Perth
The White Helmets
Senior Production Manager
Michelle Roberts
Production Co-Ordinator
Victoria Allen
Digital Producer
Matt Henry
Supervising Producer
Lisa McGregor
Executive Producer
Matthew Carney
Foreign Correspondent
abc.net.au/foreign
© 2021 Australian Broadcasting Corporation