|
CHILD ABUSE WARNING BOARD
|
SBS
ADVISES THAT THE FOLLOWING PROGRAM HAS REFERENCES TO CHILD ABUSE
|
1.
|
Rhiannon silhouette walking
Drone day 1 @ 00:03:42:20
See Rhiannon
|
SFX – Dial up internet
Rhiannon (00:48:36):
Sometimes you catch yourself walking down
the street, thinking that person could have seen that image of me.
(00:56:30) you
know, the, the imagery is, is just as bad as the, the actual physical sexual
assault.
|
2.
|
Day 3 drone @ 00:24:14:17
Drone sequence
|
During
COVID lockdowns, police around the world have reported a massive rise in
people viewing and sharing child sexual abuse images online.
In the UK police estimate there are 300,000
men regularly watching and sharing these images.
|
3.
|
Master IV
|
SIMON - 00:35:28:07
The problem is, is huge. The notion that my
child is safe. My child is at home. My child's up in that bedroom. Live
streaming, cameras on phones has now taken away the front door completely
|
4.
|
Shots of computer screen with legal porn
|
Tonight… we'll hear from victims and perpetrators on a difficult
subject.
ROB
I've had to take on board that I
facilitated the ongoing abuse because of the looking and the sharing of these
images.
|
5.
|
Day
3 Drone @ 01:14
Evan PTC
Big drone shot looking over London slowly
coming in to see Evan on a rooftop
Cut to Evan on rooftop
|
Day 3 @ 00:36:10:08
PTC: In the 90s there were fewer than 10,000
images of child sexual abuse online - today that number has soared to more
than 47 million unique images
Day
3 @ 06:01
PTC: This is the story of some of the most vile images on the internet and the small group of
people who are trying to stop it.
|
6.
|
TITLE: DIGITAL
PREDATORS
|
SFX: The sound of dial-up internet @ 0:08 (LOW VOLUME)
|
7.
|
FILE: Prolific hurtcore
paedophile jailed for 32 years
|
This
is Matthew Falder, one of the most prolific online
child abusers the UK has ever seen.
|
8.
|
Matthew being arrested
|
OFFICER:
The offences I’ve arrested you on suspicion of being involved in are:
blackmail, causing or inciting a child to engage in sexual activity, possessing
indecent images of children.
FALDER:
Sounds like the rap sheet from hell.
|
9.
|
Matthew putting a hidden camera in bathroom
- Prolific hurtcore paedophile jailed for 32
years.mp4
|
A
seemingly normal, intelligent and tech-savvy individual…
Falder is the epitome of everything UK law enforcement is fighting in the
war against online
child abuse.
|
10.
|
Matthew in an interview room with police
|
0:55
OFFICER:
Have you sent pictures of your blackmail victims to the parents?
Falder: No comment.
|
11.
|
Use
O/lay from police video - Prolific hurtcore
paedophile jailed for 32 years.mp4
|
The
pursuit of people like Falder is well funded and
relentless.
And
it’s made the UK a world leader in policing this area of crime.
|
12.
|
SCOLT
|
|
13.
|
NORWICH DRONE
Wymondham drone 00:06:18:00
|
In Norwich -
the fight against child sexual abuse material - otherwise known as CSAM -
is led by a special team at the Norfolk Constabulary.
|
14.
|
UK WYMONDHAM
Dave in Office getting ready
Camera follow Dave to printer
|
Detective Inspector Dave McCormack is the
unit’s senior officer.
Dave (14:06):
We're in the safeguarding children online
team, which we nicknamed SCOLT, and their sole
remit in here is to deal with online-based sexual crime and to safeguard
children.
…
we'll deal with everything from
the more serious offences, uh, child rape, all the way through to downloading
or possessing indecent images of children.
|
15.
|
Dave walks over to Jenna
|
Today
he is briefing police investigator Jenna Saunders on a new case
|
16.
|
UK WYMONDHAM
Dave and Jenna chatting at desk
|
[23:46:36] Dave: Right, Jenna.
[23:55:40] He is a suspect that we are
very well aware of; we have been to that address 3 times already. We already
know he's got a female partner, we already know there are at least 3 children at that address, so it's fairly high
risk for us.
[23:40:51] He is clearly escalating
his sexual behaviour and it’s something we need to get hold of him for.
|
17.
|
Footage of recent raid
SCOLT team @ offices
|
HIS TEAM IS DEALING WITH A GROWING NUMBER
OF REPORTS
AND PRIORITY IS GIVEN TO CASES WHERE THEY
BELIEVE CHILDREN ARE AT RISK IN THEIR OWN HOMES
|
18.
|
Evan and Dave chat in office
|
Dave (05:58):
You know, people don't seem to be getting
the message that uploading and downloading indecent images of children is
illegal. It's a, it's a huge growing area of business for us. Certainly within the last three years, we've just seen
referrals and investigations skyrocket, and that's the only word to
describe it really.
|
19.
|
UK
NORWICH F5 @ 00:31:31:17
Joel
re-creation
Sequence
of perp
|
UK
WYMONDHAM F5 @ Dave (06:52):
Your average offender lives at home, has
his children at home and his family, and has that normal life of going off to
work, you know, but then underneath there'll be a secret lifestyle that
nobody else knows about. They feel as if they aren't going to get caught and
live out their fantasies online, and all the time they're fuelling that
sexual behaviour.
Evan (08:31):
Does this lead to physical abuse?
Dave (08:40):
I think there's a real strong possibility
that it does lead to actual physical sexual abuse of children. I think they,
you know, like any addiction, you'll get to a certain point where you can't
get any more, and then you need to manipulate others to get what you want. Um,
and that's where your grooming offences come in. Once you've done the
grooming, your arranging-to-meet offences happen. Once you arrange to meet,
you've pretty much arranged to meet to commit a physical sexual abuse
offence.
|
20.
|
CHIEF
CONSTABLE SIMON BAILEY
|
|
21.
|
UK NORWICH F5
|
Simon (03:59):
The problem is, is huge, and I think it's
one of the greatest threats to society and one of the greatest policing
challenges, certainly that we are facing.
|
22.
|
Uk Norwich @ 00:47:45:21
Simon on Phone in courtyard
|
Chief Constable Simon Bailey is Norfolk’s
most senior police officer
He also spearheads the UK’s national
policing response to online child abuse.
|
23.
|
Evan and Simon in Crypt
|
Simon (10:55)
The latest statistics show that we are
arresting over 700 men every month, every month, safeguarding 900 children
every month, but the bottom line is the numbers of referrals and the number of
jobs continue to grow and grow and grow and grow.
EVAN
(34:27):
What
do parents in Europe and Britain and Australia need to understand about the nature
of this sort of material?
SIMON - 00:35:28:07
The notion that my child is safe. My child
is at home. My child's up in that bedroom. We need to debunk that myth. And
we need to understand that what live streaming video cameras, cameras on
computers, cameras on phones, live chat rooms, live broadcasting capability
has now just taken away the front door completely and has afforded people
access to every child anywhere in the world, unless the appropriate
safeguards and security mechanisms are in place.
|
24.
|
RHIANNON
– SURVIVOR
|
|
25.
|
UK DAY 1
Drone
shot
|
When
Rhiannon was 13 she started an online chat with a
stranger.
SFX:
MS messenger “ping”
Years
later she still prefers to remain anonymous.
|
26.
|
REENACTMENTS
Rhiannon
on computer
|
Rhiannon (00:17:33):
They said that they were a young adult
female, in their early twenties. The first thing that you used to do was ask
ASL - so age, sex, location. And then we followed it up
with, um, a picture to sort of prove who we were. She saw that I was a 13-year-old girl, and she sent me a picture of a young
woman that was supposedly her.
|
27.
|
REENACTMENTS
Rhiannon
on computer
REENACTMENTS
@ 00:09:54:21
Rhiannon takes picture of herself
|
Rhiannon (00:20:06):
So she said that she was a model. And after I'd sent the initial picture
of just my face, she said that I was really pretty. and that I could be a
model like her. eventually she asked for, um, like a full length photo, like to see what body type I had. And, and
then it progressed to,
|
28.
|
REENACTMENTS
REENACTMENTS @ 00:13:04:14
Door closes
|
Rhiannon (00:21:45):
She asked for a topless photo
Rhiannon (00:26:56):
I didn't want to do it at first, but she
spent ages telling me that it was, it was fine. It was okay. It was no big
deal. Everybody does it. Um, and she made me feel like it was okay to do it.
so eventually I did send her a topless photo and then everything changed.
|
29.
|
RHIANNON MASTER IV
Evan and Rhiannon
|
Rhiannon (00:28:16):
She asked me to, um, turn the webcam on. So they, it was, um, sort of a video rather than images
and asked me to do certain things. And, um, she said that if I didn't do
that, then she was going to make sure that all my friends and families saw
the topless photo of me. And which was, that was really terrifying
So I didn't really feel like I had much of a choice, but to do what she
was then telling me to do.
|
30.
|
RHIANNON WALKS TO WINDOW
UK
DAY 1 @ 01:30:00:03
|
The
woman asked Rhiannon to do more and more explicit sexual acts.
She
then asked to meet up.
|
31.
|
UK
DAY 1
Sequence: Looking out window / hands/ eyes/
rain on window
|
Rhiannon (00:31:21):
I initially said no to her. I said that I
wasn't going to give my address out. Um, and then she threatened
me that she was just gonna
release all the material that she had.
Rhiannon (00:35:44):
So eventually I did give in and tell her my
address, cause I didn't know what else to do.
But
the person threatening Rhiannon was not a woman, and not a model – it was all
a lie.
(00:39:30) I was too terrified to do or say
anything at that point. Um, and he, he sexually assaulted me at home and he took pictures of everything. ( HOLD ON THIS )
|
32.
|
IWF
ANALYST – Paul
|
|
33.
|
Drone
shot over “average” town
|
VO:
But there are some who look for explicit material of children being sexually
abused online on purpose… as a job.
They work from an unassuming office tucked away in a
quiet business park.
What
you are about to hear from one of them is disturbing.
|
|
UK DAY 2 DRONE @00:22:32:19
Countryside, rows of houses, roads, IWF
Cover this entire grab with drone shots
End on drone pulling into IWF
UK
DAY 2 DRONE@ 00:06:24:10
|
Paul (01:55):
On a daily basis, we pretty much see all
levels of, of child sexual abuse. So, you know, from a, from a, a child 13 year old, just sitting in their bedroom, showing
themselves to a camera, sometimes they might be masturbating, there may be
penetration there, right up to, you know, babies being raped and tortured. We
pretty much see everything that you could possibly imagine and a lot that you
probably can't.
|
34.
|
Sequence of Paul walking into IWF
UK DAY 2 @ 01:20:29:19
|
Paul
is an analyst with the Internet Watch Foundation, which has the sole aim of
taking down child sexual abuse material.
|
35.
|
Sequence of Paul walking into IWF
|
Paul (07:40):
So if we identify it as child abuse, we'll create a report in our system
just to capture all the details. That report will then go to the country that hosts
that material. They'll then speak to their law enforcement, uh, and look to
get the content removed.
|
36.
|
|
Each day IWF analysts receive hundreds of
reports from the public about inappropriate websites and images.
|
37.
|
|
Paul (15:19):
We have seen, uh, uh, quite a large
increase in self-generated, uh, content. Um, uh, I I'm I'm,
I suppose we put it down to the fact that everyone's getting mobile phones
these days and, and mobile devices. Children of a lot younger age, uh, being
able to put themselves on the internet far younger,
um, and share those images.
|
38.
|
GFX
1
TEXT:
Almost half of child abuse material
was user-generated
TEXT:
81% was girls between 11 and 13
|
In 2020, almost half the content reported
to IWF was user generated,
the majority of it showing young girls aged 11
to 13 years old.
|
39.
|
Re-enactment with “digital predator”
|
Meaning
online predators are regularly coercing, blackmailing and tricking children
into uploading explicit material.
|
40.
|
Re-enactment with “digital predator”
Typing on phone
|
Paul (28:18):
There's certain sites where these people will find children online that are
live streaming themselves. Uh, and once they find them
they'll then go into chat rooms on the dark web or wherever, and then share
this stream URL. And they'll say, go to this site, there's a child there now,
and then they all get on there. And then they, you know, try and coax
the child into doing more and more things that they want them to do. Um, and
it, yeah, it is terrifying. Cause sometimes you hear the parents in the room
next door, calling them for dinner.
|
41.
|
GFX
2
TEXT: In 2020 IWF removed 153,000 child
abuse webpages
Pull the plug effect goes to black.
|
In
2020 the IWF removed 153,000 webpages of child sexual abuse. That equates to
millions of images going offline.
SFX
– pull the plug
|
42.
|
GFX 3
TEXT:
35% increase in abuse reports during COVID-19
TEXT:
IWF Blocked 8.8 million attempts to access
child abuse imagery
|
But
with more people at home since the global pandemic,
there’s
been a 35% increase in public reports.
And
during the UK’s first lockdown, the IWF
blocked 8.8 million attempts to access child abuse imagery.
|
43.
|
ROB THE OFFENDER
|
|
44.
|
UK
EPSOM F5 CARD 2 @ 00:42:01:08
Evan TT walking through London on dusk
|
UPSOT:
traffic in central London
VO: I want to understand what drives
someone to access this kind of material.
|
45.
|
00:53:53:19
Ev walking into a phone box
|
Under
the condition of full anonymity, I’ve arranged to speak with a 32-year-old, who we’ll call Rob.
He
was charged with possession and distribution of child abuse material and is
now on the sex offenders register.
|
46.
|
Evan in phone box
|
00:06:02:00
EVAN: I’m trying to understand - how did you begin watching child sexual abuse
online?
01:02:35: I accessed adult pornography from
probably the age of 10 or 11.
ROB ( 01:11:10)
So by the time I'm 18, 19, and I'm having conversations with people on
chat rooms, and I'm already accessing quite hardcore extreme but legal
pornography
ROB (00:09:48):
that led me to
ROB (01:15:10):
to conversations about young people and
then led to looking at images of young people.
|
47.
|
Evan
in phone box
|
[00:59:21] EVAN: When you were watching it,
did you ever feel this child had been abused to create this video? Did you
ever feel responsible that you were creating the market for this?
(00:26:36):
when it comes to internet pornography,
there is a detachment and you don't believe it is
reality.
32:10
I have had to come to an understanding
that, you know, these, that any image or any video is incredibly real;
it is an example of abhorrent abuse that has taken place…
I've had to take on board that I facilitated the ongoing abuse because of the
looking and the sharing of these images.
|
48.
|
Evan in phone box
|
[01:04:51]
Evan: how much of this material is out there? How easy is it to access?
ROB (01:31:28):
Like, unbelievably rife. I don't think we,
as a society, want to realize how much material is out there.
|
49.
|
Evan in phone box
|
Rob
was eventually arrested with hundreds of illegal images in his possession.
|
50.
|
Evan in phone box
|
ROB (00:55:28):
I
lost a huge number of friends, I lost a huge number
of opportunities.
ROB (00:57:29):
This is the most socially taboo crime that
exists. I can understand why people view this crime as evil.
|
51.
|
LONDON GV’s
|
---SPACE---
VO: One way to prevent these crimes is helping people to stop viewing CSAM.
It’s
a strategy actively pursued in the UK.
|
52.
|
DONALD
FINDLATER
|
|
53.
|
UK
EPSOM F5 CARD 1 @ 00:15:14:11
Boring, brown, brick, building
UK EPSOM CARD 2
Counsellor on phone
Taking notes
|
SFX: phone ringing @ 00:19:05:19
23:59:51:21 – Stop It Now – can I help?
Stop
It Now is a free hotline and counselling service for those worried about
their sexual thoughts or behaviours towards children.
00:00:53:17 – What brought you to calling us
today?
A surprising number
of enquiries are from concerned partners or family members.
00:06:53:23 – what we find is that a lot of
people that look at images online won’t have necessarily harmed a child.
|
54.
|
Donald chatting
|
Upsot: Donald talking
This call centre receives between 50 and 60
calls every day.
But during the pandemic the call load has
increased by almost 50%.
|
55.
|
Donald and Evan walking in park
|
UPSOT: walking in park
Donald
Findlater is the Director of Stop It Now and one of
the UK’s leading experts on child abuse prevention.
|
56.
|
|
Evan 09.44
I’ve always
thought that people accessing child sexual abuse material online were part of
that percentage of the population that you could call paedophiles.
|
57.
|
See Donald here
Re-enactment with Digital Predator
|
Donald (10:08):
They are mostly a little or a lot different
from what you're imagining. Um, the, the, the estimates across the globe are
about 1% of the adult male population are essentially paedophilic in their
sexual interest. That doesn't mean they're all acting on that interest, but
what we are seeing online is significantly different.
Donald (33:29):
I would say that most of the people that
we've worked with, um, have journeyed from adult pornography consumption to
looking at teen teenage images, to looking at the, at the next stage, down to
the next stage down. So they've gone down this
rabbit hole of interest and, and, and some of them realize that they're
shocked that they've done that and they will come out and they will not
return. And they'll be mortified by what they've seen, but some of them
unfortunately will get fascinated by that.
|
58.
|
|
And
for a large number, online viewing can lead to a desire for actual physical
abuse.
|
59.
|
|
Donald (17:50):
25% of the people that are contacting us now about online offending are
talking about engaging with children directly. So they're taking that next
step, and therefore we have to help them to not take that step.
|
60.
|
UK
EPSOM F5 CARD 1
Window EXT
00:38:52:05
hands on keyboard
00:39:43:09
Scrolling through stop it now website
00:47:07:11
|
Research estimates that for most convicted
abusers, there is a 10-year delay between first fantasising about children
and actually abusing them.
Despite this window, there are no early intervention
services in Australia.
|
61.
|
SCOLT
+ Simon
|
|
62.
|
Wymondham drone 00:17:58:09
|
Back
at the Norfolk online child protection unit, officers have just made an
arrest.
|
63.
|
UK WYMONDHAM
Live briefing – with de identified officers
Actuality of briefing
|
48:45
FEMALE OFFICER:
Today we executed a warrant, um, suspect is
upstairs in his bedroom. Um, Sharee, you arrested him on suspicion of
distribution of indecent images.
(00:50:19):
FEMALE OFFICER:
He doesn't have access to any children and he doesn't have any children himself. He
doesn't babysit anyone. So we don't have any
safeguarding concerns for any children.
|
64.
|
Evan chatting to female cop in office
|
Leader 2 (01:02:09):
He has been uploading indecent images and
videos to the internet via what we call cloud storage services -
things like Dropbox and Mega, like iCloud; they're all storage
accounts.
|
65.
|
Shots of Drop box website
GFX – to show CSAM
|
Because
these are foreign tech companies, UK police can’t force them to hand over the
files.
When
it comes to websites and social media platforms, many companies say it’s too
much data for them to know that CSAM is even there.
|
66.
|
Simon and Evan walking in hallway
UK NORWICH F5
|
Chief Constable Simon Bailey doesn’t buy
that. He believes they need to build better systems to identify online abuse.
Upsot: 00:43:25:06 you shouldn’t be able to host a site that hosts that
material and say there’s nothing we can do about it
Simon: You have a responsibility to police
what’s being facilitated. That cannot be the responsibility of law
enforcement staff.
He’s
been instrumental in helping create proposed new laws to make the UK one of
the safest countries online.
|
67.
|
Multi
cam interview
Tech
company overlay
|
Simon (15:42):
Companies need to understand more so than
ever, ever before that they have a social and a moral responsibility to
protect children and to prevent the abuse of children. And the bottom line is
that they are not doing enough.
Simon (25:18):
The companies that we are talking about
have become embedded into our lives to the point where, actually, we're almost
prepared just to turn a blind eye, turn around and say, well, that cop who
talks about these numbers, well, that just can't be right. He's just
exaggerating. He's getting all emotional and emotive around it. Well, no, I'm
not. I'm talking about cold, hard facts. And I'm talking about what I am
seeing, and my colleagues are having to see and view. I'm talking about
officers that are viewing the rape and abuse of a child, and then a week
later going into somebody's house and seeing that child, and seeing the room
where, the week before, they saw that child being raped. Now that's tough, and it's
happening. And we all need to reflect on the permissive nature of what we,
by supporting these companies, are, I think, allowing to take place.
|
68.
|
|
Evan
(29:26):
So no matter how tough the laws might become in the UK, does anything
change until there's a global response?
Simon (29:39):
I would like to think that what would
happen is, if we get it right here, then other countries around the world,
that will be equally as horrified as we are, will take the view: well, if
they can do it in the UK, then we can do it too. And I would like to think there
would be a domino effect.
Simon (30:16):
I think making sure the regulator has the, has
the power, has the teeth, has the compellable powers that will say to
companies: right, we are now going to hit you with a huge fine, or some form
of punishment. Then they'll actually sit up and take notice, because the voluntary
code of practice - a voluntary code of practice simply has not worked. It's
failed.
|
69.
|
Tech overlay
|
So forcing tech companies to take down or close down this material under
threat of criminal sanction is the next step in the global fight against
child sexual abuse online.
|
70.
|
SIMON BAILEY
|
10:50 you make it harder for people to
consume it and you make the deterrence along the way that great, then
naturally you will then start to see a reduction in it.
|
71.
|
Netherlands drone
Dutch parliament
|
The UK hosts less than 1% of the world’s child abuse material.
According
to the IWF -
the Netherlands hosts a staggering 71%, making it Europe’s biggest hoster.
And
now, pressure is mounting on the Dutch Government to follow the UK’s lead and
pull their internet industry into line.
|
72.
|
ARDA – Netherlands
|
|
73.
|
Arda Master IV
|
Arda
Gerkens is a Dutch senator and head of the
country’s peak CSAM reporting body.
UPSOT:
Evan Can
you hear me?
|
74.
|
Arda
|
(00:56:10):
EVAN: the IWF says that, 71% of the CSAM
material they found in 2019 was on Dutch servers.
ARDA: Yeah. Many people think that 71% of all CSAM material is found
on Dutch servers. That's not true. It's what they found. 16.1 million
reports were made last year to the United States. We've had 275,000
reports, including the ones from the IWF. So you can see
the vast majority is in the United States. Having said that, we're coming in
second. So we still have a big problem; I don't want
to deny that.
|
75.
|
Drone shot of Amsterdam
Fast cut establishing shots of NL
|
Super-fast connections have made the Netherlands Europe’s internet gateway to the
world.
But
that means along with business, all kinds of online criminality get funnelled through here too.
And
that includes a large number of websites hosting child sexual abuse material.
|
76.
|
ARDA Master IV
|
Evan: Why are they not stopping it?
ARDA (00:24:37):
Well, they are, they're trying, but it's
very hard. We do not hold internet service
providers responsible for what's on their networks, because you have millions
and millions of websites, so you cannot know. But
once you are notified, from that point on you are responsible and you
should act upon it.
ARDA (00:25:46):
And so far, so good. I think, uh, in about 90% of the cases reported by the
hotlines, internet service providers act, and they even act within 24 hours
or 48 hours; they take it down very swiftly.
|
77.
|
GFX
4
Text: 99% of Child Sexual Abuse Material
Logos: Nforce, IP
Volume, leaseweb, KnownSRV
TEXT: NForce Logo
+ 90% 174,652 child abuse reports
|
A
recent report by the Dutch
Government
shows 99% of the child abuse imagery hosted in the country is spread between
just 4 companies.
It
says almost all of that is hosted by just one company - Nforce
- which received over 174,000 reports of CSAM from monitoring bodies in the first
half of 2020.
|
78.
|
GFX
5
TEXT: Take Down Time
TEXT: Nforce took
11 days
TEXT: IP Volume 21 days
TEXT: Source: Canadian Centre for Child protection
|
Over
the past 2 years Nforce took, on average, 11 days
to remove reported material.
Whereas
another hosting company - IP
Volume - took 21 days.
|
|
Shots of IP volume
|
The companies say they need to determine
whether it is actually illegal material.
|
79.
|
|
ARDA
(00:29:30):
They would say the only one who
can decide whether this is child sexual abuse material or not is a judge. And
legally they're absolutely right.
|
|
GFX
|
In a statement to Dateline, IP
Volume said
they refuse to work with Dutch CSAM reporting
bodies because they get too many “false complaints”,
and that recently they have developed their own
methods to remove CSAM quickly… and it’s working.
Nforce told us
“The company will continue to and always has, cooperated with the
government especially in the matters of fighting CSAM”.
They say they are looking at ways to remove
material faster.
|
80.
|
NL 2 @W 00:00:00:23
NL2 @ 00:04:11:03
(Dutch parliament)
|
But that’s not enough for the Dutch
government, which is considering new laws to fine companies 4% of their
annual income if they fail to remove child abuse content within 24 hours of
it being reported.
|
81.
|
Arda
GVs
NL2 @ 00:33:20:18
Date one - this is a rogue hoster
|
ARDA (00:35:08):
That means that there's going to be a
legislative body who will oversee whether internet hosting providers are
doing enough to prevent sexual abuse material being uploaded or hosted on
their servers. And if they're not doing enough, they can be fined by, uh, by
this authority.
ARDA (01:10:36): We've come to the
point where we can no longer say, even as an internet hosting provider, I'm
completely not responsible for anything
happening on my service. Either we do that with legislation, and
where they don't do it, we should force them to do it.
|
82.
|
|
AD BREAK
|
83.
|
RHIANNON
|
|
84.
|
Drone UK day 1
Day 1 @ 01:26:20:19
|
It’s
been a decade since
Rhiannon was abused by a digital predator.
She’s still battling the trauma.
|
85.
|
Rhiannon walking to stairs - 02:00:12:02
Day 1 @ 01:33:37:16
Rhiannon sitting on stairs texting
|
Rhiannon (00:50:38):
Hmm it's um, it's changed who I am really.
When I look back now, it's like a dark cloud over my entire life for those 10
years. You know, it touched every part of my life. Um, I had panic attacks on
a daily basis.
Rhiannon (00:51:49):
I had depression and I, I used to self-harm. Um, I took an overdose and tried to kill
myself twice. Um, you know, I had real anger issues.
I was just angry that this had happened to me, and it never should have
happened.
|
86.
|
Day 1 @ 01:33:37:16
Rhiannon sitting on stairs texting
|
In
an age when children are uploading their lives, Rhiannon warns that parents
MUST engage with what their children are doing online.
And
there’s something else, much more challenging.
|
87.
|
|
Rhiannon (00:53:16):
these perpetrators are adults, they're
grownups who are skilled manipulators, who can get people to do things that,
you know, without them even realizing that that's what they're doing, that
they're being tricked into doing something. And, you know, so young people
really need to know that they're not to blame. If I'd had that message
when I was younger, maybe I would have felt more able to speak out about it.
I think it's really important.
|
88.
|
Rhiannon walking away
|
There’s an urgency to her message.
…
it’s estimated that up to 85% of children abused online are never identified
by authorities.
|
89.
|
HELPLINES BOARD
Lifeline
Kids helpline
Marie Collins foundation
|
|
90.
|
TEXT: If this has raised any XXX for
viewers – please contact life line.
https://www.lifeline.org.au/
PH: 13 11 14
|
|