Are you surprised?

Sweden. Facebook servers

Music

00:10

 

PETER GRESTE, REPORTER: In 2014, Facebook scientists published the results of one of the biggest psychological experiments ever conducted.

00:17

Time lapse busy street with GFX o/lay data graphics

 

They took almost 700,000 Facebook profiles and deliberately skewed their news feeds to be either more positive or more negative.  Then, they used the company's sophisticated algorithms to measure any shift in people's mood. Their findings?

00:24

People looking at phones

The more positive the feed, the happier Facebook users seemed to be.  The more negative, the more depressed they became. It proved the power of Facebook to affect what we think and how we feel.

00:49

Dreyfus 100%. Super:
Dr Suelette Dreyfus
Information Systems expert, University of Melbourne

DR SUELETTE DREYFUS: Facebook has very cleverly figured out how to wrap itself around our lives. It's the family album.  It's your messaging to your friends. It's your daily diary. It's your contact list. It's all of these things wrapped around your life.

01:02

Aerial FB headquarters from Friends Day Community Event/FB workers

 

01:16

 

PETER GRESTE: This is the story of how one of the world's biggest and most powerful private corporations is turning our lives and our data into vast profits, and in ways we have no control over.

01:19

Herrman 100%. Super:
John Herrman
Journalist, New York Times

JOHN HERRMAN: They are the most successful company, arguably, in human history at just gathering people's time and turning that time into money.

01:33

Still. Zuckerberg at computer

PETER GRESTE: Like his company, Facebook's founder hardly needs introducing. Mark Zuckerberg started the social media platform 13 years ago, when he was just 19, as a site for Harvard undergraduate students.

01:44

Archival. Zuckerberg aged 19

MARK ZUCKERBERG: When we first launched, we were hoping for maybe 400, 500 people. Harvard didn't have a Facebook, so that was the gap we were trying to fill, and now we're at 100,000 people, so who knows where we're going next.

01:59

Harvard University buildings

Music

02:11

 

PETER GRESTE: Within its first month, more than half the students had joined, setting the trend for the membership explosion that followed.

02:14

 

Now, almost a quarter of the world's population has signed on. It is bigger than any country.

02:21

Zuckerberg with pope and world leaders

Facebook is now a global colossus. It is one of the world's most valuable corporations, worth over 400 billion dollars. Mark Zuckerberg is an international powerbroker in his own right.

02:27

 

REBECCA MACKINNON: He's like a king, right. He's like a monarch. He's making decisions about your life on Facebook, what the rules are, and he's a benevolent dictator.

02:49

MacKinnon 100%. Super:
Rebecca MacKinnon
Internet Privacy Advocate, New America Foundation

You can't say this is accountable governance or participatory governance in any particular way.

03:01

Time lapse busy street with GFX o/lay data graphics

 

PETER GRESTE: But almost two billion users still isn't enough for Facebook. Mark Zuckerberg is aiming for the next billion.

03:09

Time lapse girl on phone on street

There is a limit to how much they can grow in established markets like North America and Australia, but the 32-year-old businessman sees huge potential in the developing world.

03:18

Zuckerberg address

MARK ZUCKERBERG: There are a billion people in India who do not have access to the internet yet, and if what you care about is connecting everyone in the world, then you can't do that when there are so many people who don't even have access to basic connectivity.

03:33

Srinivasan 100%. Super:
Prof. Ramesh Srinivasan
Technology, Politics and Society, UCLA

PROF. RAMESH SRINIVASAN: There's a term that's being used by folks connected to Facebook and Google called 'the last billion', where they're basically trying to figure out a way to spread internet access. But the internet that they're going to spread is an internet that's shaped by Facebook and Facebook's agendas. That's actually part of the long game here.

03:48

MacKinnon 100%

REBECCA MACKINNON: Most people in a lot of the developing world are accessing the internet through their mobile phones, and there are these programs known as zero rating, or Facebook Zero: when you get a smartphone you get free data if you're using Facebook, and so people stay on Facebook.

04:08

 

They don't go anywhere else, and so their whole world on the internet becomes Facebook; they don't know any other kind of internet.

04:30

GFX:  Start on $ signs and pull out to reveal Zuckerberg

Music

04:46

 

PETER GRESTE: Facebook is a free service, but that's because Zuckerberg has learned how to turn our data into dollars. Lots of dollars. Last year his company earned 27 and a half billion US dollars - just under 16 dollars for each user - and he's buying even more internet real estate.

04:49

File footage. Zuckerberg interview

INTERVIEWER: Clearly there's one topic we have to start with. You bought WhatsApp for 19 billion dollars. Why did you do it and what does it mean?

05:09

 

MARK ZUCKERBERG: You know, you can look at the messaging apps that are out there, whether it's Kakao or Line or WeChat, that are already monetizing at a rate of two to three dollars per person with pretty early efforts, and I think that shows that if we can do a pretty good job of helping WhatsApp to grow, it is just going to be a huge business.

05:17

MacKinnon 100%

PETER GRESTE: Facebook has WhatsApp, Facebook has Instagram. Facebook has Oculus Rift, not necessarily mainstream, but these are very big corners of the internet. Should we be concerned about the monopoly?

 

05:34

Super:
Rebecca MacKinnon
Internet Privacy Advocate, New America Foundation

REBECCA MACKINNON: We should always be concerned about monopoly. We should always be concerned about concentration of power. We should always be concerned about that, and we need to hold their feet to the fire at all times.

05:48

FB Friends Day Community Event

MARK ZUCKERBERG: Facebook is all about community, right. It’s all about what people all around the world are coming together to do - connecting with friends and family, informing these communities.

06:02

File footage. Facebook Year in Review 2016

PETER GRESTE: Facebook presents itself as a digital platform - a neutral stage upon which life plays out. It says it is a company that develops digital technology, not social engineering.

06:12

 

For all the talk about community, Facebook is neither democratic nor transparent.

06:29

Srinivasan 100%

PROF. RAMESH SRINIVASAN: Any place we go to that is not truly open, that's not governed by us as users, that's not governed by some sort of democratic accountability, is actually a place that is not truly ours. It's a place that we can use, it provides great value in many ways, don't get me wrong, to its users. But it is incorrect to see it as a neutral place.

06:37

Herrman 100%. Super:
John Herrman
Journalist, New York Times

JOHN HERRMAN: It can do things like a government, and indeed it has sort of inherited some government-like functions, but I don't think it passes the smell test to imagine that Facebook or any online platform is truly democratic. They're not.

07:01

Dreyfus writing on white board, demonstrating algorithms

 

07:16

 

DR SUELETTE DREYFUS: If we tell the computer to look at two numbers and compare them and put the larger number on one side and the smaller one on the other, then, with a series of steps we will be able to reorder it.

07:22

 

PETER GRESTE: To understand how Facebook works, we need to understand what goes on under the hood. The engine that drives the system is built on algorithms - sets of instructions that Facebook's engineers use to determine what we see in our News Feed. Dr Suelette Dreyfus, an information systems expert, demonstrates how a basic algorithm works.
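The comparison step Dreyfus sketches on the whiteboard - look at two numbers, put the larger on one side - is the building block of simple sorting algorithms. A minimal illustration in Python (our own sketch, not Facebook's code):

```python
# Repeatedly compare pairs of numbers and move the larger one
# to the right, until the whole list is reordered (bubble sort).
def simple_sort(numbers):
    items = list(numbers)  # work on a copy
    for end in range(len(items) - 1, 0, -1):
        for i in range(end):
            if items[i] > items[i + 1]:
                # larger number to one side, smaller to the other
                items[i], items[i + 1] = items[i + 1], items[i]
    return items

print(simple_sort([5, 1, 4, 2]))  # → [1, 2, 4, 5]
```

Facebook's News Feed algorithms do something analogous at vastly greater scale: ranking thousands of candidate posts against signals about each user.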

07:34

Dreyfus 100%

DR SUELETTE DREYFUS: Typically, an algorithm might be for processing some data or doing some arithmetic, summing something for example, or it might be to try and recreate the decision-making process that we use in our human brain on a more sophisticated level.

07:56

Time lapse busy street with GFX o/lay data graphics

Music

08:13

 

PETER GRESTE: Facebook's algorithms were originally configured to help Harvard University students stay in touch with one another. They exploited the way the students had a small group of close friends, and a wider, looser social circle. The algorithms are now vastly more complex, but exactly how they work is a closely guarded commercial secret. We do know that they are designed with one aim in mind - to keep us online for as long as possible.

08:17

Dreyfus 100%. Super:
Dr Suelette Dreyfus
Information Systems expert, University of Melbourne

DR SUELETTE DREYFUS: The algorithms are designed to be helpful and give us information that's relevant to us, but don't for a minute assume that the algorithms are just there to help us. The algorithms are there to make a profit for Facebook.

08:46

GFX: Facebook video

PETER GRESTE: And that is Facebook's genius. It is a giant advertising agency that uses its platform to deliver ads to us.

09:00

Time lapse busy street with GFX o/lay data graphics

Music

09:10

 

PETER GRESTE: By tracking what we do, who we associate with and what websites we look at, Facebook is able to make sophisticated judgements not only about the stories we see, but also about the advertising that is likely to move us to spend.

09:16

Greste to camera at Times Square. Super:
Peter Greste

We will probably always live in a world with old-fashioned display ads. Times Square simply wouldn't be the same without them. But these ads nudge us towards products with all the subtlety of a punch in the nose. Facebook, on the other hand, uses the extraordinary amounts of data that it gathers on each and every one of us to help advertisers reach us with a precision that we've never known before. And it gives anybody in the business of persuasion power that is unprecedented.

 

 

 

09:34

People on mobile phones

Depending on what we post at any given moment, Facebook can figure out what we are doing and thinking, and exploit that.

ADAM HELFGOTT: Facebook's very well aware of, you know, our sentiment, our mood, how to talk to people and it can put all that data together and start to understand like who our exes are

10:00

Helfgott 100%. Super:
Adam Helfgott
MadHive Digital Marketing

and who our friends are and who our old friends are, who our new friends are, and that's how it really works to incentivise another post.

10:18

Greste with Helfgott at computer

PETER GRESTE: What you're saying is Facebook has the capacity to understand our moods?

ADAM HELFGOTT: Yes.

PETER GRESTE: Could that be used to influence our buying behaviours?

ADAM HELFGOTT: Of course it can be used to influence our behaviour in general, not just buying.

10:26

Brownlow 100%

MEGAN BROWNLOW: You can be incredibly hyper-targeted. Can I give you an example?

10:39

Super:
Megan Brownlow
Media Strategist

We don't always act our age or according to our gender stereotypes. A middle-aged woman might like rap music. She is sick of getting ads for gardening gloves and weight loss. So she posts on her Facebook that she likes Seth Sentry's Waitress Song. Now she gets ads for a streaming music service, something she might actually buy.

10:45

Greste and Helfgott

PETER GRESTE: Adam Helfgott runs a digital marketing company in New York. He uses a tool called Facebook Pixel. Facebook gives it to advertisers to embed in their sites. They can track anybody who visits their site and target them with ads on Facebook.

11:03

 

ADAM HELFGOTT: Well, if you've ever logged into Facebook with any of your browsers, there's a good chance it'll know it's you. You don't have to be logged in, you just have to have been there at some point in time. And if it's a brand new computer and you've never logged into Facebook, Facebook at that moment in time won't know it's you, but based upon their algorithms and your usage, they'll figure it out.

11:22

 

PETER GRESTE: So, what you can then do is put this piece of script onto your website. And then use Facebook data to find the people that looked at your website and then target ads to them.

ADAM HELFGOTT: That's correct.

PETER GRESTE: Through Facebook.

ADAM HELFGOTT: Yep.
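The flow Greste describes - a snippet embedded on an advertiser's website, visits matched back to profiles, ads targeted accordingly - can be illustrated conceptually. This is a hypothetical Python sketch of the general retargeting mechanism; the class and method names are our own, not Facebook's actual Pixel code or API:

```python
# Hypothetical sketch of pixel-style retargeting (illustrative only).
class AdPlatform:
    def __init__(self):
        self.visits = {}  # browser_id -> set of advertiser sites visited

    def record_pixel_hit(self, browser_id, advertiser_site):
        # Fired by the snippet embedded in the advertiser's pages.
        self.visits.setdefault(browser_id, set()).add(advertiser_site)

    def ads_for(self, browser_id):
        # When the same browser later loads the platform, show ads
        # from every advertiser whose site that browser visited.
        return sorted(self.visits.get(browser_id, set()))

platform = AdPlatform()
platform.record_pixel_hit("cookie-123", "shoes.example")
platform.record_pixel_hit("cookie-123", "travel.example")
print(platform.ads_for("cookie-123"))  # → ['shoes.example', 'travel.example']
```

The key design point is the stable browser identifier: as long as the same cookie or fingerprint shows up on both the advertiser's site and the platform, the two visits can be linked, with or without a login.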

11:46

 

PETER GRESTE: That feels a little bit creepy. I mean, are there privacy issues involved with that?

ADAM HELFGOTT: I mean, from a legal point of view there's no privacy issue. That's just, you know, the internet today and the state of it, and using a product that generates a lot of revenue for Facebook.

12:04

 

Music

12:21

Brownlow at PricewaterhouseCoopers

PETER GRESTE: For advertisers it is a boon, giving them access to the most intimate details of our lives. Megan Brownlow is a media strategist for PricewaterhouseCoopers in Sydney.

12:25

Brownlow 100%

MEGAN BROWNLOW: When you change your status, for example, we might see something: a young woman changes her status to engaged. Suddenly she gets ads for bridal services. These sorts of things are clues about what her interests might really be. The research from consumers is they don't like advertising if it's not relevant to them. If it actually is something that they want, they don't mind it so much. So this is actually not a bad thing.

12:38

Cubrilovic in office at computer

PETER GRESTE: Nik Cubrilovic is a former hacker turned security consultant who has been using his skills to investigate the way our data is tracked. One day Cubrilovic made a discovery that startled the tech world. He found that even if you're not logged on to Facebook - even if you're not a member - the company tracks and stores a huge amount of your browsing history. And you can't opt out.

PETER GRESTE: If you don't like Facebook, if you don't like the kinds of things you're describing,

 

 

 

13:10

Cubrilovic 100%

why not just close your account?

NIK CUBRILOVIC: It's very difficult to opt out of Facebook's reach on the web. Even if you close your account, even if you log out of all of your services, the way that they're set up, with their sharing buttons and so forth, they're still going to build a profile for you. It's just not going to have the same level of information associated with it.

13:47

MacKinnon 100%. Super:
Rebecca MacKinnon
Internet Privacy Advocate, New America Foundation

REBECCA MACKINNON: They don't even tell us clearly what they're doing. They tell us some things, but it's not specific enough to really answer the question: if somebody was going to build a dossier on me based on what Facebook knows about me, what would it look like? I should be able to know that, so that I can make informed decisions about how I'm going to use the platform.

14:08

New York GVs

Music

14:35

 

PETER GRESTE: Facebook is not just influencing what we buy. It's changing the world we live in.

14:41

 

Music

14:47

MacKinnon 100%

REBECCA MACKINNON: Sure, they want to kind of bring their service to everybody on the planet; from a commercial standpoint that's obviously a goal. Whether it makes the world a better place is another question.

 

14:54

2011.

JOHN HERRMAN: Not only have you built this big business and this big social network, you now are determining the course of, you know, you are possibly determining the course of world events.

15:08

File footage. Cairo protests

PETER GRESTE: That's exactly what happened in the streets of Cairo. In January 2011, millions gathered in the city demanding the resignation of the autocrat Hosni Mubarak. It became known as the Facebook revolution. The organisers used Facebook to rally vast crowds of protesters. They were so effective that the government tried to shut down the internet. It took just 18 days to topple Mubarak.

PROF. RAMESH SRINIVASAN: What Facebook came to stand for - for several months, I would say, or at least in the early days after the events of Tahrir Square

15:19

Srinivasan 100%. Super:
Prof. Ramesh Srinivasan
Technology, Politics and Society, UCLA

and the Arab Spring, was a symbol of people's ability to organise and express and share information more widely. It symbolised that so much so that I like to tell stories about how I could buy T-shirts in Tahrir Square which said Facebook, Tool of Revolution.

16:16

Greste to camera on street

PETER GRESTE: I understand as well as anybody just how effective Facebook can be.

16:35

File footage. Greste Egypt trial

Three years ago, I was imprisoned in Egypt on trumped-up terrorism charges. My family used Facebook as a way of organising supporters and keeping them informed. It became one of the most vital tools in the campaign that

16:40

Greste at airport after release

ultimately got me out of prison.

16:53

‘Freedom for Peter Greste’ Facebook page

The Facebook page became a place anybody could find the latest on our case. The underlying algorithms helped push it to people who might have been interested in supporting our cause, even before they knew it existed.

17:00

Greste greets Soltan on street

 

17:16

Greste and Soltan in cafe

Mohamed Soltan was also arrested for protesting. He was imprisoned in the same gaol as me.

17:33

Soltan 100%

MOHAMED SOLTAN, FORMER POLITICAL PRISONER: There was this medium where people just wanted to express themselves, because there was no other avenue in the public space to express themselves. And then they found this outlet, an outlet to the outside world as well, where they would put how they feel about social justice issues, about just day-to-day inequalities that they've seen. And then there was the second phase of that, where they saw that there's quite a few of them that feel the same way about a lot of different things.

17:40

Soltan trial

PETER GRESTE: It took a prolonged court case, a 500-day hunger strike and intense international pressure to get him released. For him too, Facebook became an invaluable tool.

MOHAMED SOLTAN: Facebook, unlike other platforms

18:10

Soltan 100%. Super:
Mohamed Soltan
Former political prisoner

and social media outlets, it allowed for us to put out the reports, the medical reports. It allowed for us to share our- my family, to share stories and it established credibility.

18:32

Herrman 100%

JOHN HERRMAN: So Facebook provides this place that is almost ideal for finding like-minded people, whether that means finding people who live in a certain place, who are interested in a certain thing, or people who are in the thrall of a dangerous ideology.

18:46

United Patriots Front Rally

PETER GRESTE: In Australia, Facebook has also become a powerful political tool for mainstream causes, and groups on the fringe.

19:04

Cottrell addresses rally

BLAIR COTTRELL: They don't want to represent you. They want to represent the great global international agenda which corrupts our people from the inside, builds mosques in our communities, floods our beautiful country with third world immigrants. But is that what we want?

PETER GRESTE: Blair Cottrell leads a group called the United Patriots Front - a right-wing movement built on Facebook, that campaigns against Muslims and immigrants across Australia.

19:17

Cottrell 100%

BLAIR COTTRELL: Facebook's been extremely effective for us. It’s indispensable to the development of our organisation. Without it, we would probably be a separatist cult, where no one would be able to relate to us, because no one would actually hear us directly, they would only hear about us through established media corporations.

19:47

Cottrell addresses rally

COTTRELL [addressing rally]: Islam can only pose a threat to our nation if our weak leadership, or rather complete lack of leadership, is allowed to continue.

20:08

 

PETER GRESTE: Facebook has helped turn a disparate group of individuals into a political force that some say is dangerous.

20:18

Cottrell 100%. Super:
Blair Cottrell
United Patriots Front

BLAIR COTTRELL: It gives us the ability to cut out the middleman, to go directly to the people, to the audience with our message, to speak directly to the Australian people, which is a power that hitherto has only been held by established media corporations and anybody who speaks through such media corporations. But now anybody has that power. Anybody has access to that power.

20:27

Beheading video on United Patriots Front Facebook page. Text:
These anti-Islam campaigners are staging a mock beheading to protest their city’s plan to build its first mosque.
We’re just going to give you a bit of a taste of our own religious culture.

Allahu akbar! Allahu akbar!

 

 

 

 

 

PETER GRESTE: Some of the UPF's more inflammatory statements have been censored. Cottrell has been prosecuted for staging a mock beheading that the group filmed and posted on Facebook. Facebook removed some of their posts, including the original beheading video, and threatened to suspend the page.

20:51

Cottrell 100%

BLAIR COTTRELL: Sometimes Facebook has removed or censored certain posts of ours because we've used the word Muslim, for example. Not in a negative way at all. If we've explained an incident or a point of view and we've used the word Muslim, sometimes that registers in Facebook's computer and they automatically delete it, for some reason. I don't know whether it's a person who deletes it or a computer system, but that can be a bit problematic sometimes. We actually started altering the way we spelt Muslim in order to have our posts remain up when we were speaking about the Muslim people of the Islamic faith.

21:10

Montage of UPF

PETER GRESTE: Facebook has been criticised for the way it censors controversial posts. Whenever someone flags a post as offensive, it gets sent to a human moderator who decides if it should be taken down. The company says it reviews a hundred million pieces of content every month.
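The flag-then-review pipeline described above can be sketched in a few lines. This is a hypothetical illustration of the general workflow, not Facebook's actual moderation system:

```python
from collections import deque

# Hypothetical flag-and-review moderation pipeline (illustrative only).
class ModerationQueue:
    def __init__(self):
        self.pending = deque()  # posts flagged by users, awaiting review
        self.removed = set()

    def flag(self, post_id):
        # Any user can flag a post; it joins the human review queue.
        self.pending.append(post_id)

    def review(self, violates_standards):
        # A human moderator decides whether the post comes down.
        post_id = self.pending.popleft()
        if violates_standards(post_id):
            self.removed.add(post_id)
        return post_id

queue = ModerationQueue()
queue.flag("post-42")
queue.review(lambda pid: True)     # moderator rules it a violation
print("post-42" in queue.removed)  # → True
```

At a hundred million pieces of content a month, the pressure point is the `review` step: each decision gets only seconds of a human's attention.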

21:51

MacKinnon 100%. Super:
Rebecca MacKinnon
Internet Privacy Advocate, New America Foundation

REBECCA MACKINNON: People are under a lot of pressure to review a great deal of content very quickly. And I certainly hear about what appear to be mistakes quite frequently, and some of them are kind of ridiculous. Like at the end of 2015, a bunch of women named Isis had their accounts deactivated, because clearly somebody went and flagged them as being terrorists.

 

22:12

Greste’s Facebook page

PETER GRESTE: After an Egyptian court convicted me of terrorism charges, my own Facebook page was suspended. We were never told why it disappeared. Facebook says it was to protect my privacy. We believed I'd been labelled a terrorist, violating what Zuckerberg calls its community standards.

22:42

Srinivasan 100%. Super:
Prof. Ramesh Srinivasan
Technology, Politics and Society, UCLA

PROF. RAMESH SRINIVASAN: You can't have a common standard for 1.8 billion people. Our diversity is actually our strength, right. Part of what makes us a global community is the reality that what forms a global community are our incredibly fundamental differences.

23:07

Vietnam war photo. Naked girl fleeing

PETER GRESTE: In one infamous example, Facebook removed a post showing one of the most powerful images of the Vietnam war. The photograph of a naked girl violated its community standards.

23:22

MacKinnon 100%

REBECCA MACKINNON: The community standards are developed by his staff. The community didn't develop those standards. They're called community standards, but they were developed by Facebook. And yes, they've had input here and there over time. They also get input from governments - recently a number of governments told them, you need to amend your community standards to be harder on extremist content, you know. And so they amended their community standards. It's not like the community got together and developed these standards.

23:35

View of snowy town from train

Music

24:10

 

PETER GRESTE: Facebook is also transforming politics as we know it.

24:15

Greste to  camera in train

Politicians have used social media for years of course, but in this last election campaigners used big data to radically transform American politics. In the words of some observers, they weaponised the data. We're on our way to Washington DC to find out what impact Facebook has had on the fight for political power.

24:19

Time lapse Washington. Night

PETER GRESTE:  At the heart of political power is information. That's why government security agencies go to extraordinary lengths to vacuum up data. But increasingly it is also becoming the key to winning power.

24:43

Schrader 100%. Super:
Adam Schrader
Former Facebook journalist

ADAM SCHRADER, FORMER FACEBOOK JOURNALIST: I think that there's, you know, a legitimate argument to be made that Facebook influenced the election - the United States election results.

24:58

 

I think that Facebook and algorithms are partially responsible, if not, you know, the main reason why there's this shift toward hyper-partisan belief systems these days.

25:08

Trump Campaign rally

 

 

 

 

 

25:21

 

PETER GRESTE: When Donald Trump became the presidential frontrunner in last year's US election, few pundits predicted that he'd actually win. One of the Trump campaign's secret weapons was an ability to research social media data in extraordinary detail. It helped him understand and target his voters with a precision we've never seen before.

25:29

Ruffini 100%. Super:
Patrick Ruffini
Republican Party strategist

PATRICK RUFFINI, REPUBLICAN PARTY STRATEGIST: By using Facebook's ad targeting engine, for example, they know if some of those Independent voters have actually liked Republican pages, or liked the Bernie Sanders page, or a Donald Trump page. So you can go to them, spend money to target advertising specifically to those voters, and it is ultimately a much more reliable form of targeting than many of the other online vehicles out there.

25:53

Greste with Ruffini in office

PETER GRESTE: Political strategist Patrick Ruffini runs a company that mines big data for the Republican Party. He produces social media maps that help them make sure their political messages hit their targets.

PATRICK RUFFINI: What it does give us is a much greater

 

 

 

 

26:21

Ruffini 100%

level of certainty and granularity and precision, down to the individual voter, down to the individual precinct, about how things are going to go. It used to be we could survey eight hundred, a thousand registered voters nationwide, but you couldn't really make projections from that about how an individual state would go, or how an individual voter would ultimately go.

26:37

Trump inauguration

DONALD TRUMP: I Donald John Trump do solemnly swear that I will faithfully execute the office of President of the United States.

27:01

 

PETER GRESTE: It is one thing to know your voters of course, and quite another to get them to change their minds. Facebook can help with that too.

27:17

Dreyfus 100%. Super:
Dr Suelette Dreyfus
Information Systems expert, University of Melbourne

DR SUELETTE DREYFUS: The ability to take the pools of big data that we've got and do really deep analysis of it to understand small groups of customers' preferences, can be applied in a political setting in a way that is potentially worrying, because it allows politicians to potentially lie better.

 

 

 

 

 

27:28

Cubrilovic 100%. Super:
Nik Cubrilovic
IT security consultant

NIK CUBRILOVIC: For instance, you can't make a political statement on television without it being disclosed who paid for it. Those same controls on the web are very lax. For instance, I could see a story saying a certain XYZ politician has done a great thing, produced on a completely different third-party news site. And I cannot know that that ad was placed by a political operation that has specifically targeted me, because that information is not disclosed anywhere.

27:51

Srinivasan 100%. Super:
Prof. Ramesh Srinivasan
Technology, Politics and Society, UCLA

PROF. RAMESH SRINIVASAN: I am understanding of, yet troubled by, the data-driven advertising and targeted ads that occur. But I'm even more uncomfortable with the reality that our elections, and how our elections are structured and configured, can be hijacked by these forces that are not transparent to us.

28:17

Facebook pages on  laptop

PETER GRESTE: One of the most important parts of any democracy is news - almost half of all Americans get theirs from Facebook. But the last US election also saw an explosion in fake news, turbocharged by sharing on Facebook.

28:41

Herrman 100%. Super:
John Herrman
Journalist, New York Times

JOHN HERRMAN: These things look like news, they function like news, they're shared like news, they don't match up with traditional ideas of what news is for and what it should do. Facebook is in the middle of this. They are the company that can see all of this and make judgements about it. I think they would prefer not to have to do that.

28:59

Schrader working on computer

PETER GRESTE: Adam Schrader is a journalist who used to edit stories for Facebook's Trending News section. Part of his job was to filter out fake news.

29:21

Schrader 100%. Super:
Adam Schrader
Former Facebook journalist

ADAM SCHRADER: Essentially, we operated like a newsroom. It was structured like a newsroom: copy editors would make sure that the topics met standards, make sure that they were unbiased, check facts.

29:34

 

There were often times that fake articles would appear and present themselves as possibly being a legitimate trending topic. Our job was, you know, identifying those, and the original term was blacklisting.

29:47

Reflection of Facebook page in spectacles

PETER GRESTE: In the heat of the campaign, right-wing commentators accused the team of bias. Facebook sacked it and handed the job to an algorithm.

30:02

Schrader 100%

ADAM SCHRADER: An algorithm cannot do the job of a trained journalist.  They don't have the ability to reason, right. Artificial intelligence hasn't gotten to the point where it can, you know, really function like a human brain and determine what has news value and what is good for the public and what is not.

30:13

 

PETER GRESTE: Schrader says after the team was sacked, fake news really took off.

30:32

 

ADAM SCHRADER: Yeah, after the Trending News team was let go, there was a big problem with sensational, or factually incorrect, or misleading news sources and trending topics.  It was just a disaster.

30:37

Brownlow 100%. Super:
Megan Brownlow
Media strategist

MEGAN BROWNLOW: The more partisan news sources you consume, the less likely you are to believe fact checkers or experts. And so, this can create some really dangerous divisive believers of alternative facts.

30:55

‘Pizza Gate Murder’ story screens

PETER GRESTE: During last year's election campaign, a news site published a story alleging that Hillary Clinton and her campaign chairman John Podesta were running a child sex ring out of the Comet Ping Pong Pizza restaurant in Washington DC.

31:11

 

The story was widely shared on Facebook. It was utterly fake, but it gained so much traction that a 28-year-old man finally went to the restaurant armed with an assault rifle, a revolver and a shotgun to find and rescue the children.

31:29

Archival footage. Comet Pizza

He fired several shots into the restaurant before police arrested him. The story still circulates online and on Facebook.

31:52

Schrader 100%

ADAM SCHRADER: I don't think that I trust the general public's ability to identify fake news, real news, you know, anything like that.

32:01

‘Fake News’ stories on Facebook

PETER GRESTE: One study found that in the closing months of the US election, the top 20 fake news stories were shared on Facebook over a million more times than the top 20 stories from major news outlets. Fake stories are often written either for political advantage, or to make money.

 

32:16

Dreyfus 100%. Super:
Dr Suelette Dreyfus
Information Systems expert, University of Melbourne

DR SUELETTE DREYFUS: There are a lot of people out there who aren't journalists and who aren't publishers who are publishing. They don't have the same sense of obligation, so we are really in uncharted territory.

32:36

 

I think one of the most important things is that we actually need a big public debate about this because it’s changed the nature of news, and in doing so it's changed our reality as a society.

32:47

Facebook ‘How to report fake news’ video

 

32:59

 

PETER GRESTE: Mark Zuckerberg initially dismissed the notion that fake news somehow skewed the election, but he is rolling out a system that allows the Facebook community to flag suspect stories.

33:06

Schrader 100%. Super:
Adam Schrader
Former Facebook journalist

ADAM SCHRADER: Mark Zuckerberg has said that, you know, he's not in the news publication business right like it's- they're not a media company. But I think that's a mistake, a kind of a denial, right. So, they're definitely a media company and I think that they should try and treat themselves more as one in the future.

33:17

New York newsstands

Music

33:32

 

PETER GRESTE: As more and more consumers get their news from digital sources, and Facebook in particular, the old-fashioned world of newspapers and TV stations is collapsing.

33:36

NYT exterior

Like most news businesses, the New York Times is struggling. Facebook sends plenty of readers to their stories, but most advertising dollars go to Facebook.

33:46

NYT newsroom

Newsrooms are shrinking along with the resources for serious journalism.

33:58

Herrman 100%. Super:
John Herrman
Journalist, New York Times

JOHN HERRMAN: You'll hear this from small publications and large ones that their absolute audience is larger than it's ever been, and that surely has to mean something, but it certainly hasn't meant profits.

34:03

 

I don't think there's a major news company in the world that hasn't changed its operations around Facebook in a real way, and I mean that both in the way that it produces stories and approaches stories and the way that it makes money.

34:16

People in Social Network graphic

 

PETER GRESTE: Facebook plays an increasingly important role in how we understand the world, and its social mood study showed it also affects how we feel about it. When its researchers explained how they had manipulated the news feeds of some 700,000 users, they were criticised for playing with people's psychology without telling them.

34:31

Time lapse subway with GFX o/lay data graphics

 

And yet its algorithms do that every day. They give us stories they know we want to see, to keep us online, and help advertisers send us more ads.

 

 

34:54

Kross 100%. Super:
Dr Ethan Kross
Psychology Dept.,
University of Michigan

DR ETHAN KROSS: I don't think we can treat Facebook as benign. I think it has enormous implications for how we experience our lives, and how we live our lives, and I think that's the, simultaneously, that's one of the things that makes that network and others like it so phenomenally interesting and important.

35:05

Zuckerberg within Facebook network graphic

PETER GRESTE: But Mark Zuckerberg's plans would have Facebook do far more. He wants us to rely on the network for the way we organise society, discover the world and conduct politics. He wants the community to inform Facebook, while Facebook watches over us.

35:27

Zuckerberg on Facebook’s Social Good Forum

MARK ZUCKERBERG: The philosophy of everything we do at Facebook is that our community can teach us what we need to do and our job is to learn as quickly as we can and keep on getting better and better and that is especially true when it comes to helping to keep people safe.

35:46

Zuckerberg addressing crowd

PETER GRESTE: In a post on his own profile in February, he outlined his plan for the company - a kind of manifesto for the central role he wants the platform to play in societies around the world.

MARK ZUCKERBERG: We're also going to focus on building the infrastructure for community, for supporting us, for keeping us safe, for informing us, for civic engagement and for inclusion of everyone.

 

36:01

Herrman 100%

JOHN HERRMAN: So it's a document that really felt like an attempt to take some responsibility but it wasn't apologetic. It was fairly bold and it seems to suggest that the solution to Facebook's problems is more Facebook.

36:25

Zuckerberg’s Facebook page. On screen text:
“Progress needs humanity to come together as a global community.”

“Help fight terrorism”
“more diverse content”

PETER GRESTE: Zuckerberg has great ambitions. "Progress needs humanity to come together as a global community," he writes.

Facebook can help build it. It wants to 'help fight terrorism', while its news service can show us 'more diverse content'.

36:44

Zuckerberg with Sarkozy

JOHN HERRMAN: He is behaving recently in ways more befitting of a politician than a CEO.

37:04

Herrman 100%. Super:
John Herrman
Journalist, New York Times

There's a lot of speculation that he may run for office, and to my mind, if Facebook continues to be as successful as it is, not just through Facebook but through its other products, through Instagram and WhatsApp, if it continues to be so central in people's lives, he doesn't need to run for office. He will be presiding over a platform and a venue where people conduct a real portion of their lives.

37:12

Montage of people using smart phones

PETER GRESTE: It seems there is almost no limit to Facebook's intrusion into our lives, for better or for worse.

37:38

Zuckerberg on Facebook’s Social Good Forum

MARK ZUCKERBERG: I want to thank all of you for joining us to hear more about some of the things we're working on at Facebook to help keep our communities safe.

37:48

Zuckerberg addressing crowd

PETER GRESTE: Zuckerberg convened what he called the Social Good Forum to help people whose Facebook posts indicate that they might be at risk of harming themselves or committing suicide.

37:55

Zuckerberg on Facebook’s Social Good Forum

MARK ZUCKERBERG: We're starting to do more proactive work. Like when we use artificial intelligence to identify things that could be bad or harmful and then flag them so our teams can review them. Or when someone shares a post that makes it seem like they might want to harm themselves. We can give them and their friends suicide prevention tools that they can share to get the help that they need.

38:06

Dreyfus 100%. Super:
Dr Suelette Dreyfus
Information Systems expert, University of Melbourne

DR SUELETTE DREYFUS: The ability to get big data, and to gather data about us in all aspects of our lives, creates a particular type of power among the people or organisations that have that data. So they can say, oh these are your daily habits. There's been some research done that nearly half of what we do is just repeating patterns of what we did the day before. And from that, you can also predict, potentially, how people will behave in the near future as well. And that's perhaps a little bit concerning for people who care a lot about their privacy.

 

 

 

38:25

Cubrilovic 100%.

 

Super:
Nik Cubrilovic
IT security consultant

PETER GRESTE: You use the words 'potential harm'. That's a fairly serious phrase. What sort of harm do you mean?

NIK CUBRILOVIC: So there are a couple of factors here. First is the issues that we know about already. They are from little things, such as ad targeting, giving away what your girlfriend’s bought you for your birthday and doesn't want you to know about, through to a 14-year-old boy who hasn't come out as gay yet, and his parents discover the fact because of the advertising that's being targeted to him, through to potential future harms. One of the problems in the privacy realm is that we only have one identity, and we can't take back what we've already handed over.

39:07

Facebook pages

PETER GRESTE: Facebook is far more intimately involved in our lives than any company we've ever seen before.

39:43

 

REBECCA MACKINNON: Facebook has a responsibility to inform people of what is happening to their data and so then there can be a conversation also with "the community" about whether people agree that this is an appropriate use and right now they're not providing enough information for that conversation to take place.

 

 

 

39:53

Cubrilovic 100%.

 

NIK CUBRILOVIC: There's no take-back on private data.  The implications that are going to occur in five or 10 years' time, we need to protect against that now. And to an extent, it almost feels like - I'm reminded of Einstein's letter to Eisenhower warning about the potential dangers of nuclear holocaust or whatever, with the dangers in uranium - not to say that it's that severe, but we are at the point now, where we know that there's a danger, but we don't know the extent of it and we don't know the potential implications of it.

40:19

Graphic: start on figures rotating and pull out on to Zuckerberg’s face

Music

40:46

 

PETER GRESTE: Our lives are now measured in data.  What we look at, who we talk to, what we do is all recorded in digital form. In handing it to Facebook, we are making Mark Zuckerberg's company one of the most powerful in history. And the question is, at what cost to ourselves?

40:52

Outpoint

 

41:19

 

 

reporter

PETER GRESTE

 

producer

JANINE COHEN

LIN BUCKFIELD

 

editor

MICHAEL NETTLESHIP

 

assistant editor

JAMES BRAYE

 

 

camera

NEALE MAUDE

 

sound

RICHARD MCDERMOTT

 

Additional camera/sound

LOUIE EROGLU

GEOFF KRIX

OLIVER JUNKER

 

researcher

ANNE DAVIES

 

Additional research

ALEX TIGHE

 

archive producer

MICHELLE BADDILEY

 

Designer

LODI KRAMER

PETA BORMANN

 

digital producer

RUTH FOGARTY

 

Legal

DEBORAH AUCHINACHIE

 

publicity

SAFIA VAN DER ZWAN

 

 

sound mixer

EVAN HORTON

 

colourist

SIMON BRAZZALOTTO

 

post production

JAMES BRAYE

 

additional vision

GETTY IMAGES

POND5

FACEBOOK

CNBC

AP ARCHIVE

 

theme music

RICK TURK

 

titles

LODI KRAMER

 

program assistant

CLARE O’HALLORAN

 

production manager

WENDY PURCHASE

 

supervising producer

MORAG RAMSAY

 

executive producer

SALLY NEIGHBOUR

 

abc.net.au/4corners

 

© 2017           

 
