Inside Facebook: Secrets of a Social Network

A fly on the wall inside Facebook's moderating HQ

Facebook has over two billion users, all producing and sharing content. Most of it is harmless, but some of it is not. How does Facebook regulate content encouraging child abuse, animal cruelty, self-harm and hate speech? This alarming documentary offers unique undercover access to Facebook’s moderating hub. It presents a disturbing picture of an organisation putting money before morality, and for whom extreme content equals extreme profits.

A video of a man beating a little boy was shared more than 44,000 times on Facebook within two days of being posted. The video is still widely available on the platform. “He was hitting him and punching him, he was throwing him about and then he was stamping and kicking on him.” Nicci Astin, an online child abuse campaigner, has repeatedly complained to Facebook about the video, but was told that it did not violate Facebook’s community standards.

In training at CPL Resources - a Dublin-based content moderation contractor that has worked with Facebook since 2010 - the same video is shown as an example of content that should be marked as disturbing, meaning it remains on the site but is restricted to certain viewers. A moderator at CPL explains to an undercover reporter posing as an employee that “if you start censoring too much then people lose interest in the platform… It’s all about making money at the end of the day”.

Once working as a moderator, the reporter encounters images of self-harm, violence between school children and animal cruelty. A video of a man eating live baby rats is marked as disturbing because the content is “for feeding purposes”. One user’s image of graphic self-harm is left online because it was adjudged not to actively promote self-harm.

Richard Allan, Facebook Vice President of Global Policy, denies that shocking or extreme content makes Facebook more money. “Shocking content does not make us more money. It’s just a misunderstanding of how the system works.” Yet Roger McNamee, an early investor in Facebook, claims that Facebook keeps extreme content because it encourages engagement with the platform. “What Facebook has learned is that the people on the extremes are the really valuable ones, because one person on either extreme can often provoke 50 or 100 other people, and so they want as much extreme content as they can get.”

A moderator acknowledges that Britain First’s page - the far-right group had more than two million followers until the page was deleted in March 2018 - was treated differently to others due to its large following. “We’d marked their pages so much for content, like so much, like they had like eight or nine violations and they’re only allowed five, but obviously they have a lot of followers so they’re generating a lot of revenue for Facebook.” For Roger McNamee, this goes to the core of Facebook’s business. “Once you understand that the nature of large internet networks is that the harshest, meanest voices are going to dominate, what you realise is the more open you make the platform, inherently the more unpleasant, inappropriate, bad content you’re going to get on it.”

The Producers

Firecrest Films is an independent television production company based in Glasgow. They produce high-profile documentaries, series and features for major broadcasters, including the BBC and Channel 4. Their content has won multiple awards, achieved record ratings and dominated the news agenda, in turn prompting resignations, arrests, questions in parliament and more than forty front-page news stories. In January 2017 Channel 4 acquired a minority stake in Firecrest through its prestigious Growth Fund.
