Inside Facebook: Secrets of a Social Network
A fly on the wall inside Facebook's moderating HQ
Facebook has over two billion users, all producing and sharing content. Most of it is harmless, but some of it is not. How does Facebook regulate content encouraging child abuse, animal cruelty, self-harm and hate speech? This alarming documentary offers unique undercover access to Facebook’s moderating hub. It presents a disturbing picture of an organisation that puts money before morality and for which extreme content equals extreme profits.
In training at CPL Resources - a Dublin-based content moderation contractor that has worked with Facebook since 2010 - the same video is shown as an example of content that should be marked as disturbing, meaning it remains on the site but is restricted to certain viewers. A moderator at CPL explains to an undercover reporter posing as an employee that “if you start censoring too much then people lose interest in the platform…It’s all about making money at the end of the day”.
While working as a moderator, the reporter encounters images of self-harm, violence between school children and animal cruelty. A video of a man eating live baby rats is merely marked as disturbing because the content is “for feeding purposes”. One user’s image of graphic self-harm is left online because it was adjudged not to actively promote self-harm.
Richard Allan, Facebook Vice President of Global Policy, denies that shocking or extreme content makes Facebook more money. “Shocking content does not make us more money. It’s just a misunderstanding of how the system works”. Yet Roger McNamee, an early investor in Facebook, claims that Facebook keeps extreme content because it encourages engagement with the platform. “What Facebook has learned is that the people on the extremes are the really valuable ones, because one person on either extreme can often provoke 50 or 100 other people and so they want as much extreme content as they can get”.
A moderator acknowledges that the page of Britain First, a far-right group that had more than two million followers until the page was deleted in March 2018, was treated differently to others because of its large following. “We’d marked their pages so much for content, like so much, like they had like eight or nine violations and they’re only allowed five, but obviously they have a lot of followers so they’re generating a lot of revenue for Facebook”. For Roger McNamee, this goes to the core of Facebook’s business. “Once you understand that the nature of large internet networks is that the harshest meanest voices are going to dominate, what you realise is the more open you make the platform, inherently the more unpleasant, inappropriate, bad content you’re going to get on it”.