What constitutes animal cruelty? Most people have a pretty good idea. Facebook's policies on violence, non-sexual child abuse and animal abuse suggest a desire to allow freedom of expression while banning the most horrific images. However, its own moderators tend to remove content 'upon report only', meaning distressing content may be seen by many people before it is flagged and removed.
Facebook states that "Generally, imagery of animal abuse can be shared on Facebook" and that "some extremely disturbing imagery may be marked as 'disturbing' as a warning to other users". Facebook categorises 'disturbing' content as 'photos of animal mutilations' or 'videos only of the act of abusing an animal or repeatedly kicking or beating an animal or an act of torture resulting in serious injury to an animal'.
Facebook states it will "ignore, without viewer protections, photos of acts of abuse, human kicking or beating an animal, photos of bruised or battered animals, sick or starved animals". As well as using human moderators to supervise potentially alarming content, Facebook also uses AI algorithms to review images before they are posted.
Recently, the UK government's Home Affairs Select Committee strongly criticised social media companies for failing to tackle the spread of hate speech and other illegal or distressing content. This came after major brands including Verizon pulled their adverts upon finding they were appearing next to videos promoting extremist views and hate speech.
After some of its high-profile advertisers left, Google publicly apologised for its lack of content moderation and pledged to give brands greater control over their advertising in an attempt to win them back. "This marks a turning point for YouTube. For the first time, it's dealing not only with reputation damage but revenue damage," said Alex Krasodomski-Jones of Demos.
Facebook's head of Global Policy states: "We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help." But the fact remains that content moderation is lacking, and articles can still be found on how to make a home-made improvised explosive device like the one used in Manchester.
Perhaps pressure from advertisers is the key to changing social media sites…
Follow the YouCom Media news posts to see the next developments.
Sources: Facebook graphic content policy, sections 15.5 and 15.9; YouCom Media News, 'Facebook Moderation', London, May 2017.