Facebook has released its own response to The Guardian's publication of leaked guidelines on how the company decides to handle certain graphic content.
In the response, Facebook’s Head of Global Policy Management Monika Bickert said that while staying objective when deciding whether to allow or remove graphic content is complicated, the company wants to keep Facebook “safe.”
“We don’t always share the details of our policies, because we don’t want to encourage people to find workarounds,” she said, “but we do publish our Community Standards, which set out what is and isn’t allowed on Facebook, and why.”
According to The Verge, graphic content such as animal abuse is allowed on Facebook in order to raise awareness, but moderators will take down content deemed harmful in any way. Videos of child abuse are marked as disturbing but are likewise left up in order to raise awareness.
Suicide is another difficult case. For example, Bickert said, a girl in Georgia recently planned to attempt suicide on Facebook Live. Her friends were able to contact the authorities, who reached her before she could harm herself. Bickert said experts have advised Facebook not to “punish or censor” people in distress, since such broadcasts can raise awareness and allow intervention, but to take the content down several weeks later in order to avoid copycat incidents.
Facebook plans to hire about 3,000 more reviewers on top of the 4,500 it already has. Bickert says that in doing this, the company can define its policies more clearly and better understand the context of what should be left on Facebook and what should be taken down.
“This is demanding work, and we will continue to do more to ensure we are giving them the right support,” she said, “both by making it easier to escalate hard decisions quickly and by providing the psychological support they need.”
The leak comes a month after a Cleveland man shot another man in the head, recorded it, and posted the video to Facebook. He then used Facebook Live to say he was searching for more victims, alarming users. The shooter, Steve Stephens, later killed himself during a police manhunt. Facebook took down both his profile and the video of the incident, saying its goal is to keep Facebook “safe.”
Facebook was also recently criticized for taking down a photo from the Vietnam War featuring children and soldiers who had been sprayed with napalm. The photo, known as “Napalm Girl,” was taken by Nick Ut and circulated throughout the world press at the time. When Norwegian writer Tom Egeland posted it to his wall to discuss how photographs have changed history, the image was pulled because the girl in it is naked, according to The Verge. Egeland wrote a letter to Facebook about censorship, and the Norwegian prime minister, Erna Solberg, posted the photo on her own wall to protest Facebook’s actions. Facebook removed her post as well.
“Technology has given more people more power to communicate more widely than ever before. We believe the benefits of sharing far outweigh the risks,” Bickert said. “But we also recognize that society is still figuring out what is acceptable and what is harmful, and that we, at Facebook, can play an important part of that conversation.”
According to Facebook’s Newsroom, the site has 1.28 billion daily active users and 1.94 billion monthly active users as of March 2017. How Facebook chooses to censor certain images for the sake of those users remains a matter of debate.
Featured Image via Wikimedia Commons.