Facebook’s Censorship Policy for Live Video



There has been a lot of talk about Facebook’s censorship policy for its Live feature, especially after the incident of Philando Castile’s death, when the Live video of the shooting was temporarily unavailable.

In a statement, a Facebook spokesperson said that Facebook only removes content if it celebrates or glorifies violence, not if it is merely graphic or disturbing.

Facebook further stated that a technical glitch on its end caused the video of Philando Castile’s death to be temporarily unavailable. This contradicts the other theories circulating at the time: that the video disappeared because Facebook could not decide whether it should remain; that it was pulled after a high volume of reports flagged it as containing violent content; that police deleted it after taking custody of Castile’s girlfriend’s phone and Facebook account; or that police requested its removal.

Facebook released the statement to dispel these theories and doubts; however, it has refused to give details of what exactly caused the glitch. Still, the episode has raised eyebrows across the internet community about Facebook’s role and responsibilities in hosting citizen journalism that can be controversial or graphic.

Facebook’s Censorship Policy on Graphic Content 

Later, Josh Constine of TechCrunch spoke with a Facebook spokesperson to discuss in detail the exact policy behind its Community Standards regarding graphic content, and when violations lead to censorship. Though the spokesperson declined to say anything beyond the official statement, this is what we understood:

  • There is no option to report content as “graphic but newsworthy,” or any other way to report that content could be disturbing and should be taken down. Instead, Facebook asks that users report the video as violent, or with any of the other options. It will then be reviewed by team members trained to determine whether the content violates Facebook’s standards.
  • Even a single report flag sends the content to be reviewed by Facebook’s Community Standards team, which operates 24/7 worldwide. These team members can review content whether it’s public or privately shared. The volume of flags does not have a bearing on whether content is or isn’t reviewed, and a higher number of flags will not trigger an automatic take-down.
  • Essentially, if someone posts a graphically violent video saying “this is great, so and so got what was coming to them,” it will be removed, but if they say “This is terrible, these things need to stop,” it can remain visible.
  • Users can report any content, including Live videos in progress, as offensive for one of a variety of reasons, including that it depicts violence.
  • The policy on graphic content is that Facebook does not allow, and will take down, content depicting violence if it celebrates or glorifies the violence or mocks the victim. However, violent content that is graphic or disturbing is not a violation if it is posted to bring attention to the violence or to condemn it.
  • Facebook’s Community Standards outline what is and isn’t allowed on the social network, from pornography to violence to hate speech. They apply to Live video the same as to recorded photos and videos.
  • There are three possible outcomes to a review (see the sketch after this list).
    1. The content does not violate Facebook’s standards and is not considered graphic, and is left up as is.
    2. The content violates Facebook’s standards and is taken down.
    3. The content is deemed graphic or disturbing but not a violation, and is left up but with a disclaimer.
  • The black disclaimer screen hides the preview of the content and says “Warning – Graphic Video. Videos that contain graphic content can shock, offend, or upset. Are you sure you want to see this?” These videos do not auto-play in the News Feed and are typically barred from being seen by users under 18.
  • Live videos can be reviewed while they’re still in progress if reported, and Facebook can interrupt and shut down the stream if it violates the standards. Facebook also monitors any public stream that reaches a high enough level of viewers.
  • If Facebook’s team believes a person depicted in shared content is a threat to themselves or others, it will contact local law enforcement. It will also encourage users flagging the content to contact the authorities.
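To make the three-outcome review logic above concrete, here is a minimal sketch in Python of how such a decision could be modeled. This is purely illustrative: the names (ReviewOutcome, ReportedContent, review_content) are invented for this article, and Facebook has not published its actual implementation.

# A hypothetical model of the review flow described above -- not Facebook's
# actual code; all names are invented for illustration.
from dataclasses import dataclass
from enum import Enum, auto

class ReviewOutcome(Enum):
    LEAVE_UP = auto()               # no violation, not graphic
    TAKE_DOWN = auto()              # violates Community Standards
    LEAVE_UP_WITH_WARNING = auto()  # graphic or disturbing, but not a violation

@dataclass
class ReportedContent:
    depicts_violence: bool
    celebrates_glorifies_or_mocks: bool  # e.g. "so and so got what was coming"
    is_graphic_or_disturbing: bool

def review_content(content: ReportedContent) -> ReviewOutcome:
    # Even a single report flag sends content to human review; the number of
    # flags never triggers an automatic take-down.
    # 1. Violence that is celebrated, glorified, or mocks the victim is removed.
    if content.depicts_violence and content.celebrates_glorifies_or_mocks:
        return ReviewOutcome.TAKE_DOWN
    # 2. Graphic but condemning or newsworthy content stays up behind the
    #    "Warning - Graphic Video" screen: no autoplay in the News Feed and
    #    typically hidden from users under 18.
    if content.is_graphic_or_disturbing:
        return ReviewOutcome.LEAVE_UP_WITH_WARNING
    # 3. Otherwise the content is left up as is.
    return ReviewOutcome.LEAVE_UP

For example, a bystander’s video condemning a shooting would map to LEAVE_UP_WITH_WARNING under this sketch, while the same footage captioned to celebrate it would map to TAKE_DOWN.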


Broadly speaking, Facebook seems too lenient in its policies toward violence. Facebook’s censorship policy focuses on content that glorifies the act of violence, such as videos posted to promote or celebrate terrorism. It also fails to differentiate based on the cause of death, the relationship between the video’s creator and its subjects, or the involvement of police. As far as ownership is concerned, the creator of any content posted on Facebook retains ownership of it.

 
