Being on the fence hurts social media users
By Nicholas Cleary
For far too long, Facebook has dodged defining itself as either a platform or a publisher, opting instead to have the best of both worlds.
“Today, I announced that from now on we will treat speech from politicians as newsworthy content that should, as a general rule, be seen and heard,” Nick Clegg, Facebook’s vice president, said in a statement on Facebook’s newsroom webpage.
But this recent decision to exempt politicians from fact-checking and from community guidelines stands in stark contrast to their recent ban on white nationalist and white separatist speech.
Facebook has neglected to take a clear stance on what they are, and it is becoming crucial that we force them to decide.
First, we must understand the vital difference between the two. A platform is a medium that merely allows other people to voice their thoughts and bears little responsibility for what the individuals who use it say or do.
4chan comes close to this model: it is famously light on moderation, sets few rules and takes little responsibility for what is said.
A publisher, however, reserves the right to moderate what is said on its outlet and has moderators who enforce a set of community guidelines.
The downside to this, from the publisher’s perspective, is that they can be held liable for what their users say. It is up to them to ensure that their outlet is not used to incite violence, share hateful views or display explicit content.
Facebook would like to be regulated as a platform but act as a publisher. They want few laws holding them accountable for what their users say, yet still want to regulate those users as they see fit.
And therein lies the problem: allowing Facebook to operate this way undermines the very democracy Facebook claims to help protect. If Facebook wishes to control what their users say or do, including criticism of the platform itself, they are inherently a publisher, not a platform.
Currently, Facebook dictates much of what its users can say and bans hate speech, yet takes no responsibility for what its users post.
For instance, the so-called Facebook Live killer, who shot a man at point-blank range while livestreaming on Facebook Live, brought this debate to a head.
Obviously, it is unreasonable to expect Facebook to moderate every livestream in real time, but at the same time, Facebook seeks control over what people can say on them.
At this point, it is clear Facebook is acting as a publisher. They take action against anything that violates their community standards. Having these standards is not inherently bad, though enforcement can be abused on the company’s part. And because Facebook exercises this editorial control, they bear some responsibility for moderating what goes on their website.
It is time for Facebook to drop the half measures and own up to their responsibility.