The Castile Incident Shows Facebook Is Not Here to Protect Your Freedom of Speech

As Philando Castile lay bleeding to death in his car Wednesday night, his girlfriend used the only emergency lifeline available: She streamed the encounter live on Facebook. "He let the officer know that he had a firearm and he was reaching for his wallet and the officer just shot him in his arm," Diamond Reynolds told viewers, tilting the camera to show Castile in the driver's seat.

Just as the video started to circulate among loved ones, activists and the media, Facebook took it down, leading users to accuse the social networking site of censorship.

The video was reinstated about an hour later. A Facebook spokeswoman said it was removed because of a "technical glitch." "We're very sorry that the video was temporarily inaccessible," the company said in a statement.

But the removal of the video shows that the public cannot trust social media sites like Facebook to serve as a free, open marketplace of information. Facebook is still a walled garden.

Facebook is now the main source of news for many Americans, especially young people: 61% of millennials say Facebook is their primary source of political news. Given its status as a gatekeeper, users have come to expect Facebook to serve as an impartial arbiter of what they see and know, upholding freedom of speech the way the Constitution requires the government to do.

Despite how important social networks have become in fueling social movements like the Arab Spring and Black Lives Matter, their responsibility isn't to uphold democracy. Facebook and Twitter — corporations with shares traded on the NASDAQ and the New York Stock Exchange — are responsible first and foremost to their shareholders; their ultimate goal is capturing our attention and keeping it so they can show us advertisements. The First Amendment doesn't protect a user's speech on a private company's site. On the contrary, the First Amendment protects Facebook's right to decide what can appear on its platform.

"Every media platform makes editorial decisions," said Adam Holland, a project manager who works on internet censorship issues at Harvard's Berkman Klein Center, a research instittue that focuses on cyberspace.

Facebook can disclose as much or as little as it wants about its decision-making process. In practice, that often means it doesn't. We still have no idea why Facebook removed a photo of two men kissing in 2011, or why it took down the social media gossip site the Shade Room in April. Facebook has been just as cagey about what happened with the Castile video.

"These companies have done a reasonable job of being accountable to government requests, but not to their own private terms of services," said Jillian York, director for international freedom of expression at the Electronic Frontier Foundation.

But we do know the two primary ways Facebook flags and takes down content: automated algorithms and human moderators. How the algorithm deals with flagged content is a mystery — anti-vaxxers, for example, are regularly accused of gaming it to get people with opposing viewpoints banned. Facebook also employs thousands of moderators who sift through flagged material and decide, based on its vague Statement of Rights and Responsibilities, what goes down and what stays up. Those terms cover infractions as specific as directly inciting violence and as wide-reaching as anything "misleading, malicious, or discriminatory."

In many instances, this opaque process removes content many users would indeed find objectionable — pornography, harassment, racist material — but it is just as capable of removing sensitive content, like the Castile video, that the public has a right to see.

Facebook's growing control over information, and its lack of transparency about how it handles sensitive or political content, is attracting not only criticism from users but also government attention. The German government is pursuing an antitrust investigation of Facebook, and after Facebook was accused of censoring conservative news among its trending topics, Republican senators demanded answers in a formal letter to Facebook CEO Mark Zuckerberg. "If Facebook presents its Trending Topics section as the result of a neutral, objective algorithm, but it is in fact subjective and filtered to support or suppress particular political viewpoints, Facebook's assertion that it maintains a 'platform for people and perspectives from across the political spectrum' misleads the public," the letter said.

"Facebook is the largest of any group of people in the world except for Christianity and Islam."

Hypothetically, antitrust legislation could establish more public standards for Facebook, or even a third-party auditor to enforce transparency and impartiality. But Holland said any legislation demanding that kind of transparency from Facebook would be both unprecedented and unlikely.

"They've got their users over a barrel, barring some sort of real zeitgeist shift where Facebook's reclassified as a public utility," Holland said. "But that's science fiction territory."

Our only accountability measure, for now, might be pitchforks, public shaming, and hoping that Zuckerberg cares about his reputation as an upstanding global citizen. A massive boycott of Facebook is unrealistic, but if Facebook's own users reach a popular consensus that the site is hostile to free speech, that could force the company to recognize its new role and hold itself to a public standard.

"People say 'if you don't like it, you just leave,' but Facebook is the largest of any group of people in the world except for Christianity and Islam," York said. "So ultimately, all we really can do is push them to be more transparent and hold them publicly accountable."

As the EFF's York told Mic, the greater public has yet to recognize Facebook's vital role in our democracy.

"We can publicly shame them and hold them accountable," York said. "But the public, by and large, doesn't see this as censorship yet."
