These tech companies are finally standing up to hate — will it work?

Jason Kessler, the white supremacist organizer behind the racist August rallies in Charlottesville, Virginia, began getting the word out about his “Unite the Right” event back in July via posts on his personal Facebook page. A month later — just one day before the Unite the Right rally was to take place — Facebook removed the event page from its site for violating community guidelines.

But by that point, multiple white supremacist groups had already finalized their plans to travel to Charlottesville for the rally, as had other speakers and individuals sympathetic to its cause. And so the violence that took place, including a car attack that left one person dead and 19 injured, had been, in some ways, foretold.

In the days since the racist rallies, however, Facebook has removed several white supremacist pages from its platform. These pages include “Right Winged Knight,” “Right Wing Death Squad,” “Awakening Red Pill,” “Physical Removal,” “Genuine Donald Trump,” “Awakened Masses,” “White Nationalists United” and “Vanguard America,” whose members attended the Charlottesville rallies chanting the Nazi-era slogan “Blood and Soil.”

Reddit has also cracked down on hate. A company spokesperson confirmed to Mic that it banned /r/Physical_Removal, a subreddit dedicated to the “physical removal,” and in some cases the killing, of Democrats.

According to the Daily Beast, Reddit representatives defended the page just seven weeks ago, saying that what made Reddit’s platform “special” was that “people feel free to express themselves” on it.

In an emailed statement to Mic on Tuesday, sent after /r/Physical_Removal had been removed, a Reddit spokesperson emphasized that this freedom of expression must adhere to the platform’s content policy, which prohibits speech that “incites violence.”

PayPal stepped up too, reiterating its acceptable use policy, which bans users from using the service to process payments for “the promotion of hate, violence, racial intolerance or the financial exploitation of a crime.” The company also set up a new email address where users can report potential violations of the policy.

“Prejudice ... does not always march in the street,” a statement issued Tuesday by PayPal read. “Intolerance can take on a range of on-line and off-line forms, across a wide array of content and language. It is with this backdrop that PayPal strives to navigate the balance between freedom of expression and open dialogue — and the limiting and closing of sites that accept payments or raise funds to promote hate, violence and intolerance.”

Many of these platforms have been criticized before for not being vigilant enough — for allowing hate and bigotry to fester online for far too long. Some people attribute the resurgence of neo-Nazi and white supremacist hate groups in part to these online platforms. After Charlottesville, will they change?

Joan Donovan, who leads research on media manipulation at the research institute Data & Society, said it’s well within these companies’ reach to implement changes that will curb white supremacist activity. And it’s something she said major platforms like Facebook and Twitter will have to confront as they acknowledge their role in magnifying hate speech and those who spout it.

“Richard Spencer might have a megaphone and his own website to communicate his messages of hate,” Donovan said in a phone interview Wednesday. “Now these platforms are realizing they are the megaphone. They are the conduit between him and larger audiences.”

Movements like the so-called “alt-right” aren’t built on charisma alone, Donovan added; they’re built on infrastructure. The internet, with all of its possibilities, has become a major part of that infrastructure.

It’s important to note that none of these platforms (Facebook, Reddit, PayPal) has changed its community guidelines or terms of use since this weekend’s events. Jillian York, the Electronic Frontier Foundation’s director of international freedom of expression, warned that making these policies too strict and narrow can backfire. Queer people have found themselves blocked from Facebook for reclaiming words like “dyke” or “fag” on the platform; people of color could be blocked, too, for writing things like “All white people are racist” on their pages.

“Privatized censorship,” as York referred to it, carries these risks.

York opposes private companies regulating speech on principle. Still, she said these platforms have a responsibility to do more than simply remove hate speech. In a phone interview Wednesday, she called on the platforms to be “transparent” and “accountable” to their users.

“Censorship alone isn’t effective at solving hate speech,” York said. “They’re not trying to educate users about what hate speech is. Facebook, for example, just disappears the content. If someone in my circle is a Nazi, I want to know that.”

Donovan also acknowledged that suspending or banning users has its limitations. Members of hate groups who are cast out of mainstream platforms will find other ones, build their own or find some other way around the problem. The Daily Stormer proved as much: Just after GoDaddy booted the white supremacist site from its domain registration service, one of its stories garnered more than 65,000 shares on Facebook.

But that doesn’t mean these platforms shouldn’t do what they can to mitigate the rise of such hate groups.

“It’s important for platforms to pay attention to it and realize that they, like Charlottesville, are part of the battlegrounds,” Donovan said.