Facebook bans far-right political party for inciting “animosity and hatred”


Facebook has banned Britain First, a far-right political party whose inflammatory anti-Muslim videos have been shared by President Donald Trump, for violating the company’s community standards against hate speech.

The social media platform deleted three pages — the Facebook page for Britain First, as well as pages for Paul Golding and Jayda Fransen, two Britain First party leaders. In a statement released Wednesday announcing its decision, Facebook said the three pages had “repeatedly posted content designed to incite animosity and hatred against minority groups.”

“We recently gave the administrators of the pages a written final warning, and they have continued to post content that violates our community standards,” Facebook said. “As a result, in accordance with our policies, we have now removed the official Britain First Facebook Page and the Pages of the two leaders with immediate effect.”

Britain First, an ultranationalist fringe party often described as an extremist organization, had more than 2 million likes before its page was removed from the platform, according to BuzzFeed, whose 2017 analysis showed how a small group of people who engaged with Britain First’s far-right content were able to spread the posts across the web.

Facebook said it takes hate speech seriously, and that sometimes “legitimate political speech crosses the line and becomes hate speech designed to stir up hatred against groups in our society.” This, it suggested, was one of those cases.

“We are an open platform for all ideas and political speech goes to the heart of free expression,” the statement continued. “But political views can and should be expressed without hate. People can express robust and controversial opinions without needing to denigrate others on the basis of who they are.”

In November, Trump shared three videos posted by Fransen that purported to show Muslims committing acts of violence, a decision by the president that was quickly condemned by political leaders in the U.S. and the United Kingdom. One of the videos purporting to show a “Muslim migrant” committing violence was found to be fake. The White House nonetheless defended Trump’s decision to share them.

Golding and Fransen were jailed in March for religiously aggravated harassment.

The decision to ban Britain First is a major move for Facebook, which has struggled to determine how to enforce its rules against hate speech and has been repeatedly criticized for how it classifies such content. In 2016, after Facebook CEO Mark Zuckerberg decided to allow anti-Muslim comments posted by Trump to remain on the platform, the company said content that violates its standards but is otherwise “newsworthy, significant or important to the public interest” will not be removed.

On Monday, prior to Facebook’s decision on Britain First, London Mayor Sadiq Khan — who has been the target of anti-Muslim sentiment from the far right — called on companies like Facebook and Twitter to take “greater responsibility” to stop the spread of hate speech, extremism and disinformation online.

In a statement shared to Twitter on Wednesday, Khan said Facebook’s decision to ban Britain First is a welcome step and urged other companies to follow suit.

“Big social media companies must wield the power they’ve amassed responsibly,” Khan said. “I call on social media platforms to show a stronger duty of care, so that they can live up to their promise to be places that connect and unify, not divide or polarize. … I trust the decision made by Facebook today reflects a genuine desire to do more to protect people online and I urge others to follow suit.”