Why Twitter's attempts to end DM harassment continue to fall short
For many users, logging onto Twitter is a daily gamble. You might return to a calm newsfeed, or your mentions and direct messages may be overtaken by abuse. Twitter recently launched its anti-abuse filter for DMs, but the platform's newest feature seems like yet another band-aid rather than a solution.
Back in August, Twitter began testing its filter for Message Requests, or DMs from people you don't follow. It works by hiding messages that might contain offensive content behind a warning, and you're given the option to delete a message without having to open it at all.
After a month and a half of testing, Twitter decided to officially roll out the feature. The company tweeted, "We tested, and turns out filters help you cut through the noise to find gems. Who knew. So we're rolling out this filter to everyone on iOS, Android, and web!"
It seems Twitter is on a roll when it comes to giving users options to hide unwanted messages on its site. Earlier this month, the platform also introduced its Hide Replies feature in both the United States and Canada.
However, neither of these features does anything to stop the abuse itself. That means for many of Twitter's users, marginalized people specifically, these features may be functionally useless.
If you're marginalized and on Twitter, you've probably faced harassment and abuse before. From receiving outright threats to having slurs brought into your mentions, the abuse varies, but it's always ugly. To organizations like Amnesty International, the level of abuse that some groups receive constitutes a human rights violation.
In December 2018, Amnesty International released a report looking into violence and abuse of women on Twitter. The report found that 7.1 percent of tweets sent to women were problematic or abusive. That statistic is alarming by itself, but it gets worse for women of color, and especially for Black women.
Amnesty International reported that "Black women were disproportionately targeted." They are 84 percent more likely than white women to be mentioned in abusive or problematic tweets.
"Online abuse against women on this scale should not and does not have to exist on social media platforms," Amnesty International wrote. "Companies like Twitter have a responsibility to respect human rights, which means ensuring that women using the platform are able to express themselves freely and without fear."
Along with rampant abuse of women, Twitter has seen a rise in alt-right and white nationalist activity on its platform. In 2017, Vanity Fair reported on Twitter's verification of white nationalists like Jason Kessler, who helped organize the Charlottesville white-supremacist rally where counter-protester Heather Heyer was killed. Twitter has since taken steps to deplatform some of these people.
Part of the reason Twitter seemingly refuses to tackle abuse appropriately can be attributed to the same logic the company uses to claim that it has "no bias."
Last year, in a prepared statement for his appearance before the US House Committee on Energy and Commerce, CEO Jack Dorsey wrote:
"Twitter does not use political ideology to make any decisions, whether related to ranking content on our service or how we enforce our rules...from a simple business perspective and to serve the public conversation, Twitter is incentivized to keep all voices on the platform."
By refusing to acknowledge the role political ideologies like white nationalism can play in harassment, Twitter leaves the door open for members of the alt-right to continue utilizing its platform for abuse. Not only that, but people engaging with harassers can drive up ad views and make the platform money.
Twitter's problem has become so rampant that TechCrunch even referred to the platform as a "Nazi haven." While Twitter's new Hide Replies feature and DM filter might make it a little easier to dodge some abuse, neither is an actual answer to it.
It's time for Twitter to stop putting the onus on individual users to prevent their own abuse. As a platform, Twitter has a responsibility to re-evaluate the culture that allowed this problem to grow virtually unchecked.