Wikipedia Editors Call Out the Site's Abuse Problems
On the web, harassment is everywhere, including on Wikipedia. A recent email published by one of the online encyclopedia's editors highlights the difficulties people face in social spaces online and why forums need to do more to keep their communities safe.
In an email Tuesday, one of Wikipedia's volunteer editors reached a breaking point.
"Right now I'm reaching out to anyone who might listen. I've been called obsessive, someone who attacks people, I've not been listened to and I've been lectured on policy by people who quote three letter shortcuts at me without reading the policy," the editor wrote to the Wikimedia-L email list. The Wikimedia Foundation is Wikipedia's parent organization. A story on the email was first published in Motherboard.
The editor's letter details his attempts to write articles for Wikipedia and the obstruction he felt he faced in doing so. After a disagreement with Wikipedia administrators that resulted in name-calling, the editor was ultimately blocked from the site. At the end of the letter, he says the experience has him considering suicide.
"I spent hours of my time researching the article, trying to do a good job. But in an instant the material was ripped away, and I was called obsessed," he wrote.
The editor in question is said to be OK, according to follow-up emails on the thread. While it's difficult to ascertain the validity of his complaints, his words do appear to have struck a chord with others on the list.
In the wake of the editor's email, other contributors to Wikimedia projects spoke up to say they, too, had felt obstructed or bullied on the platform.
"I've been called names, articles have been deleted, I've been told by many people that, sure, were it any other person they'd be banned," one contributor recounted, adding, "It's very, very toxic at times. And nobody really cares."
Moderating online forums is incredibly difficult. Toxicity in online social spaces is hardly an issue specific to Wikipedia. No virtual community is immune to negative commentary. Reddit, AltspaceVR, Twitter and Facebook all have issues with abuse and harassment.
According to the Pew Research Center, the bulk of online harassment takes place on social media networks. But forums like Reddit and large-scale crowdsourced projects like Wikipedia have strong social elements that may also serve as fertile ground for abuse.
In an interview with the Wall Street Journal, Massachusetts Institute of Technology professor and psychologist Sherry Turkle noted that without in-person interaction, it can be harder for people to recognize what common ground they share. Online, it's easier to dehumanize other people, she said. When we meet online, it's harder to know who we're talking to.
Wikimedia has an enormous cadre of 80,000 contributors, the organization said in an email. But its pool of editors is dwindling, according to a New York Times op-ed penned by Andrew Lih, one of Wikipedia's community members. In his article, Lih notes that internal clashes may have contributed to the drop-off.
Civility is one of Wikipedia's five pillars, but like other networks, the site may have difficulty enforcing good behavior. Keeping communication among users, editors and contributors respectful may require more attentive effort. In the meantime, the organization is focusing on keeping an ear to the ground and responding when emergencies arise, like the one that took place Tuesday.
The Wikimedia Foundation says its volunteer editors are on the front lines providing support to other community members.
"When they see something concerning, they can alert our Support and Safety team using our email hotline (emergency@wikimedia.org)," a Wikimedia spokesperson said in an email. "The team is entrusted with managing the safety and wellbeing of Wikimedia project users, with a 24/7 global on-call rotation to ensure rapid responses to emergency situations or other threats of harm."
Wikimedia Executive Director Katherine Maher said in a phone call that though many people contribute to the platform, the foundation has insight into user interactions and takes action to ensure safety.
"The 80,000 people are not a monolith," she said. Contributors work on a variety of different projects within the foundation, each of which has its own system of oversight. Furthermore, she said she welcomes conversations about how Wikimedia can better serve its volunteers. "We encourage impassioned debate about what this community should look like in the future, I don't think this is what this story is."
Editor's note: For information about suicide prevention or to speak with someone confidentially, contact the National Suicide Prevention Lifeline at 1 (800) 273-8255 or the Crisis Text Line at 741-741. Both provide free, anonymous support 24 hours a day, seven days a week.