YouTube's LGBTQ restriction isn't censorship. It's laziness.


There's an aphorism called Hanlon's razor, and it goes like this: "Never attribute to malice that which is adequately explained by stupidity." In the case of this weekend's YouTube censorship controversy, it wasn't malice that caused YouTube to block videos with LGBTQ content — it was pure laziness.

YouTube's "Restricted Mode" is a simple setting switch that works "like a parental control," according to the site. YouTubers began to notice that videos by LGBTQ bloggers suddenly disappeared when Restricted Mode was turned on, leading them to believe that LGBTQ content was being holistically labeled as explicit content not suited for young people. 

The backlash forced YouTube to respond. On Sunday night, the verified YouTube Creators account tweeted that the company is "proud to represent LGBTQ+ voices on our platform" and that those videos are "a key part of what YouTube is all about."

So is YouTube censoring LGBTQ content? Not exactly.

YouTube's Restricted Mode makes some decisions automatically, informed by masses of users clicking the "report" button. If enough people click "report," the item is marked as explicit. "We use community flagging, age restrictions and other signals to identify and filter out potentially inappropriate content," reads YouTube's page on Restricted Mode.
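To see why that design fails, here is a minimal sketch of how threshold-based crowd filtering might work. YouTube has not published its actual signals or cutoffs, so every name and number below is hypothetical, purely to illustrate the mechanism the site describes.

```python
# Illustrative sketch only, NOT YouTube's actual system: the fields, the
# threshold and the logic are all assumptions for the sake of the example.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    report_count: int       # how many viewers clicked "report"
    age_restricted: bool    # set by the uploader or a reviewer


REPORT_THRESHOLD = 100      # made-up cutoff; the real signals are undisclosed


def visible_in_restricted_mode(video: Video) -> bool:
    """Hide anything age-restricted or heavily reported.

    Note what's missing: nobody checks whether the reports are accurate,
    so mistaken or coordinated flagging hides content all by itself.
    """
    if video.age_restricted:
        return False
    return video.report_count < REPORT_THRESHOLD


videos = [
    Video("Coming-out vlog", report_count=250, age_restricted=False),
    Video("Cooking tutorial", report_count=3, age_restricted=False),
]

for v in videos:
    state = "visible" if visible_in_restricted_mode(v) else "hidden"
    print(f"{v.title}: {state} in Restricted Mode")
```

In a setup like this, the judgment call is made entirely by whoever clicks "report" most often, which is exactly the failure mode LGBTQ creators ran into.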

Naturally, it's not just LGBTQ content that gets swept up by erroneous reporting. If you look for YouTube's conservative firebrands, you'll find a similar effect: Turning on Restricted Mode wipes out videos from right-wing pundits like Paul Joseph Watson and Sargon of Akkad, whose clips have hundreds of thousands of views.

That's not to say YouTube's filtering is an effective measure. The point is, this problem isn't about direct censorship or a group of moderators making intentionally political decisions in secrecy. The problem is outsourcing decision-making to the crowd instead of defining clear standards enforced by individuals. The problem is an algorithm. The problem is laziness.


These flubs happen all the time on your favorite apps. Apple's restriction on Confederate flag imagery in the App Store led to the removal of educational apps about the Civil War. It happened at Instagram, where in order to stop porn from proliferating across hashtags like #curvy, the app ended up silencing entire communities built around self-image and body positivity. It happened at Facebook, where crowdsourcing community moderation led to the sudden removal of videos cataloging instances of police brutality.


A potential solution would be to define a set of values about what kind of content clearly does and doesn't belong, then have professionals do the moderating. In other words, tech platforms ought to develop clear standards. That famous "I know it when I see it" definition of pornographic material, after all, requires a human there to see it in the first place.
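To make the contrast concrete, here is a hypothetical variation on the earlier sketch in which crowd reports only nominate a video for review, and nothing is hidden until a person applies a written standard. This illustrates the proposal above; it does not describe any platform's actual pipeline, and all names and thresholds are invented.

```python
# Hypothetical alternative: reports queue content for human review instead of
# acting on their own. Everything here is illustrative, not a real system.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    report_count: int
    hidden_in_restricted_mode: bool = False


REVIEW_THRESHOLD = 100      # illustrative: this many reports triggers review, not removal


def flag_for_review(catalog: list[Video]) -> list[Video]:
    """Crowd reports only build a queue; they never hide anything directly."""
    return [v for v in catalog if v.report_count >= REVIEW_THRESHOLD]


def apply_moderator_decision(video: Video, violates_standard: bool) -> None:
    """Only an explicit human judgment against a published standard hides a video."""
    video.hidden_in_restricted_mode = violates_standard


catalog = [
    Video("Coming-out vlog", report_count=250),
    Video("Graphic content compilation", report_count=400),
]

for video in flag_for_review(catalog):
    # A trained moderator would supply this judgment; it's hard-coded for the demo.
    apply_moderator_decision(video, violates_standard="Graphic" in video.title)
    print(video.title, "hidden" if video.hidden_in_restricted_mode else "kept visible")
```

The difference is small in code and large in practice: the crowd can still raise its hand, but a person applying a stated standard makes the final call.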