Facebook is getting a jump-start on spring cleaning.
The social network announced Tuesday that it's wiping out the garbage links (like false celebrity deaths, promises of free iPhones and viral hoaxes) that proliferate in its users' News Feeds and make Facebook a faux-satire nightmare.
But don't worry — the Onion is safe.
How's it going to work? Rather than hiring a dedicated team to flag false stories all day, Facebook is giving power to the people. An option to report a "false news story" will soon appear under the light grey arrow seen on every post. The more people report offending stories, links and pictures, the less frequently those posts will appear in News Feeds.
If a story is bombarded with a large number of flags, a message will pop up declaring that "many people on Facebook have reported that this story contains false information," CNN Money explains.
So the offending story doesn't disappear, but it is conveniently tucked under the rug for fewer people to see.
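The flag-then-demote mechanism described above could be sketched roughly like this. Everything here is hypothetical (the threshold, the per-report penalty, and the function names are invented for illustration); Facebook has not published its actual ranking code:

```python
from typing import Optional

# Assumed values for illustration only -- Facebook's real numbers are not public.
WARNING_THRESHOLD = 100      # reports needed before the warning banner appears
DEMOTION_PER_REPORT = 0.01   # ranking penalty applied per report

def rank_score(base_score: float, report_count: int) -> float:
    """Lower a story's feed-ranking score as reports accumulate.

    The penalty is capped so a heavily flagged post is demoted,
    never deleted -- matching the "tucked under the rug" behavior.
    """
    penalty = min(report_count * DEMOTION_PER_REPORT, 0.9)
    return base_score * (1.0 - penalty)

def warning_label(report_count: int) -> Optional[str]:
    """Return the warning banner once reports pass the threshold."""
    if report_count >= WARNING_THRESHOLD:
        return ("Many people on Facebook have reported that this story "
                "contains false information.")
    return None
```

The key design point, per the article, is that flagging only reduces visibility and adds a label; the post itself is never removed.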
Why is Facebook doing this? The move comes on the heels of a survey it published last year studying how fast those memes and viral hoaxes travel through Facebook. Now, it's taking what it learned and using algorithms to scrub out the nonsense.
TechCrunch says that each time those hoaxes gain traction (say, that false Facebook copyright notice), it reflects poorly on Facebook. The tech blog writes: "People hold Facebook responsible for what's in the News Feed. They'll blame the authors, but their frustration leads them to visit Facebook less." Therefore, it's a "problem for business."
Trolololol. Of course, this plan is far from foolproof and could backfire on Facebook. Users (and trolls) could come together and abuse the new system to flag stories that they disagree with.
But Facebook claims that this self-refereeing of viral hoaxes actually works, per its research:
These types of posts also tend to receive lots of comments from friends letting people know this is a hoax, and comments containing links to hoax-busting websites. In fact, our testing found people are two times more likely to delete these types of posts after receiving such a comment from a friend.