Or, rather, it will get a third party to do something about it.
Facebook said in a blog post that it will "work with third-party fact-checking organizations" to flag stories they deem untrustworthy.
"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully," Facebook VP of News Feed Adam Mosseri wrote in a blog post. "We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third-party organizations."
When a story on Facebook is flagged, it will be labeled with a warning sign reading "Disputed by 3rd Party Fact-Checkers." Users can click the link to learn why the story was flagged as fake. The story will also be pushed lower in News Feeds, but it will remain on the platform and can still be shared. As Facebook notes in the blog post, however, it can no longer be promoted as an ad.
Based on the above image, provided by Facebook's press team, it also looks like you can inform your Facebook friends when you believe a story is fake. For those who didn't unfriend their politically unlike-minded friends in the wake of the election, this could provide a great opportunity to let them know that stories like "22 Secretive, Islamic Compounds Arm Up in Anticipation of Trump Administration" are totally fake.
Facebook's belated efforts to flag misinformation on its platform come shortly after the social media giant made it easier for users to report fake news. Outsourcing its fake news problem to outside parties affords Facebook the option to (maddeningly) maintain that it is not, in fact, a media company.
But the updates do mark Facebook's slow acknowledgment that maybe its influence on shaping its users' points of view isn't so "pretty crazy" after all.