Facebook came under fire for removing posts including the Pulitzer Prize-winning photograph of a nude girl fleeing an attack in Vietnam, but when the BBC reported images of child abuse found on the platform, the company failed to remove them. Then, when the BBC sent the photos to Facebook as evidence of its failure to wipe them from the service, the company reported the journalists to the police.
Unfortunately for Facebook, it can't use an algorithm as a scapegoat for this decision.
Facebook's U.K. Policy Director Simon Milner said in an emailed statement to Mic that Facebook has "carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards." He continued:
This content is no longer on our platform. We take this matter extremely seriously and we continue to improve our reporting and take-down measures. Facebook has been recognized as one of the best platforms on the internet for child safety.
It is against the law for anyone to distribute images of child exploitation. When the BBC sent us such images we followed our industry's standard practice and reported them to CEOP. We also reported the child exploitation images that had been shared on our own platform. This matter is now in the hands of the authorities.
Milner defended Facebook's decision to report the BBC to the Child Exploitation & Online Protection Centre because journalists from the media company technically distributed the images, which is illegal. However, the nuance here is important — Facebook's failure to take down the photos put the journalists in a precarious position. Should they have continued relying on Facebook's faulty reporting feature, which had permitted the photos, or proven to Facebook that the photos were, in fact, still up and certainly a violation (which would require distributing said photos)?
It's an unprecedented choice — there's no handbook on what to do when the machines make a mistake. But if Facebook is so diligent about following the law, it should have removed the illegally distributed child abuse photos from its platform before reporting the journalists who were trying to eradicate them.
Facebook's reporting feature hasn't just failed to identify inappropriate images; it has also failed victims of harassment. Activist Shahana Hanif, who has endured persistent harassment on the platform, told Mic in September 2016 that she didn't believe Facebook would have dealt with her issues without a personal contact at the company — using the platform's reporting feature was ineffective.
"I was routinely contacted by friends and friends of friends with various issues they were dealing with," a former Facebook employee told Mic.