Dark subject matter but somewhat humorous in the way it played out:
Facebook has been criticised for its handling of reports about sexualised images of children on its platform.
The chairman of the Commons media committee, Damian Collins, said he had "grave doubts" about the effectiveness of its content moderation systems.
Mr Collins' comments come after the BBC reported dozens of photos to Facebook, but more than 80% were not removed.
They included images from groups where men were discussing swapping what appeared to be child abuse material.
When provided with examples of the images, Facebook reported the BBC journalists involved to the police and cancelled plans for an interview.
The BBC first asked Facebook for an interview about its moderation system in late 2015, and renewed the request after this follow-up investigation.
The social network's director of policy Simon Milner agreed to be interviewed last week, on condition the BBC provided examples of the material that it had reported, but had not been removed by moderators.
The BBC did so, but was reported to the UK's National Crime Agency as a consequence.
Facebook are in a difficult position. They want to be a mere conduit of information, no different from a phone or mail firm, whereby users are entirely responsible for the content they generate. Yet because such content is often displayed publicly, they have a lot in common with a publishing firm like a newspaper, which is entirely responsible for the third-party content it chooses to print. In truth they are something in between, but the more they do to police people's content, the more they will be expected to.
(Also, Facebook seem to fail badly in both directions, e.g. the 'free the nipple' campaign...)