Facebook has announced new tools and measures to prevent content that exploits children on its platform. The company says it is developing targeted solutions, including new tools and policies, to reduce the sharing of such content. These include a pop-up alert triggered by search terms associated with child exploitation on Facebook-owned apps, and a safety alert that informs people who share harmful content, such as videos and memes, of the legal consequences of doing so. Facebook also notes that its platform guidelines have been updated to clarify the actions the company will take against Facebook or Instagram accounts dedicated to sharing images of children posted with captions, hashtags, or comments containing innuendo or inappropriate signs of affection.
Antigone Davis, Facebook's Global Head of Safety, says the social media company has also been working with the US National Center for Missing and Exploited Children (NCMEC) to understand the content related to minors that people share and consume, in order to form relevant guidelines and rules. "We found that more than 90% of this content was the same as or visually similar to previously reported content. And copies of just six videos were responsible for more than half of the child-exploitative content we reported in that time period," Davis said in a blog post.
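Facebook's post does not describe the matching technology behind the "visually similar" finding. A common technique for detecting near-duplicate images at scale is perceptual hashing, where similar images produce hashes that differ in only a few bits. A minimal, stdlib-only sketch of a difference hash (all names and sample data here are illustrative, not Facebook's actual system):

```python
# Hypothetical sketch of perceptual (difference) hashing, a common technique
# for flagging visually similar images. Facebook's production matching
# systems are not described in the post; everything below is illustrative.

def dhash(pixels):
    """Compute a difference hash from a grayscale pixel grid.

    pixels: list of rows, each a list of brightness values (0-255),
    assumed already downscaled to a small fixed size.
    Returns an integer whose bits encode left-vs-right brightness gradients.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits; a small distance means visually similar."""
    return bin(a ^ b).count("1")

# Tiny toy grids: the second is a slightly brightened copy of the first,
# so its brightness gradients (and therefore its hash) are unchanged.
original  = [[10, 50, 90], [20, 60, 100]]
brighter  = [[15, 55, 95], [25, 65, 105]]
different = [[90, 50, 10], [100, 60, 20]]

assert hamming_distance(dhash(original), dhash(brighter)) == 0
assert hamming_distance(dhash(original), dhash(different)) == 4
```

Because re-encoded or lightly edited copies hash to nearly identical values, a single reported video can be used to catch the many copies Davis describes.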
The post adds that Facebook has improved its user reporting flow for such violations, and that these reports will be prioritised for review. The platforms will use Google's Content Safety API to prioritise content that may contain child exploitation for content reviewers to assess. "We added the option to choose 'involves a child' under the Nudity and Sexual Activity category of reporting in more places on Facebook and Instagram. These reports will be prioritized for review," the company adds.
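Prioritised review of the kind described above is typically implemented as a priority queue: reports tagged "involves a child" are surfaced to reviewers before other reports, while arrival order breaks ties. A minimal sketch using Python's standard `heapq` module (the data model and function names are assumptions, not Facebook's actual reporting pipeline):

```python
# Hypothetical sketch of a prioritised review queue: reports tagged
# "involves a child" are reviewed before other reports, matching the
# behaviour described in the post. The data model is an assumption.
import heapq
import itertools

HIGH, NORMAL = 0, 1           # lower number = reviewed first
_counter = itertools.count()  # tiebreaker preserves arrival order

def enqueue(queue, report, involves_a_child):
    """Add a report; flagged reports jump ahead of unflagged ones."""
    priority = HIGH if involves_a_child else NORMAL
    heapq.heappush(queue, (priority, next(_counter), report))

def next_report(queue):
    """Pop the highest-priority (then oldest) report for review."""
    _, _, report = heapq.heappop(queue)
    return report

queue = []
enqueue(queue, "nudity report A", involves_a_child=False)
enqueue(queue, "report B (involves a child)", involves_a_child=True)
enqueue(queue, "nudity report C", involves_a_child=False)

assert next_report(queue) == "report B (involves a child)"
assert next_report(queue) == "nudity report A"
```

In practice the priority could be a continuous score (for example, one produced by a classifier such as the Content Safety API) rather than a two-level flag.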
As mentioned, the company will show a pop-up when people search for child-exploitative content on its platforms, Facebook and Instagram. An accompanying image shows the pop-up triggered by the search term pthc (pre-teen hardcore).