Working to prevent child abuse, Facebook aims to stop and curb such abuse by integrating new tools into its platforms.

Facebook is testing new tools aimed at limiting searches for photos and videos that exploit children and at preventing the sharing of such content. The features being tested include updated warnings, improved automated alerts, and new reporting tools.

New pop-up alerts and more detailed reporting
Facebook said it studied all of the child abuse content it had previously identified and reported to the authorities in order to understand why such material is posted and to improve its processes. The company said it detected at least 13 million abusive images from July to September 2020 alone.

The first of these features is a pop-up that appears when users search for terms associated with child abuse and directs them to resources for seeking help in changing their behavior.

Another tool aims to stop the spread of this type of content by warning users who try to share abusive material that their account may be disabled.

“Using our apps to harm children is disgusting and unacceptable. Our industry-leading efforts to combat child abuse are focused on preventing abuse, detecting and reporting content that violates our policies, and working with experts and authorities to ensure the safety of children,” said Antigone Davis, Facebook's Global Head of Safety.

Facebook has also updated its child safety policies and reporting tools. The company added the option “Contains a child” under the “Nudity and Sexual Activity” category for faster intervention on Facebook and Instagram.
