Facebook moderators ‘err on the side of an adult’ when unsure of age in possible abuse photos


Illustration by Alex Castro / The Verge

A major responsibility for tech companies is to monitor content on their platforms for child sexual abuse material (CSAM), and if any is found, they are legally required to report it to the National Center for Missing and Exploited Children (NCMEC). Many companies have content moderators in place who review content flagged as potentially being CSAM, and they determine whether the content should be reported to the NCMEC.

However, Facebook has a policy that could mean it is underreporting child sexual abuse content, according to a new report from The New York Times. A Facebook training document directs content moderators to “err on the side of an adult” when they don’t know someone’s age in a photo or video that’s suspected to be CSAM,…

