Image Source: AP

A major responsibility for tech companies is to monitor content on their platforms for child sexual abuse material (CSAM), and if any is found, they are legally required to report it to the National Center for Missing and Exploited Children (NCMEC).

Many companies employ content moderators who review content flagged as potential CSAM and determine whether it should be reported to NCMEC.

However, Facebook has a policy that could mean it is underreporting child sexual abuse content, according to a new report from The New York Times. A Facebook training document directs content moderators to “err on the side of an adult” when they don’t know someone’s age in a photo or video that’s suspected to be CSAM, the report said.

The policy was made for Facebook content moderators working at Accenture and is discussed in a California Law Review article from August:

Interviewees also described a policy called “bumping up,” which each of them personally disagreed with. The policy applies when a content moderator is unable to readily determine whether the subject in a suspected CSAM photo is a minor (“B”) or an adult (“C”). In such situations, content moderators are instructed to assume the subject is an adult, thereby allowing more images to go unreported to NCMEC.

Here is the company’s reasoning for the policy, from The New York Times:

Antigone Davis, head of safety for Meta, confirmed the policy in an interview and said it stemmed from privacy concerns for those who post sexual imagery of adults. “The sexual abuse of children online is abhorrent,” Ms. Davis said, emphasizing that Meta employs a multilayered, rigorous review process that flags far more images than any other tech company. She said the consequences of erroneously flagging child sexual abuse could be “life-changing” for users.

When reached for comment, Facebook (which is now under the Meta corporate umbrella) pointed to Davis’ quotes in the NYT. Accenture did not immediately reply to a request for comment and declined to comment to The New York Times.

