Over 1 Lakh Children Face Sex Abuse On Facebook, Instagram Daily: Lawsuit


An internal presentation by Facebook and Instagram parent Meta estimated that 100,000 minors receive photos of adult genitalia and other content depicting sexual abuse every day. The shocking figure came to light when newly redacted documents were presented before a court last month in a lawsuit filed by New Mexico, according to a report in the Wall Street Journal (WSJ). The lawsuit alleges that Meta's platforms recommend sexual content to underage users and promote underage accounts to predatory adult users.

The presentation was made in 2021, the outlet further said.

One of the documents from 2021 quoted Meta employees as saying that one of the platform's recommendation algorithms, "People You May Know", was known internally to connect child users with potential predators.

The lawsuit claims that these findings were flagged to Meta executives several years ago, but they rejected the suggestion that the company needed to adjust the algorithm.

The algorithm is known internally by the shorthand PYMK (People You May Know), WSJ said in its report, citing the now-redacted material.

In comments appended to the report, one employee at Facebook said that the algorithm had "contributed up to 75 percent of all inappropriate adult-minor contact".

"How on earth have we not just turned off PYMK between adults and children? It's really, really upsetting," another employee said.

Meta has not commented on these newly unsealed documents, but told WSJ that New Mexico "mischaracterises our work using selective quotes and cherry-picked documents". The company called child predators "determined criminals" and added that it has long invested in both enforcement and child safety-focused tools for young users and their parents.

Another internal email from 2020 said the prevalence of "sex talk" directed at minors was 38 times greater on Instagram than on Facebook Messenger in the US, and urged the company to enact more safeguards on the platform, according to the WSJ report.

A November 2020 presentation titled 'Child Safety: State of Play' said that Instagram employed "minimal child safety protections" and described policies regarding "minor sexualisation" as "immature". It further noted the platform's "minimal focus" on trafficking.

The New Mexico lawsuit is not the only one against Meta and its platforms. More than 40 other US states sued Meta in October last year, alleging that it misled the public about the dangers its platforms pose to young people.

For its part, Meta has said it would start automatically restricting teen Instagram and Facebook accounts from harmful content, including videos and posts about self-harm, graphic violence and eating disorders.
