Instagram algorithms accused of facilitating the sale of child pornography

by time news

2023-06-08 23:26:05

The alert comes from Stanford University and the Wall Street Journal (WSJ). According to their reporting, Instagram is the main platform used by pedophile networks to promote and sell content depicting the sexual abuse of minors. “Large networks of accounts, which give the appearance of being operated by minors, openly promote the sale of” child sexual abuse content, researchers from the Cyber Policy Center of the prestigious Silicon Valley university said Wednesday.

Meta’s subsidiary “is currently the most important platform for these networks, with features such as content recommendation algorithms and direct messaging that help connect sellers with buyers,” they added.

Neither pedophiles nor these networks need to show much ingenuity. According to the WSJ, a simple search for keywords such as #pedowhore or #preteensex leads to accounts that use these terms to advertise content showing the sexual abuse of minors.

Videos with bestiality and self-harm

Often, these profiles “claim to be run by the children themselves and use overtly sexual handles incorporating words like ‘little slut for you’,” the article details. The accounts do not say outright that they are selling these images, but they do feature menus of options, in some cases including requests for specific sex acts.

Stanford researchers also spotted offers for videos involving bestiality and self-harm. “At a certain price, children are available for in-person ‘meetings’,” the article continues.

The report highlights the role played by the popular social network’s algorithms: a test account created by the business daily was “inundated with content that sexualizes children” after clicking on a few such recommendations.

“Task force”

“Child exploitation is a horrifying crime,” Meta said. “We are working hard to fight this scourge and rid our platforms of it, and to help law enforcement arrest and prosecute the criminals responsible.” The social media giant says it has set up an “internal task force” to investigate the report’s findings and respond accordingly. It also pointed to the many measures already in place, such as the recruitment of specialists, to fight “predators who constantly change tactics.”

“Between 2020 and 2022, dedicated teams took down 27 networks that were committing abuse, and in January 2023 we disabled more than 490,000 accounts that violated our child safety rules,” a spokesperson said. The spokesperson added that technical issues with reporting problematic profiles and content have been fixed and that “thousands of additional terms and keywords have been restricted on Instagram.”

In all, by the end of 2022, more than 34 million pieces of content relating to the sexual exploitation of minors had been removed from Facebook and Instagram, more than 98% of which were detected before being reported by users, according to Meta.

Last March, pension and investment funds filed suit against the company for having “turned a blind eye” to human trafficking and child sexual abuse on its platforms. Instagram is also regularly accused by advocacy groups and authorities of failing to adequately protect children against the risks of harassment, addiction, and self-image problems.
