Sunday, October 1, 2023

Wall Street Journal Exposes Instagram Pedophile Network

‘Why Instagram can’t disable the pedophile network isn’t clear. As the report noted, the platform was quick to purge all accounts questioning the 2020 presidential election...’

(Ken Silva, Headline USA) The Wall Street Journal published a 2,700-word article Wednesday that exposes a vast pedophile network operating on Instagram.

Partnering with researchers from Stanford University and the University of Massachusetts Amherst, the WSJ found that Instagram’s algorithms allegedly promote child pornography and connect pedophiles with one another.

“Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest. The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale,” the report said.

“Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as ‘little slut for you.’”

The article also noted that some 5 million child pornography reports from Instagram were made in 2022—suggesting that the problem is out of control.

In response to the report, a spokesperson for Instagram parent company Meta said the company “plans to address such inappropriate recommendations as part of its newly formed child safety task force.”

“A Meta spokesman acknowledged that Meta had received the reports and failed to act on them. A review of how the company handled reports of child sex abuse found that a software glitch was preventing a substantial portion of user reports from being processed, and that the company’s moderation staff wasn’t properly enforcing the platform’s rules, the spokesman said,” WSJ reported.

“The company said it has since fixed the bug in its reporting system and is providing new training to its content moderators.”

Instagram also reportedly removed the option for users to view search results for terms likely to produce illegal images. The company declined to say why it had offered the option, according to the WSJ report.

Instagram further said that it blocked thousands of hashtags that sexualize children, restricted its systems from recommending users search for terms known to be associated with sex abuse and took down 490,000 accounts for violating its child safety policies in January alone.

But the WSJ report said the problem persisted even after the paper notified the company.

“Following the company’s initial sweep of accounts brought to its attention by Stanford and the Journal, UMass’s Levine checked in on some of the remaining underage seller accounts on Instagram,” the article said.

“As before, viewing even one of them led Instagram to recommend new ones. Instagram’s suggestions were helping to rebuild the network that the platform’s own safety staff was in the middle of trying to dismantle.”

Why Instagram can’t disable the pedophile network isn’t clear. As the report noted, the platform was quick to purge all accounts questioning the 2020 presidential election.

“In theory, this same tightness of the pedophile community on Instagram should make it easier for Instagram to map out the network and take steps to combat it,” WSJ said.

“Documents previously reviewed by the Journal show that Meta has done this sort of work in the past to suppress account networks it deems harmful, such as with accounts promoting election delegitimization in the U.S. after the Jan. 6 Capitol riot.”

Given Instagram’s inability or unwillingness to quash the problem, the WSJ report ended by quoting a researcher who suggested that the entire platform should be shut down.

“Pull the emergency brake,” said Brian Levine, director of the UMass Rescue Lab, which researches online child victimization and builds forensic tools to combat it.

“Are the economic benefits worth the harms to these children?”

Ken Silva is a staff writer at Headline USA. Follow him at twitter.com/jd_cashless.

Copyright 2023. No part of this site may be reproduced in whole or in part in any manner without the permission of the copyright owner. To inquire about licensing content, use the contact form at https://headlineusa.com/advertising.