(Dmytro “Henry” Aleksandrov, Headline USA) Even though Meta claims to have tried to purge pedophiles from its platforms, both Facebook and Instagram reportedly still struggle to keep their systems from enabling, or even outright promoting, a vast network of pedophile accounts.
After the Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst revealed in June of this year that Instagram’s algorithms connected a web of accounts devoted to the creation, purchase and trading of underage sex content, Meta finally set up a child safety task force, the news source reported.
However, tests conducted five months later by the Journal and the Canadian Centre for Child Protection revealed that Meta’s recommendation systems still promote such content. Even though the company took down hashtags related to pedophilia, its systems sometimes still recommend new ones with minor variations.
The tests also indicated that the problem extends beyond Instagram, with Facebook allowing large groups explicitly centered on the sexualization of children.
A Meta spokesman told the Journal that the company had hidden 190,000 groups in Facebook’s search results and disabled tens of thousands of other accounts, adding, however, that the work hadn’t progressed as quickly as the company would’ve liked.
“Child exploitation is a horrific crime and online predators are determined criminals. We are actively continuing to implement changes identified by the task force we set up earlier this year,” the spokesman said, adding that the company plans to collaborate with other platforms seeking to get rid of pedophiles.
In its September 2023 report, the Stanford Internet Observatory, which has been examining internet platforms’ handling of child-sex content, said that Meta had made some progress in eliminating pedophiles. The organization also said that “the overall ecosystem [of pedophiles on Instagram] remains active,” adding that there is still “significant room for improvement in content enforcement.”
The Canadian Centre for Child Protection, a nonprofit that builds automated screening tools that are meant to protect children, said that a network of Instagram accounts with as many as 10 million followers each continued to livestream videos of child sex abuse months after it was reported to the company.
The Journal’s tests also showed that Facebook’s algorithms have helped build large Facebook Groups devoted to trading child sexual abuse content.