Thursday, November 21, 2024

Analysis: Kiddie Porn Viewed 10M Times on Pre-Musk Twitter

‘... more than 95% of several active accounts exploiting CSAM ... acted with impunity for years.’

(Dmytro “Henry” Aleksandrov, Headline USA) While working alongside top officials at Musk’s Twitter, an independent cybersecurity data analyst found that accounts posting content that sexually exploits children garnered more than 10 million views under Twitter’s previous owners, according to the Daily Wire.

Andrea Stroppa, the founder of the cybersecurity group Ghost Data, personally funded research last summer that led to the removal of more than 500 accounts soliciting child sexual abuse material (CSAM).

That led approximately 30 major advertisers to pull or pause their ads on Twitter.

His new report, released on Friday, found that more than 95% of several active accounts exploiting CSAM, whose content included videos of children and teens engaged in sexual activity, “acted with impunity for years.”

“The more we work on child sexual abuse material on Twitter, the more we find the HELL left by the old Twitter management,” Stroppa wrote in his Twitter thread.

Although Stroppa said those accounts have since been taken down, Twitter’s CSAM problem has been documented for more than a decade, meaning such content remained on the platform throughout that time.

Twitter said last summer that the company aggressively fights online child sexual abuse. However, a Twitter Red Team report from an Adult Content Monetization project found that while the volume of such material had grown, Twitter’s investment in technologies to detect and manage it “has not” kept pace.

“Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” the Red Team said in April 2022, The Verge reported.

“Last year, 86,666 CSAM reports on Twitter were made to the National Center for Missing and Exploited Children, which plays a key role after Internet sites — like Twitter — remove the material,” the Wire added.

“When CSAM is detected and removed, the report goes to the center, which contacts law enforcement for further action.”

At one point, Twitter’s previous owners refused to remove from the platform videos of minors being sexually exploited that had garnered more than 160,000 views and over 2,000 retweets, even though one of the minor survivors was on the brink of suicide.

“I found that old Twitter, in many cases related to child sexual abuse material published on the platform by malicious users, instead of suspending the whole account, just removed the tweet,” Stroppa said.

“Now Twitter, under Elon Musk’s leadership, directly suspends the accounts.”

When Musk previously pointed out that Twitter had a child pornography problem, former Twitter CEO Jack Dorsey denied it.
