YouTube's content removal policy has effectively been to ban first and review later. But that approach has led the platform to reinstate more than half of appealed content, Reclaim the Net reported.
Google, which owns YouTube, said the increase in wrongful removals occurred because AI-driven algorithms have taken on a larger role in content review.
However, some critics have long disputed such claims, noting that it was biased engineers who programmed the algorithms in the first place, oftentimes targeting conservative or right-wing content.
Algorithms removed more than 11.4 million videos in the three months from April to June 2020, a 50 percent increase over the first quarter of the year.
Users appealed only 325,000 of these removed videos, and YouTube republished more than half of them after human review.
These errors delay the flow of information citizens need to make informed decisions, especially about health.
Moreover, content creators on YouTube may lose advertising revenue during the period in which a video is wrongly removed.
“Each quarter, millions of videos that are first flagged by our automated systems are later evaluated by our human review team and determined not to violate our policies,” the YouTube team wrote in a blog post titled “Responsible policy enforcement during COVID-19.”
Google said YouTube has allowed algorithms to remove more content because the pandemic has prevented human employees from manually reviewing videos.
“When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement,” the team wrote.
“Because responsibility is our top priority, we chose the latter — using technology to help with some of the work normally done by reviewers, with automated systems set to cast a wider net,” it added.