The statute provides protections for online platforms that rely upon the aggregation of user-generated content, establishing that the platforms are not to be treated as “publishers” of that content but as mere conduits of individual free speech, distinct from content creators.
It also grants companies the ability to remove obscene or otherwise inappropriate content using their good-faith judgment.
“That concept of ‘good faith’ is what’s being challenged by many of you today,” he told the senators.
“Some of you don’t trust that we’re acting in good faith,” he continued. “It’s the problem I want to focus on solving.”
While some media companies nominally partner with a token conservative fact-checker, many cherry-pick information that fits their own leftist agenda in order to justify the draconian censorship of reasonable conservative viewpoints and facts.
This became especially clear after all three tech companies actively suppressed a bombshell New York Post story about the damning material found on Hunter Biden’s abandoned laptop.
Twitter claimed it was blocking the story due to a policy over hacked material—although it had never applied such a policy to stories that were harmful to Trump, even when the New York Times published Trump’s illicitly obtained tax records.
Facebook claimed it was slowing the story’s spread because it had been unable to verify it, and then proceeded to throttle it indefinitely.
Meanwhile, the Biden campaign used the platforms’ decision to suppress the story as evidence that it was false, prompting mainstream media outlets, whose job it was to verify the facts, to instead run with reports that the story was “unverified” and dismiss it entirely.
Nonetheless, all three tech companies insisted Wednesday that they weren’t intentionally playing favorites with the way information was treated based on users’ political preferences.
“The reality is that people have very different ideas and views about where the line should be,” Facebook CEO Mark Zuckerberg claimed in his opening statement.
“Democrats often say that we don’t remove enough content, and Republicans often say that we remove too much,” he continued. “I expect that we’ll hear some of those criticisms today—and the fact that both sides criticize doesn’t mean that we’re getting this right, but it does mean that there are real disagreements about where the limits of online speech should be.”
Zuckerberg said the increasingly polarized priorities of the Right and the Left have forced the platforms into a balancing act, leaving them to rely on their own judgment.
“Sometimes the best approach from a safety or security perspective isn’t the best for privacy or free expression,” he said. “So we work with experts across society to try and strike the right balance. We don’t always get it right, but we try to be fair and consistent.”
Despite Zuckerberg’s claims, Facebook has been criticized repeatedly for its heavy reliance on non-objective fact-checkers.
Recent reports revealed that the company had hired Anya Adeola, a Russian-born former Joe Biden adviser with links to George Soros’ Open Society Foundations, as its election-integrity watchdog.
Moreover, 18 of the 20 members of Facebook’s recently established “oversight” board, responsible for determining what content gets censored, have links to Soros and other far-left institutions, according to sources including the accountability watchdog Judicial Watch.
And at least half a dozen Chinese visa holders are on the team of programmers responsible for writing the platform’s “hate speech” algorithms, according to the New York Post.
Zuckerberg insisted during the hearing that Facebook slow-walked the Hunter Biden story because the FBI had warned the company of an imminent threat of Russian disinformation.
But it wasn’t the only recent embarrassment the company faced. Shortly thereafter, it was called out for censoring a satirical story by the conservative Babylon Bee and demonetizing the publication’s other content.
The story—borrowing from a famous Monty Python scene—mocked Sen. Mazie Hirono’s specious accusations during the confirmation hearings for Supreme Court Justice Amy Coney Barrett.
It was first flagged by an algorithm and then failed a human review, which concluded that the piece incited violence.
Facebook eventually apologized after the censorship effort went public.
But as Sen. John Thune, R-SD, observed during Wednesday’s hearing, the companies’ patterns of bad behavior consistently seemed to belie their declared good intentions.
“Even if your actions aren’t skewed, they are hugely consequential for our public debate, yet you operate with limited accountability,” Thune said.
“Such distrust is intensified by the fact that the moderation practices used to suppress or amplify content remain largely a black box to the public,” he added. “Moreover, the public explanations given by the platforms for taking down or suppressing content too often seem like excuses that have to be walked back after scrutiny.”