WASHINGTON — YouTube will stop removing content that falsely claims the 2020 election or other past U.S. presidential elections were marred by “widespread fraud, errors or glitches,” the platform announced Friday.
The change is a reversal for the Google-owned video service, which said a month after the 2020 election that it would start removing new posts that falsely claimed widespread voter fraud or errors changed the outcome.
YouTube said in a blog post that the updated policy was an attempt to protect the ability to “openly debate political ideas, even those that are controversial or based on disproven assumptions.”
“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” the blog post said.
The updated policy, which goes into effect immediately, won’t stop YouTube from taking down content that tries to deceive voters in the upcoming 2024 election, or other future races in the U.S. and abroad. The company said its other existing rules against election misinformation remain unchanged.
This could prove difficult to enforce, said John Wihbey, an associate professor at Northeastern University who studies social media and misinformation.
“It doesn’t take a genius if you’re on the disinformation ‘we were wronged in 2020’ side to say, ‘wait a minute, let’s just claim that voting just generally is not worth it. And 2020 is our example,’” he said. “I don’t know how you disentangle rhetoric that both refers to past wrongs and to forward possibilities. The content moderation team, which is going to try to do this, is going to tie themselves in knots trying to figure out exactly where that line is.”
The announcement comes after YouTube and other major social media companies, including Twitter and the Meta-owned Facebook and Instagram, have come under fire in recent years for not doing more to combat the firehose of election misinformation and disinformation that spreads on their platforms.
The left-leaning media watchdog group Media Matters said the policy change is not a surprise, as YouTube was one of the “last major social media platforms” to keep the policy in place.
“YouTube and the other platforms that preceded it in weakening their election misinformation policies, like Facebook, have made it clear that one attempted insurrection wasn’t enough. They’re setting the stage for an encore,” Julie Millican, a vice president at the group, said in a statement.