Google-owned YouTube has announced that it will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US presidential elections.
The platform instituted its elections misinformation policy, focused on the integrity of past US presidential elections, in December 2020.
Two years, tens of thousands of video removals, and one election cycle later, "we recognised it was time to reevaluate the effects of this policy in today's changed landscape", the company said in a statement.
In the current environment, the company said, "we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm".
YouTube said that the "ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society -- especially in the midst of election season".
"As with any update to our policies, we carefully deliberated this change," it added.
The company said this specific aspect of its elections misinformation policy represents just one piece of a broad, holistic approach towards supporting elections on YouTube, and outlined what is not changing.
Following the 2020 election, YouTube found that videos from authoritative sources like news outlets represented the most viewed and most recommended election videos on YouTube.
"All of our election misinformation policies remain in place, including those that disallow content aiming to mislead voters about the time, place, means, or eligibility requirements for voting; false claims that could materially discourage voting, including those disputing the validity of voting by mail; and content that encourages others to interfere with democratic processes," said the company.
"We'll have more details to share about our approach towards the 2024 election in the months to come," it added.