YouTube Will Delete New Videos With False Election Claims
YouTube plans to remove newly uploaded videos that attempt to mislead viewers about the outcome of the 2020 US Presidential election, warning users that it will take down content alleging unfounded voter fraud. The announcement comes amid ongoing criticism of the video platform for what some critics have bemoaned as a hands-off attitude toward misleading or false videos, or content that could potentially incite violence.
It's been a particular challenge over the US election season. Widespread allegations of "fake news" – along with claims, unproven in federal investigations or in the courts, that voter fraud, voting machine errors, or other factors swayed the result – have proliferated, with some supporters of President Trump apparently refusing to accept that President-elect Joe Biden won the election.
"Our Community Guidelines prohibit spam, scams, or other manipulated media, coordinated influence operations, and any content that seeks to incite violence," YouTube pointed out today. "Since September, we've terminated over 8000 channels and thousands of harmful and misleading elections-related videos for violating our existing policies. Over 77% of those removed videos were taken down before they had 100 views."
Now, though, YouTube says it will take an even heavier approach to that content moderation policy. "Yesterday was the safe harbor deadline for the U.S. Presidential election and enough states have certified their election results to determine a President-elect," the Alphabet-owned company points out. "Given that, we will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections."
It means that videos uploaded to YouTube alleging that voting machine fraud or software glitches took place, or claiming that counting errors unfairly tipped the balance, will no longer be permitted and will be removed.
From today, YouTube will begin showing a 2020 Electoral College Results page on its election information panel. That will highlight the fact that US states have begun to certify the votes for President-elect Biden.
As for recommendations – which have been accused of skewing toward the hyperbolic and embracing outlandish conspiracy theories in an attempt to keep viewers engaged – YouTube argues that they're not quite the problem critics make them out to be. "Over 70% of recommendations on election-related topics came from authoritative news sources," the company says, "and the top recommended videos and channels for election-related content were primarily authoritative news."
Still, it's also looking into how it can better handle misleading content of that sort. That includes videos which, though perhaps not highly recommended on YouTube itself, "continue to get high views, sometimes coming from other sites."
What YouTube doesn't appear to be doing is applying its policy retroactively. While it will be removing content uploaded today or anytime after that "misleads people" through allegations of fraud or other election misdeeds, it's unclear just how stringently the policy will apply to videos already on the site.