On December 9, 2020, YouTube announced it would take down videos that alleged widespread fraudulent voting in the 2020 presidential election. A month later, after President Donald Trump's lies about his loss inflamed a mob that attacked the U.S. Capitol, YouTube strengthened those policies to prevent the spread of election-related misinformation.
What casual observers might not understand, however, is just how far the policy goes. Not only does YouTube punish channels that spread misinformation, but in many cases, it also punishes channels that report on the spread of misinformation. The platform makes no distinction between the speaker and the content creator. If a channel produces a straight-news video that merely shows Trump making an unfounded election-related claim—perhaps during a speech, in an interview, or at a rally—YouTube would punish the channel as if it had made the claim itself, even if no one affiliated with the channel endorsed Trump's lies.
I learned this firsthand on Thursday after YouTube suspended my show—Rising—for violating the election misinformation policy, despite the fact that neither my co-hosts nor I had said anything to indicate that we believe the election was rigged.