YouTube has announced a ban on false content about vaccines, including content alleging that approved vaccines are dangerous, and has expanded its medical misinformation policies with new guidelines on vaccines.
“Working closely with health authorities, we looked to balance our commitment to an open platform with the need to remove egregious harmful content. We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” YouTube wrote in its blog post.
YouTube says that since last year it has removed “over 200,000 videos related to dangerous or misleading COVID-19 information”. However, it only removed videos that mentioned vaccines in the context of specific false claims, such as claims that a vaccine “is available or that there’s a guaranteed cure”.
This time, the video platform is taking a tougher stance on all vaccine-related misinformation. It specified that it will ban content that falsely alleges approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation about the substances in vaccines.
This includes content falsely claiming that approved vaccines cause autism, cancer, or infertility, or that substances in vaccines can track those who receive them. The new policies also cover other vaccines, such as those for measles or hepatitis B.
“If your content violates this policy, we’ll remove the content and send you an email to let you know. If this is your first time violating our Community Guidelines, you’ll likely get a warning with no penalty to your channel. If it’s not, we may issue a strike against your channel. If you get 3 strikes within 90 days, your channel will be terminated. You can learn more about our strikes system here,” said YouTube in their guideline page.
YouTube adds that it will terminate a channel or account for repeated violations. It may also terminate an account after a single case of severe abuse, or when a channel is dedicated to violating a policy. Users are still allowed to share content about their personal experiences with a vaccine, as long as those videos adhere to the site’s community guidelines and the channel doesn’t encourage “vaccine hesitancy”.
The new policies go into effect immediately. According to CNBC, YouTube has already removed pages known for sharing anti-vaccination sentiments. However, widespread removal of these videos may take some time: NPR reports that many conservative pages spreading vaccine misinformation remain active on YouTube, and their videos continue to attract millions of views.