Danny Moloshok / AP
YouTube is tackling the spread of misinformation by banning misleading and inaccurate content about vaccines.
The platform announced the change in a blog post on Wednesday, stating that its community guidelines, which already prohibit the spread of medical misinformation, have been extended to cover “currently administered” vaccines that the World Health Organization and other health authorities have determined to be safe.
The website had previously banned content that made false claims about COVID-19 vaccines as part of its COVID-19 misinformation policy. The change extends this policy to a much larger number of vaccines.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company said.
YouTube says it has already removed channels
YouTube said it is now banning videos that claim vaccines are unsafe or ineffective, or that they cause other health problems such as cancer or infertility. In its announcement, the company highlighted videos that inaccurately describe the ingredients used in vaccines, as well as claims that vaccines contain substances that can “track” those who receive them.
There are a few exceptions: users are still allowed to share content about their personal experiences with vaccines, but only if those videos adhere to the site’s community guidelines and the channel in question does not routinely encourage “vaccine hesitancy.”
The new policy is effective immediately, and YouTube has already removed channels known for spreading anti-vaccine misinformation, including those of prominent anti-vaccine activists Joseph Mercola, Erin Elizabeth, and Sherri Tenpenny, as well as Robert F. Kennedy Jr.’s Children’s Health Defense organization, as reported by CNBC.
The company says full enforcement will take time
However, the Google-owned company cautioned that widespread removal of videos could take some time as it ramps up enforcement of the policy.
As big tech companies like YouTube and Facebook tightened their restrictions on vaccine misinformation last year, many conspiracy theorists migrated to other, less regulated platforms. Another video-sharing site, Rumble, has become a popular choice for far-right groups and vaccine skeptics, Slate reported in March.
But many conservative sites that spread misinformation about vaccines are still active on YouTube, and their videos continue to attract millions of views.
Publisher’s Note: Google is one of MediaFrolic’s financial backers.