YouTube To Ban All Anti-Vaccine Misinformation on Platform
YouTube will remove all videos spreading vaccine misinformation to crack down on harmful content disseminated during the COVID-19 pandemic.
Recently, the video-sharing platform announced in a blog post that it will remove content falsely claiming that approved vaccines are dangerous, cause chronic health effects, or do not reduce the chances of contracting disease. Content containing misleading claims about vaccine ingredients, such as the idea that vaccines contain trackers or cause autism, cancer or infertility, will also be taken down.
YouTube's policies cover not just certain immunisations but vaccines in general. The platform's Community Guidelines already prohibit specific types of medical misinformation, including content that promotes dangerous remedies such as drinking turpentine to cure illnesses.
"At the onset of COVID-19, we built on these policies when the pandemic hit, and worked with experts to develop 10 new policies around COVID-19 and medical misinformation. Since last year, we've removed over 130,000 videos for violating our COVID-19 vaccine policies," YouTube said.
The company consulted local and international health organisations and experts to develop its COVID-19 guidelines. For instance, YouTube's guidance on vaccine side effects maps to public vaccine resources provided by health authorities and supported by medical consensus. The policy changes take effect today.
However, there will be exceptions to YouTube's guidelines: content on vaccine policies, vaccine trials, and historical vaccine successes or failures will still be allowed on the platform. Vaccine testimonials will also be permitted, provided the video does not violate other YouTube Community Guidelines and the channel does not have a record of promoting vaccine hesitancy.
Other platforms have also addressed vaccine misinformation. Facebook stated that since the start of the pandemic, more than 3,000 accounts, pages and groups were deleted from the platform as they violated the site's COVID-19 and vaccine misinformation rules on many occasions. Over 20 million pieces of content were removed for breaking these rules.
Notably, Facebook's most viewed link in the U.S. during the first three months of 2021 was a news story suggesting that the COVID-19 vaccine played a role in a doctor's death. The company delayed the release of a report containing that information, publishing a similar report only two days after the New York Times reported on the shelved document's existence. Andy Stone, Facebook's Policy Communications Director, said the company chose not to release it earlier because there were important fixes to the system it wanted to implement.
Meanwhile, Twitter began adding labels to misleading tweets about COVID-19 vaccines on 1 March 2021. Since the platform introduced its COVID-19 guidance, it has removed over 8,400 tweets and "challenged" 11.5 million accounts around the world.
Twitter has also implemented a strike system that imposes escalating penalties on users who violate its COVID-19 policy. A single strike carries no account-level penalty; two or three strikes lock the account for 12 hours; four strikes trigger a seven-day account lock; and five or more strikes result in permanent suspension.
Written by Sophia Lopez