YouTube is moving to block and remove all content that spreads misinformation about vaccines against COVID-19 and other illnesses, such as measles, hepatitis and chickenpox.
The Google-owned online video company said in a blog post on Wednesday that any content that “falsely alleges that approved vaccines are dangerous and cause chronic health effects” will be removed.
“This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them.”
Google says it has taken down 130,000 videos since 2020 for violating the company’s COVID-19 vaccine policies, and it is now stepping up those efforts.
“We’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO,” the company said.
More to come