YouTube’s anti-vaccine content controversy: study finds no evidence of promotion


By Storyboard18 | September 18, 2023, 1:36 pm
“The average share of anti-vaccine or vaccine hesitancy videos remained below 6 percent at all steps in users’ recommendation trajectories,” said Margaret Yee Man Ng, an Illinois journalism professor and lead author of the study. (Representative image via Unsplash)

A study published in the Journal of Medical Internet Research examined whether YouTube’s recommendation system acted as a rabbit hole, steering users who searched for Covid-19 vaccine-related content towards anti-vaccine videos. The researchers found no evidence that YouTube promoted anti-vaccine content during the Coronavirus pandemic.

The study’s participants, trained by the World Health Organisation, were asked to find an anti-vaccine video on YouTube in as few clicks as possible, starting their search from the first informational Covid-19 video the WHO uploaded to the video-streaming platform.

The researchers analysed over 27,000 videos recommended by YouTube, using machine learning methods to identify anti-vaccine content. “We found no evidence that YouTube promotes anti-vaccine content to its users. The average share of anti-vaccine or vaccine hesitancy videos remained below 6 percent at all steps in users’ recommendation trajectories,” said Margaret Yee Man Ng, an Illinois journalism professor and lead author of the study.
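The headline figure can be read as a simple per-step proportion: for each position in a recommendation trajectory, the fraction of videos classified as anti-vaccine. A minimal sketch of that calculation, using made-up labels rather than the study’s actual data or classifier, might look like this:

```python
# Sketch only: compute the share of anti-vaccine videos at each step
# of recommendation trajectories, given labels from some classifier.
from collections import defaultdict

# Hypothetical data: each trajectory is a list of booleans, one per
# recommendation step, True if that step's video was labelled
# anti-vaccine or vaccine-hesitant.
trajectories = [
    [False, False, True, False],
    [False, False, False, False],
    [False, True, False, False],
]

step_total = defaultdict(int)     # trajectories reaching each step
step_anti_vax = defaultdict(int)  # anti-vaccine videos seen at each step

for traj in trajectories:
    for step, is_anti_vax in enumerate(traj):
        step_total[step] += 1
        step_anti_vax[step] += int(is_anti_vax)

for step in sorted(step_total):
    share = step_anti_vax[step] / step_total[step]
    print(f"step {step}: {share:.1%} anti-vaccine")
```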

Initially, the researchers set out to understand YouTube’s content recommendation technique and whether it drives users towards anti-vaccine content. UN Global Pulse researcher Katherine Hoffmann Pham said, “We wanted to learn about how different entities were using the platform to disseminate their content so that we could develop recommendations for how YouTube could do a better job of not pushing misinformation.”

“Contrary to popular belief, YouTube wasn’t promoting anti-vaccine content. The study reveals that YouTube’s algorithms instead recommended other health-related content that was not explicitly related to vaccination,” Pham added.
