While the Times found some problems with the study reviewed in the article, it highlights the impact that YouTube had - and still has, despite the measures it has introduced - on the spread of conspiracy theories.
Climate change is a hoax, the Bible predicted President Trump’s election and Elon Musk is a devil worshiper trying to take over the world.
All of these fictions have found life on YouTube, the world’s largest video site, in part because YouTube’s own recommendations steered people their way.
For years it has been a highly effective megaphone for conspiracy theorists, and YouTube, owned and run by Google, has admitted as much. In January 2019, YouTube said it would limit the spread of videos “that could misinform users in harmful ways.”
One year later, YouTube recommends conspiracy theories far less than before. But its progress has been uneven, and it continues to advance certain types of fabrications, according to a new study from researchers at the University of California, Berkeley.