NY Times Article about YouTube CT Study

JFDee

While the Times found some problems with the study reviewed in the article, it highlights the impact that YouTube had - and still has, despite the measures introduced since - on the spread of conspiracy theories.
Climate change is a hoax, the Bible predicted President Trump’s election and Elon Musk is a devil worshiper trying to take over the world.
All of these fictions have found life on YouTube, the world’s largest video site, in part because YouTube’s own recommendations steered people their way.

For years it has been a highly effective megaphone for conspiracy theorists, and YouTube, owned and run by Google, has admitted as much. In January 2019, YouTube said it would limit the spread of videos “that could misinform users in harmful ways.”

One year later, YouTube recommends conspiracy theories far less than before. But its progress has been uneven and it continues to advance certain types of fabrications, according to a new study from researchers at University of California, Berkeley.
Content from External Source

https://www.nytimes.com/interactive/2020/03/02/technology/youtube-conspiracy-theory.html
 
From the summary of the study: https://farid.berkeley.edu/downloads/publications/arxiv20.pdf

Aggregate data hide very different realities for individuals, and although radicalization is a serious issue, it is only relevant for a fraction of the users. Those with a history of watching conspiratorial content can certainly still experience YouTube as filter-bubble, reinforced by personalized recommendations and channel subscriptions. In general, radicalization is a more complex problem than what an analysis of default recommendations can scope, for it involves the unique mindset and viewing patterns of a user interacting over time with an opaque multi-layer neural network tasked to pick personalized suggestions from a dynamic and virtually infinite pool of ideas.
Content from External Source
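
The feedback loop the authors describe - watch history shaping recommendations, which shape further watching - can be illustrated with a toy simulation. This is not the paper's model, just a minimal sketch (a Polya urn) with made-up numbers, assuming only that each watched video makes similar videos more likely to be recommended:

```python
import random

# Toy Polya-urn sketch of the feedback loop described in the study (all
# numbers are hypothetical). Each watched video adds weight to its own
# category, so early viewing history tends to get locked in.

def simulate(conspiratorial=1, other=19, steps=200, seed=0):
    """Share of conspiratorial recommendations after `steps` rounds of
    watch -> recommend-more-of-the-same -> watch."""
    rng = random.Random(seed)
    c, o = conspiratorial, other
    for _ in range(steps):
        if rng.random() < c / (c + o):
            c += 1  # user watched a conspiratorial video; it gains weight
        else:
            o += 1  # user watched something else; that gains weight
    return c / (c + o)

# A fresh account seeded 1:19 tends to stay near a 5% share, while an
# account seeded 15:5 tends to stay near 75% - the bubble persists.
print(f"light history: {simulate(1, 19):.2f}")
print(f"heavy history: {simulate(15, 5):.2f}")
```

In this toy model the share of conspiratorial recommendations tends to stay near wherever the viewing history starts, which is the "filter bubble" point the study makes: default recommendations can be cleaned up in aggregate while users with a heavy prior history keep seeing the same material.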

It sounds like YouTube has a better understanding of how to keep people away from the rabbit hole than of how to help them escape it. You can lead a user to "recommended videos", but you can't make them click.
 
I have always found YouTube recommendations annoying.
I find copious recommendations for CNN just as spammy as recommendations for conspiracy videos.
I far prefer to just hit the search box and type in the words that best describe the kind of videos I want to see.
It may seem trivial, but if YouTube stopped recommending certain videos and channels over others, and just allowed people to search the site for themselves, as one would in a library or video shop, then the issue of it recommending conspiracy videos would be dead and buried right there and then.
Though I do appreciate that YouTube is a company, that revenue and financial gain are of the utmost importance, and that cutting back on selling techniques to let users simply decide for themselves could infringe on maximum profits.
 