This data visualization shows the strongly connected network of popular YouTube channels with a reactionary right ideology that follow and feature each other. We uncovered this online community during several months of research on how users are exposed to extremist content on YouTube and what role extremist vloggers and the YouTube recommendation algorithm play in the real-life radicalization of viewers.
We reveal the connections between these channels around themes like antisemitism, anti-feminism and white supremacy, and show how these connections can easily lead viewers down a rabbit hole of increasingly extremist content, even if the recommendation algorithm itself is not biased.
Unlike other research, we not only looked at the videos that the YouTube algorithm recommends, but also unearthed the underlying community of reactionary right channels that follow and feature each other. Moreover, we reconstructed the journeys of users through this network using tens of millions of their comments, and interviewed some of these users about the development of their views.
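One way to reconstruct such a journey from comment data is to order, per user, the channels they commented on by the time of their first comment there. The sketch below illustrates that idea; the data shape (timestamp, channel) pairs and the function name are illustrative assumptions, not the project's actual schema or scripts.

```python
def comment_trajectory(comments):
    """Order the channels a user commented on by first-comment time.

    comments: iterable of (timestamp, channel_id) pairs for one user.
    Returns the channels in the order the user first appeared on them,
    a rough proxy for their path through the network.
    """
    first_seen = {}
    for ts, channel in sorted(comments):
        if channel not in first_seen:   # keep only the earliest comment per channel
            first_seen[channel] = ts
    return sorted(first_seen, key=first_seen.get)

# Toy example: a user who starts on a mainstream channel and later
# begins commenting on a more fringe one
comments = [(3, "fringe"), (1, "mainstream"), (2, "mainstream"), (4, "fringe")]
trajectory = comment_trajectory(comments)
```

Aggregating such per-user trajectories over millions of comments is what makes it possible to see typical paths through the channel network rather than isolated anecdotes.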
To uncover the network of reactionary right channels, we compiled a list of YouTube accounts that anti-fascism experts, academic researchers and various media sources consider extremist right. Using the Digital Methods Initiative's YouTube Data Tool, we then collected all followers, subscriptions and featured channels of these accounts. We filtered the resulting collection of channels by hand and iterated this search procedure several times. For the final 1,500 channels we collected the videos, comments and other metadata from the YouTube API, using Python scripts. We also transcribed 400,000 videos using the youtube-dl Python library. Data on the channels' monthly views and subscriber counts were obtained from Socialblade, which also provides a measure of the influence that popular channels have.
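The iterative search procedure described above amounts to snowball sampling over the channel graph: query the seed channels, add what they link to, and repeat on the newly found channels. A minimal sketch, in which `fetch_related` is a hypothetical stand-in for the YouTube Data Tool / API calls that return a channel's subscriptions and featured channels (in the real project each round was also filtered by hand):

```python
def snowball(seeds, fetch_related, rounds=3):
    """Iteratively expand a seed set of channel IDs.

    fetch_related(channel_id) -> iterable of related channel IDs
    (e.g. subscriptions and featured channels). Each round queries
    only the channels discovered in the previous round.
    """
    known = set(seeds)
    frontier = set(seeds)
    for _ in range(rounds):
        discovered = set()
        for channel in frontier:
            discovered.update(fetch_related(channel))
        frontier = discovered - known   # only new channels are queried next round
        known |= frontier
        if not frontier:                # nothing new found, stop early
            break
    return known

# Toy example with a hard-coded relation graph standing in for API calls:
graph = {"A": ["B"], "B": ["C"], "C": [], "D": ["A"]}
result = snowball(["A"], lambda c: graph.get(c, []))
```

Capping the number of rounds and deduplicating against already-known channels keeps the crawl from re-fetching the same accounts, which matters when each fetch is a rate-limited API call.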
We analyzed these data with the help of statisticians, media scholars and algorithm experts, in part during two hackathon days in September and October. To deepen our own understanding, we also simply watched hundreds of the most popular videos we surfaced.
What was the hardest part of this project?
One of the hardest parts was identifying the thematic character of the different channels. This involved watching many of these YouTube videos (and reading their transcripts) and classifying the type of content. Tracking down and interviewing radicalized users was also difficult.
What can others learn from this project?
To sketch the full picture, you need to consult the data, experts and users alike.