We've all seen it happen: you watch one video on YouTube, and your recommendations shift, as if Google's algorithm thinks the video's subject is the passion of your life. Suddenly, all the recommended videos you see (and probably many of the ads) are related to the topic.
In most cases, the results are merely ridiculous. But there has been a steady stream of stories about how the process has radicalized people, sending them ever deeper down a rabbit hole until their viewing is dominated by fringe ideas and conspiracy theories.
A new study released on Monday looks at whether these stories represent a larger trend or are just a collection of anecdotes. While the data can't rule out the existence of online radicalization, it strongly suggests that it isn't the most common experience. Instead, fringe ideas seem to be simply one part of a larger, self-reinforcing community.
Normally, the challenge in conducting research like this is obtaining data on people's video-viewing habits without their knowing about it, which could change their behavior. The researchers solved the problem by obtaining data from Nielsen, which simply tracks what people watch: participants allow Nielsen to monitor their habits, and the company anonymizes the resulting data. For this study, the researchers obtained data from more than 300,000 viewers who collectively watched more than 21 million YouTube videos between 2016 and the end of 2019.
Most of those videos had nothing to do with politics, so the authors used the literature to identify a large set of channels that previous research had labeled according to their political leanings, ranging from the far left through the center to the far right. To that list, the researchers added a category they called "anti-woke." While not always overtly political, a growing number of channels focus on "opposition to progressive social justice movements." Although these channels tend to align with right-wing interests, the hosts of the videos generally don't present their ideas in those terms.
All told, the channels the researchers classified (just under 1,000 of them) accounted for only 3.3 percent of total video views during this period. People who watched them tended to stick with a single type of content: if you started watching left-leaning content in 2016, you were most likely still watching it when the study period ended in 2020. In fact, judging by the time spent per video, you were probably watching more of it by then, perhaps a product of the controversies of the Trump era.
(The one exception is far-left content, which was viewed so rarely that it was usually impossible to identify statistically significant trends.)
Nearly every category of content also saw growth over this period, both in total viewers and in time spent watching videos on these channels, with far-left and far-right content being the exceptions. This finding suggests that at least some of the trends reflect the growing use of YouTube as a replacement for more traditional broadcast media.
Since most viewers stick to a single type of content, it's easiest to think of them as forming distinct groups. The researchers tracked the number of people in each group and how much time they spent watching videos over the four-year period.
During that time, mainstream left viewers were about as numerous as all the other groups combined, with centrists close behind. The mainstream right and anti-woke groups began the period at roughly the same size as the far right, but they all showed different trends. The total far-right audience stayed flat, though the time its members spent watching videos climbed. In contrast, the total mainstream right audience grew, while its time spent watching didn't differ much from that of far-right viewers.
The anti-woke audience grew at the fastest rate of any group. By the end of the period, its members were spending more time watching videos than centrists were, even though their numbers remained smaller.