TY - JOUR
T1 - Technologically scaffolded atypical cognition
T2 - The case of YouTube’s recommender system
AU - Alfano, Mark
AU - Fard, Amir Ebrahimi
AU - Carter, J. Adam
AU - Clutton, Peter
AU - Klein, Colin
PY - 2020
AB - YouTube has been implicated in the transformation of users into extremists and conspiracy theorists. The alleged mechanism for this radicalizing process is YouTube’s recommender system, which is optimized to amplify and promote clips that users are likely to watch through to the end. YouTube optimizes for watch-through for economic reasons: people who watch a video through to the end are likely to then watch the next recommended video as well, which means that more advertisements can be served to them. This is a seemingly innocuous design choice, but it has a troubling side-effect. Critics of YouTube have alleged that the recommender system tends to recommend extremist content and conspiracy theories, as such videos are especially likely to capture and keep users’ attention. To date, the problem of radicalization via the YouTube recommender system has been a matter of speculation. The current study represents the first systematic, pre-registered attempt to establish whether and to what extent the recommender system tends to promote such content. We begin by contextualizing our study in the framework of technological seduction. Next, we explain our methodology. After that, we present our results, which are consistent with the radicalization hypothesis. Finally, we discuss our findings, as well as directions for future research and recommendations for users, industry, and policy-makers.
KW - Conspiracy theory
KW - Radicalization
KW - Recommender systems
KW - Technological seduction
KW - Transformative experience
KW - YouTube
UR - http://www.scopus.com/inward/record.url?scp=85086156437&partnerID=8YFLogxK
DO - 10.1007/s11229-020-02724-x
M3 - Article
AN - SCOPUS:85086156437
SN - 0039-7857
VL - 199
SP - 835
EP - 858
JF - Synthese
IS - 1-2
ER -