Although YouTube could be an educational haven, it is still a place where conspiracy theories, right-wing ideologies, fake news and flat-earth videos are booming. The platform’s algorithm, which recommends videos to its users – and which accounts for 70 percent of videos watched on the site, according to the Mozilla Foundation – seems to delight in taking viewers into its darkest corners. As if a gloating demon were at work, it even suggests climate change misinformation and then monetizes »that misinformation with ads for the World Wildlife Fund and Greenpeace« (read more on vice.com, January 16, 2020).
Beyond such anecdotal evidence, researchers are also trying to get a more general picture of the algorithm’s misbehaviour. Marc Faddoul and Hany Farid from the University of California, Berkeley, and Guillaume Chaslot from the Mozilla Foundation undertook »A longitudinal analysis of YouTube’s promotion of conspiracy videos« to »better understand the nature and extent of YouTube’s promotion of conspiratorial content«. (Their paper is not peer-reviewed.) They wanted to know whether YouTube’s efforts to reduce »borderline content and content that could misinform users in harmful ways« (YouTube’s Official Blog, January 25, 2019) yielded results. They did: »Our analysis corroborates that YouTube acted upon its policy and significantly reduced the overall volume of recommended conspiratorial content.« Farid, a specialist in digital forensics, image analysis, and human perception, is also an advisor to the international not-for-profit Counter Extremism Project.
In the meantime, there is a new Mozilla project underway, as engadget.com reported on July 15, 2020: TheirTube illustrates what YouTube recommends to six personas – fictional online identities. Here you can take an instructive look at the online life of a fruitarian, a doomsday prepper, a liberal, a conservative, a conspiracist and a climate denier. (tk)
Faddoul, M., Chaslot, G., & Farid, H. (2020). A longitudinal analysis of YouTube’s promotion of conspiracy videos. Available online at https://arxiv.org/abs/2003.03318
YouTube Regrets: Stories about »bizarre« and »dangerous« recommendations collected by the Mozilla Foundation