A Counterintuitive Bias in YouTube’s Algorithm

By Chuck Dinerstein, MD, MBA — Aug 28, 2023
YouTube has two billion monthly active users and sees 500 hours of content uploaded every minute. Twenty-five percent of U.S. adults get their news from YouTube, and 60% of regular users “use the platform to keep up with current events.” Since roughly 70% of all videos watched come from YouTube’s algorithm, it is fair to ask whether that algorithm is biased. A new study says it is, and that it leans left.

The study was published in PNAS Nexus. Three hundred sixty bots simulated new YouTube users, isolating the onboarding process and the algorithm’s subsequent adjustments. The 360 bots were divided among six political affiliations: Far Left, Left, Center, Anti-woke, Right, and Far Right. The researchers varied the videos the bots watched and used the top 20 algorithmic recommendations to draw their conclusions.

  • The bots first watched 30 videos matching their leanings; the top 20 recommendations measured how quickly the algorithm adjusted to their preferences.
  • The six political affiliations were then subdivided and assigned a new affiliation. After the bots watched another 30 videos aligned with their new political outlook, the top 20 recommendations measured how quickly the algorithm adjusted to the bots’ new preferences.
  • In the final experiment, the bots watched the recommended videos over multiple sessions to measure where the algorithm ultimately leads (a simplified sketch of the setup appears below).
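For readers who want the logic made concrete, here is a minimal, hypothetical sketch of the experimental design in Python. It is not the researchers’ code: `top_recommendations` is a toy stand-in for YouTube’s recommender, and the drift rule inside it is an assumption made purely for illustration.

```python
import random
from dataclasses import dataclass, field

AFFILIATIONS = ["Far Left", "Left", "Center", "Anti-woke", "Right", "Far Right"]

@dataclass
class Bot:
    """A simulated 'new' YouTube user with a fixed political affiliation."""
    affiliation: str
    history: list = field(default_factory=list)

    def watch(self, video_label: str) -> None:
        # Watching a video is the only signal the bot sends the recommender.
        self.history.append(video_label)

def top_recommendations(bot: Bot, n: int = 20) -> list:
    """Toy stand-in for the real recommender: recommendations simply
    drift toward whatever the bot has watched most recently."""
    if not bot.history:
        return [random.choice(AFFILIATIONS) for _ in range(n)]
    recent = bot.history[-30:]
    return [random.choice(recent) for _ in range(n)]

def match_rate(recommendations: list, affiliation: str) -> float:
    """Fraction of the recommendations that match a given affiliation."""
    return sum(r == affiliation for r in recommendations) / len(recommendations)

# Stage 1: 30 "seed" videos build each bot's persona; the top 20
# recommendations then measure how quickly the algorithm adapted.
bots = [Bot(a) for a in AFFILIATIONS for _ in range(60)]  # 6 x 60 = 360 bots
for bot in bots:
    for _ in range(30):
        bot.watch(bot.affiliation)
    adaptation = match_rate(top_recommendations(bot), bot.affiliation)
```

In this toy model the match rate trivially approaches 1.0 after the seed phase; the interesting findings in the study are the asymmetries that emerge once a bot’s persona changes.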

Results

Before any videos were watched - Only 3% of the initial recommendations were “News and Politics.” Of those, 51% were classified as Center, 42% Left, and 6% Right. The Left-leaning videos were viewed far more frequently than those of any other political class.

Building the bot's political persona - Most recommendations quickly matched the bot’s political affiliation. Even after viewing a single video, recommendations matched the bot’s leanings 78% of the time for those on the Left and 70% for the Anti-woke. The political extremes were offered more of the Left or Right videos. The classification of the recommended videos also varied:

  • Center and Right were offered more News and Politics videos.
  • The Left, more Entertainment and Comedy to “account for late-night television’s left-leaning bias.”
  • For the Anti-woke, more People and Blogs, presumably reflecting podcasts such as Joe Rogan’s.

Escaping the now-established political persona - The bots escaped by watching videos from a political affiliation other than the one they had established. The speed of escape, measured as the number of videos that had to be watched before the recommendations matched the new political persona, was asymmetric (a sketch of this measurement follows the bullet points):

  • It was more “difficult” to escape Center or Left, requiring 9-fold more videos than escaping Right or Far Right.
  • It was also almost impossible to enter the Far Right or Far Left from any other political affiliation, requiring 29 videos to be viewed; moving inward from the extremes required only two.
  • Moving to the center or left was far easier than moving to the right.
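Continuing the toy sketch above (and reusing its `Bot`, `top_recommendations`, and `match_rate` helpers), the “speed of escape” can be expressed as the number of videos of the new affiliation a bot must watch before the top-20 recommendations cross some match threshold. The 50% threshold and the 100-video cap below are illustrative assumptions, not the paper’s criteria.

```python
def escape_speed(bot: Bot, new_affiliation: str,
                 threshold: float = 0.5, max_videos: int = 100) -> int:
    """Count the videos of the new affiliation watched before the top-20
    recommendations cross the match threshold (illustrative criterion)."""
    for n in range(1, max_videos + 1):
        bot.watch(new_affiliation)
        if match_rate(top_recommendations(bot), new_affiliation) >= threshold:
            return n
    return max_videos

# Example: a bot with an established Left persona tries to escape to the Right.
left_bot = Bot("Left", history=["Left"] * 30)
videos_needed = escape_speed(left_bot, "Right")
```

The study’s asymmetry is, in effect, the observation that this count differs sharply depending on which persona a bot starts from and which one it is trying to reach.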

Algorithmic Path - The researchers looked at how recommendations changed after the bots attempted to escape their initial political affiliation. The strongest tendency was to return recommendations to the Center or Left, the weakest to the Far Left or Far Right; the algorithm pulled users back toward the “political mean” and away from the extremes.

Taken together, the findings suggest that YouTube’s algorithmic recommendations seek to move individuals toward, and retain them in, the center-left of the political spectrum. In the words of the researchers, YouTube’s algorithmic recommendations

“are asymmetric with a skew towards the Left, both with respect to the speed by which one may enter a political persona, as well as the difficulty in escaping these filter bubbles.”

Other studies have suggested just the opposite: that the algorithms move people to the right of center. But despite how much we hear about rabbit holes and extremes, there is little evidence that the algorithms push people toward either.

Can we rightly conclude that YouTube leans left of center politically? While the data from this study say yes, a bit more context is crucial. Consider 40 years of data from the General Social Survey, conducted by the University of Chicago’s National Opinion Research Center (NORC) [1]:

“The average American was more likely to identify as politically liberal at age 25 but more likely to be conservative 20 years later.”

  • 34 percent of 25-year-olds identified as liberal, compared with 25 percent of 45-year-olds and 20 percent of 75-year-olds.
  • 26 percent of 25-year-olds identified as conservative, compared with 36 percent of 45-year-olds and 41 percent of 75-year-olds.
  • Age 45 is roughly when the average person has drifted far enough right to no longer identify as liberal.

When we marry those political affiliations with the age of YouTube users, a different bias appears.

When asked why he robbed banks, Willie Sutton, a bank robber of the 1930s, reportedly answered, “I rob banks because that's where the money is.” The algorithms are not biased because of some insidious political agenda; they are tuned for attention and monetization, and for YouTube’s algorithms, the users and the money lie in the political left and center.


[1] A nonpartisan, nonprofit social science research and public policy organization

Source: YouTube’s recommendation algorithm is left-leaning in the United States, PNAS Nexus. DOI: 10.1093/pnasnexus/pgad264

Chuck Dinerstein, MD, MBA

Director of Medicine

Dr. Charles Dinerstein, M.D., MBA, FACS is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.
