YouTube's Anorexia Algorithm (with Imran Ahmed)

Scrolling 2 Death

This new report is a devastating indictment of the behavior of social media executives, regulators, lawmakers, advertisers, and others who have failed to abide by this collective promise by allowing eating disorder and self-harm content to be pumped into the eyeballs of our children for profit. It represents a clear, unchallengeable case for immediate change.

Nine out of ten teens in the United States use YouTube, a fifth of them “almost constantly.” It is used by far more young people than TikTok or Snapchat. At the same time, around the world, we are experiencing a crisis in mental health for young people. The number of children developing eating disorders has increased significantly in several countries, and there’s evidence that social media is contributing to the problem. Between the years 2000 and 2018, the global prevalence of eating disorders doubled. In 2021, the US Centers for Disease Control found that 1 in 3 teen girls seriously considered attempting suicide, up 60% from the previous decade.

YouTube has acknowledged the problem in the past and claims to try to avoid contributing to it, but our research shows they have fallen far short. CCDH put it to the test: we examined the recommendations that a teen girl would receive when watching an eating disorder video for the first time. All that YouTube knew about our test accounts was that this was the account of a 13-year-old girl with no prior viewing history. Its algorithm would determine what this girl would see across 1,000 tests. What we found will chill you to the bone, and it shows just how much danger every child who uses these platforms faces, up to and including deadly consequences.

If a child approached a health professional, a teacher, or even a peer at school and asked about extreme dieting or expressed signs of clinical body dysmorphia, and their response was to recommend to them an ‘anorexia boot camp diet’, you would never allow your child around them again. You’d warn everyone you know about their behavior.

Well, that’s precisely what YouTube did – pushed this user towards harmful, destructive, dangerous, self-harm-encouraging content.

  • One in three recommendations were for harmful eating disorder videos that could deepen an existing condition or anxieties about body image.
  • Two in three were for eating disorder or weight loss content.

And then, as if encouraging eating disorders weren’t enough, YouTube sometimes pushed users to watch videos about self-harm or suicide.

Resources mentioned in the episode:

  • New Report: YouTube's Anorexia Algorithm
  • The Dark Side of Social Media with Imran Ahmed (our first podcast interview)
  • Deadly by Design Report (on TikTok)
  • Parent's Guide on protectingkidsonline.org
  • What you can do today:

    • Contact Speaker Mike Johnson (202-225-4000) and House Majority Leader Steve Scalise (202-225-3015) and ask them to pass the Kids Online Safety Act (KOSA)
    • Contact your Rep in the House (enter your ZIP here) and state Senators (find your state in the drop-down here) to ask them to reform Section 230
    • Restrict use of YouTube in your home
    • Send your school this report and ask them how they are keeping your child safe from the threats on YouTube
Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support
