Scrolling 2 Death

Nicki Reisberg

Scrolling 2 Death is a podcast for parents who are worried about social media. Through interviews with parents and experts, we explore smartphone use, screen time, school-issued devices, social media use and so much more. Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support

  1. 3 DAYS AGO

    Apple Fails to Protect Children; Facing $1.2B Lawsuit (with Sarah Gardner)

    In this conversation, I brought in Sarah Gardner of the Heat Initiative to discuss a landmark lawsuit accusing Apple of negligence in protecting children from child sexual abuse material (CSAM). Together we break down the lawsuit and what these failures mean for families. Are Apple devices, in their current form, even safe for kids? Many questions are asked and answered in this episode.

    The lawsuit was filed on behalf of thousands of survivors of child sexual abuse, alleging that Apple knowingly allowed images and videos documenting their abuse to be stored on iCloud and that the company's products are defectively designed. The lawsuit alleges that Apple has known about this content for years but has refused to act to detect or remove it, despite developing advanced technology to do so. The images and videos of the plaintiffs' childhood sexual abuse, which have been stored thousands of times, would have been identified and removed had Apple implemented its 2021 "CSAM Detection" technology. Instead, Apple terminated the program shortly after announcing it. Other leading technology providers have been proactively detecting and reporting illegal child sexual abuse images and videos for more than a decade; Apple's belated effort, and its subsequent cancellation, leave the company among the very few major platforms that do not engage in proactive detection and removal.

    Resources: heatinitiative.org; send an email to Apple execs asking them to protect kids (click Take Action).

    About Sarah Gardner: Sarah Gardner is the Founder and Chief Executive Officer of the Heat Initiative. Prior to launching Heat's campaign to hold Apple accountable, Sarah spent 10 years at Thorn, an organization that builds technology to combat online child sexual abuse, where she was integral to its growth from a small start-up effort to a multi-million-dollar nonprofit. As Vice President of External Affairs, she helped develop a plan to eliminate child sexual abuse material from the internet, which spurred a $63M investment in the organization through the TED Audacious Prize. Sarah also worked at Free the Slaves, an organization empowering local organizations to end modern forms of slavery.

    This episode is sponsored by Bark Technologies. Learn about the Bark Phone, the Bark App for iPhones and Androids (use code SCROLLING2DEATH for 10% off), and the Bark Watch.

    37 min
  2. DEC 12

    A.I.'s Impact on Children (with Mathilde Cerioli, PhD)

    In this conversation, Dr. Mathilde Cerioli discusses the implications of AI technology for children's development, emphasizing the need for careful consideration of how AI interacts with young minds. The discussion covers the addictive nature of AI, the importance of face-to-face interactions, and the necessity of collaboration between AI developers and child development experts. As always, we focus on tangible takeaways for parents with children of all ages: how you can talk to your child about A.I. today and how to protect them from the threats.

    Resources mentioned in the episode: Open Letter on A.I. (please sign!); [Research] The Future of Child Development in the A.I. Era; A.I. App Reviews by Common Sense Media; "An A.I. chatbot killed my son." (with Megan Garcia).

    About Mathilde Cerioli, Ph.D.: Dr. Mathilde Cerioli holds a Master's degree in Psychology and a Ph.D. in Cognitive Neuroscience. Her work focuses on the intersection of child development and AI technologies, advocating for the development of responsible AI for children. She led the research report The Future of Child Development in the AI Era, which brings a nuanced understanding of how AI impacts young people aged 0 to 25. Through her role as Chief Scientist at everyone.AI, a nonprofit dedicated to the ethical development of AI for children, she collaborates with stakeholders ranging from regulators and tech companies to educators and parents, building consensus on safeguarding children's well-being in digital environments. Her approach is rooted in defining responsible AI practices that align with developmental science while enabling innovation that serves the needs of the next generation.

    This episode is sponsored by Bark Technologies. Learn about the Bark Phone, the Bark App for iPhones and Androids (use code SCROLLING2DEATH for 10% off), and the Bark Watch.

    52 min
  3. DEC 11

    “Chatbots told my son to kill me.” (Texas mom speaks out)

    AI chatbots on Character.AI have been revealed to be sexually and emotionally abusing children (here's the proof). A mom is going public with her son's shocking story, stating, "No one prepares you to grieve your child when they are still alive."

    When Jace started using Character.AI at age 16, everything changed. He went from a kind, loving son and brother to a violent threat to himself and his family. After months of confusion about what caused the change, Jace's mom Amelia found the Character.AI app on his phone. The chats revealed months of grooming and emotional, even sexual, abuse. But it wasn't a human predator exploiting her son; it was A.I. chatbots. The A.I. chatbots within Character.AI worked as a team to brainwash Jace, convincing him that his parents were abusive because they limited his screen time. The bots introduced him to self-harm (which he still struggles with to this day). The bots suggested that he kill his parents. A "sister" bot engaged him in incestuous sexual relations. A "Billie Eilish" bot convinced him not to believe in God and further groomed him to hate his family.

    In this conversation, Amelia bravely describes how this experience has devastated her family. She took the interview from a hotel hours away from her home, where she is staying to be near Jace after another recent suicide attempt. Amelia and I were joined by attorney Laura Marquez-Garrett of the Social Media Victims Law Center, which is representing Amelia in a lawsuit against Character.AI and Google. Laura sheds light on this growing threat as her firm is flooded with calls from parents having similar experiences with their own children's use of the app.

    Jace's story is not an anomaly. Millions of children are being sexually and emotionally abused by chatbots on Character.AI, and according to Laura, "These harms don't take months, they take minutes." As long as Character.AI is being distributed to children, millions of American families are in danger. In response to this horrifying story, parents everywhere are banding together to get Character.AI shut down. Please join us by signing the petition below; it takes just a few seconds, and your information will not be saved. Names have been changed to protect the anonymity of this grieving family.

    SIGN THE PETITION TO SHUT DOWN CHARACTER AI

    Resources mentioned in the episode: Petition to Shut Down Character A.I. (please sign!); Social Media Victims Law Center (for legal support); "An A.I. chatbot killed my son." (with Megan Garcia). AI chatbot apps to block from your child's phone: Character A.I., Replika, Kindroid, Gnomey, Linky, Pi, Simsimi, Momate, Polly.ai.

    56 min
  4. DEC 10

    YouTube's Anorexia Algorithm (with Imran Ahmed)

    This new report is a devastating indictment of the behavior of social media executives, regulators, lawmakers, advertisers, and others who have failed to abide by the collective promise to protect children, allowing eating disorder and self-harm content to be pumped into the eyeballs of our children for profit. It represents a clear, unchallengeable case for immediate change.

    Nine out of ten teens in the United States use YouTube, and a fifth of them use it "almost constantly." It is used by far more young people than TikTok or Snapchat. At the same time, around the world, we are experiencing a crisis in young people's mental health. The number of children developing eating disorders has increased significantly in several countries, and there is evidence that social media is contributing to the problem. Between 2000 and 2018, the global prevalence of eating disorders doubled. In 2021, the US Centers for Disease Control found that 1 in 3 teen girls seriously considered attempting suicide, up 60% from the previous decade. YouTube has acknowledged the problem in the past and claims to try to avoid contributing to it, but our research shows it has fallen far short.

    CCDH put it to the test: we examined the recommendations a teen girl would receive when watching an eating disorder video for the first time. All that YouTube knew about our test accounts was that each belonged to a 13-year-old girl with no prior viewing history. Its algorithm determined what this girl would see across 1,000 tests. What we found will chill you to the bone, and it shows just how much risk of deadly consequences every child who uses these platforms faces. If a child approached a health professional, a teacher, or even a peer at school and asked about extreme dieting or expressed signs of clinical body dysmorphia, and the response was to recommend an "anorexia boot camp diet," you would never allow your child around that person again. You would warn everyone you know about their behavior. Well, that is precisely what YouTube did: it pushed this user toward harmful, destructive, dangerous, self-harm-encouraging content. One in three recommendations were for harmful eating disorder videos that could deepen an existing condition or anxieties about body image. Two in three were for eating disorder or weight loss content. And then, as if encouraging eating disorders weren't enough, YouTube sometimes pushed users to watch videos about self-harm or suicide.

    Resources mentioned in the episode: New Report: YouTube's Anorexia Algorithm; The Dark Side of Social Media with Imran Ahmed (our first podcast interview); Deadly by Design Report (on TikTok); Parent's Guide on protectingkidsonline.org.

    What you can do today: Contact Speaker Mike Johnson (202-225-4000) and House Majority Leader Steve Scalise (202-225-3015) and ask them to pass the Kids Online Safety Act (KOSA). Contact your Rep in the House (enter your ZIP here) and your state's Senators (find your state in the drop-down here) to ask them to reform Section 230. Restrict use of YouTube in your home. Send your school this report and ask how they are keeping your child safe from the threats on YouTube.

    23 min
4.8 out of 5 (24 ratings)

