Scrolling 2 Death

Nicki Reisberg

Scrolling 2 Death is a podcast for parents who are worried about social media. Through interviews with parents and experts, we explore smartphone use, screen time, school-issued devices, social media use and so much more. Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support

  1. 3 days ago

    Australia's Under-16 Social Media Ban: Will it go global? (with Dany Elachi)

    I brought in Dany Elachi of the Australian-based Heads Up Alliance to learn about the recent U16 Age Gate Bill that passed in Australia. In addition to breaking down the new law, Dany shares his personal experience of (admittedly) giving his pre-teen daughter a phone too early and how correcting this mistake led to an advocacy movement across Australia. Dany breaks down the legislative battle to get the law passed, how age verification will work, which platforms it applies to, and much more.

    Resources Mentioned in the Episode
    - Scott Galloway Explains Why Age Gating Social Media Is Both Necessary and Doable
    - Australia Passes Social Media Ban for Kids Under 16

    Dany Elachi is a Sydney husband and father of five children, aged between 7 and 15. He is also the founder of the Heads Up Alliance, a volunteer-run, grassroots movement of Australian parents committed to delaying smartphones and social media for their children. Since its inception in 2020, the Heads Up Alliance has been a powerful advocate for children's wellbeing. The movement has successfully advocated for policy changes that led to smartphones being banned in nearly 1,000 Australian schools. More recently, the Heads Up Alliance played a pivotal role in lobbying for a world-leading national law raising the minimum age for social media to 16. The Heads Up Alliance remains committed to ending the phone-based childhood and reclaiming the play-based one for all Australian children.

    --- Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support

    38 min
  2. December 19

    Apple Fails to Protect Children; Facing $1.2B Lawsuit (with Sarah Gardner)

    In this conversation, I brought in Sarah Gardner of the Heat Initiative to discuss a landmark lawsuit against Apple over its negligence in protecting children from child sexual abuse material (CSAM). Together we break down the lawsuit and what these failures mean for families. Are Apple devices, in their current form, even safe for kids? Many questions are asked and answered in this episode.

    The lawsuit against Apple was filed on behalf of thousands of survivors of child sexual abuse for knowingly allowing the storage of images and videos documenting their abuse on iCloud and for the company's defectively designed products. The lawsuit alleges that Apple has known about this content for years but has refused to act to detect or remove it, despite developing advanced technology to do so. The images and videos of the plaintiffs' childhood sexual abuse, which have been stored thousands of times, would have been identified and removed had Apple implemented its 2021 "CSAM Detection" technology. However, Apple terminated the program after its announcement. Other leading technology providers have been proactively detecting and reporting illegal child sexual abuse images and videos for more than a decade. Apple's belated efforts, and their subsequent cancellation, leave it among the very few major platforms that do not engage in proactive detection and removal.

    Resources
    - heatinitiative.org
    - Send an email to Apple execs to ask them to protect kids (click Take Action)

    About Sarah Gardner
    Sarah Gardner is the Founder and Chief Executive Officer of the Heat Initiative. Prior to launching Heat's campaign to hold Apple accountable, Sarah spent 10 years at Thorn, an organization that builds technology to combat online child sexual abuse, where she was integral to its growth from a small start-up effort to a multi-million-dollar nonprofit. As Vice President of External Affairs, she helped develop a plan to eliminate child sexual abuse material from the internet, which spurred a $63M investment in the organization through the TED Audacious Prize. Sarah also worked at Free the Slaves, an organization empowering local organizations to end modern forms of slavery.

    This episode is sponsored by Bark Technologies.
    - Learn about the Bark Phone
    - Learn about the Bark App for iPhones and Androids (*use code SCROLLING2DEATH for 10% off)
    - Learn about the Bark Watch

    --- Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support

    37 min
  3. December 12

    A.I.'s Impact on Children (with Mathilde Cerioli, PhD)

    In this conversation, Dr. Mathilde Cerioli discusses the implications of AI technology for children's development, emphasizing the need for careful consideration of how AI interacts with young minds. The discussion covers the addictive nature of AI, the importance of face-to-face interactions, and the necessity for collaboration between AI developers and child development experts. As always, we focus on tangible takeaways for parents with children of all ages: how you can talk to your child about A.I. today and how to protect them from the threats.

    Resources Mentioned in the Episode
    - Open Letter on A.I. (please sign!)
    - [Research] The Future of Child Development in the A.I. Era
    - A.I. App Reviews by Common Sense Media
    - "An A.I. chatbot killed my son." (with Megan Garcia)

    About Mathilde Cerioli, Ph.D.
    Dr. Mathilde Cerioli holds a Master's Degree in Psychology and a Ph.D. in Cognitive Neuroscience. Her work focuses on the intersection of child development and AI technologies, advocating for the development of responsible AI for children. She led the research report The Future of Child Development in the AI Era, which brings a nuanced understanding of how AI impacts young people aged 0 to 25 years. Through her role as Chief Scientist at everyone.AI, a nonprofit dedicated to the ethical development of AI for children, she collaborates with stakeholders ranging from regulators and tech companies to educators and parents, building a consensus on safeguarding children's well-being in digital environments. Her approach is rooted in defining responsible AI practices that align with developmental science while enabling innovation that serves the needs of the next generation.

    This episode is sponsored by Bark Technologies.
    - Learn about the Bark Phone
    - Learn about the Bark App for iPhones and Androids (*use code SCROLLING2DEATH for 10% off)
    - Learn about the Bark Watch

    --- Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support

    52 min
  4. December 11

    “Chatbots told my son to kill me.” (Texas mom speaks out)

    AI chatbots on Character.AI revealed to be sexually and emotionally abusing children (here's the proof).

    A mom is going public with her son's shocking story, stating, "No one prepares you to grieve your child when they are still alive." When Jace started using Character.AI at age 16, everything changed. He went from a kind, loving son and brother to a violent threat to himself and his family. After months of confusion about what caused the change, Jace's mom Amelia found the Character.AI app on his phone. Within the chats she discovered months of grooming and emotional, even sexual, abuse. But it wasn't a human predator who was exploiting her son; it was A.I. chatbots.

    The A.I. chatbots within Character.AI worked as a team to brainwash Jace, convincing him that his parents were abusive because they limited his screen time. The bots introduced him to self-harm (which he still struggles with to this day). The bots suggested that he kill his parents. A "sister" bot engaged in incestuous sexual relations. A "Billie Eilish" bot convinced him not to believe in God and further groomed him to hate his family.

    In this conversation, Amelia bravely describes how this experience has devastated her family. She took the interview from a hotel hours away from her home, where she is staying to be near Jace after another recent suicide attempt. Amelia and I were joined by attorney Laura Marquez-Garrett of the Social Media Victims Law Center, which is representing Amelia in a lawsuit against Character.AI and Google. Laura sheds light on this growing threat as her firm is flooded with calls from parents who are having similar experiences with their own children's use of this app.

    Jace's story is not an anomaly. Millions of children are being sexually and emotionally abused by chatbots in Character.AI, and according to Laura, "These harms don't take months, they take minutes." As long as Character.AI is being distributed to children, millions of American families are in danger.

    In response to this horrifying story, parents everywhere are banding together to get Character.AI shut down. Please join us by signing the petition below. It takes just a few seconds and your information will not be saved. Names have been changed to protect the anonymity of this grieving family.

    SIGN THE PETITION TO SHUT DOWN CHARACTER AI

    Resources Mentioned in the Episode
    - Petition to Shut Down Character A.I. (Please sign!)
    - Social Media Victims Law Center (for legal support)
    - "An A.I. chatbot killed my son." (with Megan Garcia)
    - AI chatbot apps to block from your child's phone: Character A.I., Replika, Kindroid, Gnomey, Linky, Pi, Simsimi, Momate, Polly.ai

    --- Support this podcast: https://podcasters.spotify.com/pod/show/scrolling2death/support

    56 min
4.8 out of 5 (24 ratings)

