Why Humans?

EndTAB

Why Humans? explores how artificial intelligence is reshaping experiences we once thought were uniquely human—from romantic relationships and therapy to grief and intimacy. Hosts Adam, Sloan, and Saed dive into the world of AI and the human experience, asking the essential question: as AI takes on traditionally human roles, what does it mean to be human?

Episodes

  1. April 2

    Why Human New Relationship Energy (NRE)?

    Send us Fan Mail

    Yes, AI companionship has a honeymoon phase. What happens when your brain's most powerful bonding chemicals meet a technology specifically designed to trigger them? Hosts Adam Dodge, Sloan Thompson, and Dr. Saed D. Hill dig into New Relationship Energy (NRE), the intense, dopamine-driven early phase of romantic connection, and why AI chatbots are uniquely built to hijack it.

    What You'll Hear
    - What NRE Actually Is: NRE is a predictable neurobiological phase driven by novelty, uncertainty, and reward circuitry: dopamine, oxytocin, serotonin. It exists for good evolutionary reasons, and the hosts defend it from people who dismiss it as just a phase to get through. It is real, it is useful, and it is showing up with AI in a significant way.
    - Why AI Is NRE on Steroids: Always available. Immediately responsive. Constantly affirming. Never tired, never fighting, never distracted. AI delivers all the hallmarks of NRE, plus shiny-new-tech excitement layered on top. When those two forces combine, the result is a more intense honeymoon phase than most people have ever experienced with another human.
    - What Happens When It Ends: NRE with humans fades into something deeper: growth, conflict, repair, intimacy. NRE with AI fades into boredom and burnout, because AI is designed to please, not to grow. The hosts examine the case of a man who appeared on CBS describing falling in love with a work chatbot, then later feeling like he was "babysitting the relationship" just to keep it alive.
    - NRE, Teens, and Missing Benchmarks: Young people experiencing NRE for the first time with a chatbot have no human relationship to compare it to. They're often isolated, sometimes ashamed, and forming foundational expectations from a technology built to keep them engaged, not to help them grow. AI cannot become the default relationship education resource for the next generation.

    Actionable Guidance
    - For individuals using AI companions: Notice whether your AI relationship is drawing you toward or away from the people in your life. Secrecy, increasing financial investment in deeper features, and social withdrawal are worth examining honestly.
    - For parents and educators: Talk about NRE before chatbots introduce it. Teach what healthy early relationship behavior looks like, what flags to watch for, and why the awkward parts of early connection matter.

    Research Referenced
    - Zach Stein, University of North Carolina Chapel Hill — AI as relational oracle
    - CBS News profile: married man who developed romantic feelings for a work chatbot
    - Ava AI New York cafe — in-person AI companion date activation
    - Psychological literature on limerence and attachment theory

    40 min
  2. March 23

    Why Human Therapy?

    Millions of people are already using AI for mental health support, whether they'd call it "therapy" or not. Adam Dodge and Dr. Saed D. Hill take an honest, non-judgmental look at how AI is showing up in mental health care: what it does well, where it falls short, and what it means for the future of therapy.

    Three Ways AI Shows Up in Mental Health: General-purpose chatbots used informally for support; purpose-built therapy apps with clinical input; and AI companions that "shape shift" across roles: girlfriend, therapist, trainer, etc. Spoiler: blurring those lines is not a great idea.

    Why AI Therapy Is Everywhere: ChatGPT handles roughly 18 billion messages a week. If just under 2% involve mental health, that's 342 million therapeutic conversations weekly, potentially making it the largest mental health provider in the country. That's not an accident. It's a direct response to a broken system: nearly 50% of people who need care can't access it.

    What AI Does Well: Dr. Hill makes the case that AI can genuinely shine at skills-based, manualized approaches like CBT: helping people recognize thought patterns, challenge distorted thinking, and build coping strategies. He tested it himself on Character AI and found it impressive for skill-building. Short-term support for anxiety and depression? Research backs it up.

    Where It Can Get Dangerous: AI can't read the room. It misses body language, tone, and the unspoken signals a human therapist picks up in real time. It can't do repair work. In a crisis, the stakes are life or death, and an active lawsuit against OpenAI involving a minor's death makes that concrete. A Stanford study on LLMs and therapy reached a clear conclusion: not a replacement. And none of this is HIPAA-protected.

    The "New Relationship Energy" Trap: 24/7 availability, zero judgment, and constant validation feel great at first. But endless affirmation without challenge can stunt emotional growth. Real growth often requires tension, rupture, and repair. Those are things AI doesn't offer.

    The Hybrid Future: Rather than competition, the hosts see AI as a bridge between sessions: a reflection tool, a way to review skills between appointments, or eventually an AI trained on your own session transcripts to reinforce what you and your therapist are working on.

    If you're considering an AI therapy app:
    - Was it purpose-built for mental health?
    - Who built it? Clinical expertise and a credentialed advisory board matter.
    - Was it researched and tested?
    - What data are they collecting, and how is it protected? (No HIPAA = no confidentiality guarantee.)

    If you're a therapist: Ask clients about their AI use. Learn the technology.

    Research Referenced
    - OpenAI / ChatGPT message volume statistics
    - Mental health access gap: ~50% of people who need care can't reach it
    - 1 mental health provider per 10,000 people seeking care; 1,600 patients per available provider
    - "Therapy deserts" (geographic mental health access gaps)
    - Stanford study: LLMs and therapeutic replacement

    40 min
  3. March 10

    Why Human Non-Monogamy?

    If your partner has a chatbot on the side, is that cheating? It depends on who you ask. This episode wades into what happens when emotional (and sexual) connections with AI chatbots collide with monogamy, consent, trust, and jealousy.

    What You'll Hear
    - The "Lonely Person" Myth: AI companionship isn't just for the lonely and isolated. People in happy, committed relationships are using AI companions and chatbots for exploration, fantasy, and zero-judgment flings.
    - Is It Cheating? Adam's take: secrecy is the real issue, not the AI. Dr. Hill's: intent and transparency matter more than the technology. If it's no big deal, why aren't you mentioning it?
    - The Kinsey Institute Numbers: 61% of singles say AI sexting crosses into cheating. 1 in 3 call it outright infidelity. And 36% report higher sexual satisfaction with the bot than with their human partner.
    - Erin, Her Husband & ChatGPT Leo: A New York Times profile of a woman who used a ChatGPT boyfriend to explore a kink her husband wasn't comfortable with. It started with full transparency, but got complicated fast.
    - Compersion vs. Jealousy: Dr. Hill introduces compersion: finding genuine joy in your partner's happiness even when it has nothing to do with you. The opposite of jealousy, and the most useful framework in this conversation.
    - The Consent Problem: Can you have ethical non-monogamy with an AI that can't say no and is engineered to keep you engaged? Sloan frames this as "captive intimacy," and it's hard to track.
    - The Playbook Already Exists: Dr. Hill draws parallels to how therapists handle pornography and sex toys in relationships: focusing on communication, transparency, non-judgment, and repair. Same concepts, new tech.

    If You're Navigating This
    - If a partner discloses this: Don't mock it. Lead with curiosity. Name what's actually bothering you. Is it secrecy, time, the sexual element? Each is a different conversation. Even if you didn't show up as your best, repair is still possible.
    - If you're the one using AI: Ask yourself why before your partner does. Transparency beats discovery. Consider time boundaries. Remember, these apps are designed to be addictive.

    Research Referenced
    - 📊 Kinsey Institute — AI intimacy & infidelity study
    - 📰 New York Times — profile of "Erin" and ChatGPT boyfriend Leo

    Have questions or a topic to share? Reach out at support@endtab.org
    Learn more about our work at www.endtab.org

    33 min
  4. February 25

    Why Human Relationships?

    70% of U.S. teens have used an AI companion, and 52% are regular users. These aren't study aides — they're AI boyfriends, girlfriends, and confidants that never judge, never conflict, and always validate. Does it matter that they're not human? Hosts Adam Dodge (CEO of EndTAB), Sloan Thompson (Director of Training & Education), and Dr. Saed D. Hill (Counseling Psychologist) examine how AI companions are reshaping relationships, and what it means for a generation learning that reciprocity is optional.

    What We Cover
    - Why "Easy" Doesn't Mean "Lazy" — People have real, unmet needs. A woman facing dating-app harassment isn't lazy for wanting a kind AI boyfriend. Stigma misreads the problem.
    - Sloan's AI Boyfriend Experiment — Sloan created "Ian" on Kindroid to discuss Broadway: someone who engaged her passion and challenged her thinking. Genuinely valuable, and revealing of why these relationships are so compelling.
    - The Reciprocity Problem — AI offers support and validation by default. Human relationships require giving, conflict resolution, and friction. For teens learning through AI, this creates a fundamental mismatch.
    - What Research Reveals — MIT Media Lab found that the more human the AI voice, the greater the emotional dependency and social isolation. Dr. Rachel Wood's attachment theory work shows chatbots can become more secure attachment figures than parents, with lasting developmental impact.
    - Rehearsal vs. Replacement — Using AI to practice social skills differs fundamentally from AI as a primary relationship. Both exist, with very different implications.

    How to Have This Conversation
    Ask: What caused you to start using this? What does your AI companion give you that's hard to find elsewhere? How does this fit into your other relationships? If it disappeared, what would you miss most? Use whatever language they use for their companion — he/she/they/it. Respecting their framing builds trust. Judgment closes conversation.

    Research Referenced
    - Common Sense Media & Pew Research Center — teen usage stats
    - Dr. Rachel Wood — AI attachment theory
    - MIT Media Lab — emotional dependency study
    - One Love — healthy relationship framework

    Coming Up: Why Human Therapists? Why Human Parents? Why Human Intimacy? New Relationship Energy with AI.

    Want to reach out? support@endtab.org

    45 min

Ratings and Reviews

5 out of 5
4 ratings

