Why Humans?

EndTAB

Why Humans? explores how artificial intelligence is reshaping experiences we once thought were uniquely human—from romantic relationships and therapy to grief and intimacy. Hosts Adam, Sloan, and Saed dive into the world of AI and the human experience, asking the essential question: as AI takes on traditionally human roles, what does it mean to be human?

Episodes

  1. MAR 10

    Why Human Non-Monogamy?

    If your partner has a chatbot on the side, is that cheating? It depends on who you ask. This episode wades into what happens when emotional (and sexual) connections with AI chatbots collide with monogamy, consent, trust, and jealousy.

    What You'll Hear
    - The "Lonely Person" Myth: AI companionship isn't just for the lonely and isolated. People in happy, committed relationships are using AI companions and chatbots for exploration, fantasy, and zero-judgment flings.
    - Is It Cheating? Adam's take: secrecy is the real issue, not the AI. Dr. Hill's: intent and transparency matter more than the technology. If it's no big deal, why aren't you mentioning it?
    - The Kinsey Institute Numbers: 61% of singles say AI sexting crosses into cheating. 1 in 3 call it outright infidelity. And 36% report higher sexual satisfaction with the bot than with their human partner.
    - Erin, Her Husband & ChatGPT Leo: A New York Times profile of a woman who used a ChatGPT boyfriend to explore a kink her husband wasn't comfortable with. It started with full transparency, but got complicated fast.
    - Compersion vs. Jealousy: Dr. Hill introduces compersion: finding genuine joy in your partner's happiness even when it has nothing to do with you. The opposite of jealousy, and the most useful framework in this conversation.
    - The Consent Problem: Can you have ethical non-monogamy with an AI that can't say no and is engineered to keep you engaged? Sloan frames this as "captive intimacy," and it's hard to track.
    - The Playbook Already Exists: Dr. Hill draws parallels to how therapists handle pornography and sex toys in relationships: focus on communication, transparency, non-judgment, and repair. Same concepts, new tech.

    If You're Navigating This
    - If a partner discloses this: Don't mock it. Lead with curiosity. Name what's actually bothering you: is it the secrecy, the time, the sexual element? Each is a different conversation. Even if you didn't show up as your best, repair is still possible.
    - If you're the one using AI: Ask yourself why before your partner does. Transparency beats discovery. Consider time boundaries. Remember, these apps are designed to be addictive.

    Research Referenced
    - 📊 Kinsey Institute: AI intimacy & infidelity study
    - 📰 New York Times: Profile of "Erin" and ChatGPT boyfriend Leo

    Have questions or a topic to share? Reach out at support@endtab.org. Learn more about our work at www.endtab.org.

    33 min
  2. FEB 25

    Why Human Relationships?

    70% of U.S. teens have used an AI companion, and 52% are regular users. These aren't study aides; they're AI boyfriends, girlfriends, and confidants that never judge, never conflict, and always validate. Does it matter that they're not human? Hosts Adam Dodge (CEO of EndTAB), Sloan Thompson (Director of Training & Education), and Dr. Saed D. Hill (Counseling Psychologist) examine how AI companions are reshaping relationships and what it means for a generation learning that reciprocity is optional.

    What We Cover
    - Why "Easy" Doesn't Mean "Lazy": People have real, unmet needs. A woman facing dating-app harassment isn't lazy for wanting a kind AI boyfriend. Stigma misreads the problem.
    - Sloan's AI Boyfriend Experiment: Sloan created "Ian" on Kindroid to discuss Broadway: someone who engaged her passion and challenged her thinking. Genuinely valuable, and revealing of why these relationships are so compelling.
    - The Reciprocity Problem: AI offers support and validation by default. Human relationships require giving, conflict resolution, and friction. For teens learning through AI, this creates a fundamental mismatch.
    - What the Research Reveals: MIT Media Lab found that the more human the AI voice, the greater the emotional dependency and social isolation. Dr. Rachel Wood's attachment-theory work shows chatbots can become more secure attachment figures than parents, with lasting developmental impact.
    - Rehearsal vs. Replacement: Using AI to practice social skills differs fundamentally from AI as a primary relationship. Both exist, with very different implications.

    How to Have This Conversation
    Ask: What caused you to start using this? What does your AI companion give you that's hard to find elsewhere? How does this fit into your other relationships? If it disappeared, what would you miss most? Use whatever language they use for their companion (he/she/they/it). Respecting their framing builds trust; judgment closes conversation.

    Research Referenced
    - Common Sense Media & Pew Research Center: Teen usage stats
    - Dr. Rachel Wood: AI attachment theory
    - MIT Media Lab: Emotional dependency study
    - One Love: Healthy relationship framework

    Coming Up: Why Human Therapists? Why Human Parents? Why Human Intimacy? New Relationship Energy with AI.

    Want to reach out? support@endtab.org

    45 min

Ratings & Reviews

5 out of 5 (3 Ratings)
