The Terms of Service Podcast

Mary Camacho

Subscribe, rate, and share to support the show on Apple Podcasts, Spotify, Pocket Casts, or wherever you listen. Follow us on LinkedIn for updates and join the conversation.

Welcome to “Terms of Service,” the podcast that dives deep into the fine print of our digital lives. Every time we check the box on an app, website, or online service, we’re making choices—often without knowing the full story. From giving away our privacy to navigating complex security settings, we’re all part of a digital landscape that’s constantly evolving. Join us as we unpack the themes that shape our online experiences: privacy, security, safety, and the everyday permissions we grant without a second thought. We’ll explore how AI, agency, and decentralized technologies are reshaping our digital world, often in ways that fly under the radar. And because no conversation about our digital lives would be complete without it, we’ll tackle the legal and policy implications that come with our clicks, swipes, and taps. Whether you’re tech-savvy or just trying to keep up, “Terms of Service” invites you to join the conversation about the hidden costs of convenience in the digital age. Tune in to explore, question, and rethink the terms we so often accept, and let’s challenge the norms of our digital lives together.

Credits
Produced by Mary Camacho & Nicole Klau Ibarra. Music and sound production by Arthur Vincent at Sonorlab.

Behind the Mic
Co-founder of Holochain and CEO of Holo, Mary leads the development of peer-to-peer and decentralised technologies that empower users and redefine digital interactions. With over 20 years in tech and telecom, her career has been dedicated to enhancing user control, privacy, and digital autonomy. Mary's background as a social scientist grounds her explorations at the intersection of sociality and technology, exposing the trade-offs in privacy, security, and agency inherent in our digital choices. On “Terms of Service,” she invites listeners to rethink these everyday interactions and the broader implications of AI, distributed tech, and legal frameworks on our digital lives, advocating for a future where individuals have greater control over their data and decisions. https://www.linkedin.com/in/maryfcamacho/

Nicole is a visionary entrepreneur with a diverse background. She is passionate about social system design and has helped build multiple ventures, including the IKIGAI Project, her non-profit helping people develop essential 21st-century skills, many of which intersect with the topics discussed in Terms of Service. https://www.linkedin.com/in/nicole-klau-ibarra-b26818137/

Arthur Vincent is a seasoned music and audio producer with a passion for pushing the boundaries of music technology. As a music producer and sound designer, he has crafted innovative audio experiences for global brands like Heineken, Philips, and Cupra. Alongside his creative work, Arthur is an expert in audio technology, mastering both hardware and software tools to deliver high-quality, immersive sound. https://www.linkedin.com/in/arthur-vincent/

  1. Environments Are Not Neutral: Biology, Burnout, and the Design of Work with Dr. Elizabeth C. Nelson

    FEB 25

    Environments Are Not Neutral: Biology, Burnout, and the Design of Work with Dr. Elizabeth C. Nelson

    In this episode of Terms of Service, host Mary Camacho speaks with Dr. Elizabeth C. Nelson, biomedical engineer and founder of Learn, Adapt, Build, about a deceptively simple idea: environments are not neutral. From open office layouts to wearable wellness metrics, the spaces and systems we design encode assumptions about who they are built for—and who must adapt to survive inside them. Drawing from her own burnout experience and years of research bridging academia and practice, Elizabeth explains how modern workplaces often optimize for the most resilient minority rather than the majority. They explore how environmental design affects stress, cognition, sleep, and performance; why high performers are often the first to hit the wall; and how leadership teams can make practical, measurable changes that improve both well-being and output. This conversation extends this season’s focus on health technology governance into the physical workplace itself, asking how environmental measurement, workplace design, and performance metrics can either support human thriving or quietly optimize for institutional control.

    Key Takeaways
    - Environments are not neutral. Physical layout, lighting, noise, air quality, and collaboration norms encode assumptions about the “default worker.”
    - We do not design for the average. Many workplaces are optimized for the most resilient employees, not the most sensitive—despite evidence that designing for the sensitive improves outcomes for everyone.
    - Focus is biologically powerful. Deep work and flow states (often lasting 60–90 minutes) support cognitive performance and emotional regulation. Constant interruption erodes both.
    - Burnout is not a binary. It develops over time and often affects high performers who overextend without adequate recovery.
    - Measurement can validate—or destabilize. Environmental sensors and wearables can reconnect people to their bodies, but poorly framed metrics can create shame or disconnect (as seen in early 10,000-step tracking experiences).
    - Small structural changes matter. Separating deep-focus roles from interruption-heavy roles, improving air quality transparency, and removing unnecessary management friction can significantly improve performance and morale.

    Topics Covered / Timestamped Sections
    00:00 – Season framing: architecture, wellness technology, and why environments matter
    04:00 – Burnout as origin story and the shift from academia to workplace research
    06:00 – Open offices, evolutionary biology, and why protection and cover matter
    12:00 – The cultural loss of focus and the cost of constant collaboration
    19:00 – Burnout as a gray zone and the biological role of sleep
    22:00 – Wearables, recalibrating step goals, and the psychology of measurement
    27:00 – Air quality sensors, transparency, and the “Butterfly Air” example
    33:00 – Designing for the most sensitive rather than the most resilient
    49:00 – Case study: separating engineers from interruption-driven roles
    56:00 – Leading with biology: why design becomes easier when aligned with human instincts

    Guest Bio and Links
    Dr. Elizabeth C. Nelson is a biomedical engineer, researcher, and founder of Learn, Adapt, Build. Her work bridges scientific research and real-world application, focusing on workplace design, burnout prevention, environmental measurement, and biological alignment. She advises leadership teams and organizations on how to create spaces that support focus, recovery, and sustainable performance.
    Website: https://learnadaptbuild.com
    LinkedIn: https://www.linkedin.com/in/drelizabethnelson/
    Book: The Healthy Office Revolution

    Resources Mentioned
    - The Healthy Office Revolution by Dr. Elizabeth C. Nelson
    - Atomic Habits (referenced in discussion about incremental change)
    - Smart Building Collective (Elizabeth’s professional affiliation)
    - Workplace environmental testing (CO₂, air quality, light disturbance measurement)

    Call to Action
    If environments are not neutral, then design is a form of leadership. What assumptions are encoded in your workplace—or in the technologies you use every day? This episode invites you to rethink performance, burnout, and biology through the lens of space itself.
    🎧 Listen now: https://termsofservice.xyz/

    Credits
    Host: Mary Camacho
    Guest: Dr. Elizabeth C. Nelson
    Produced by Terms of Service Podcast
    Sound Design: Arthur Vincent and Sonor Lab
    Co-Producers: Nicole Klau Ibarra & Mary Camacho

    52 min
  2. Femtech's Reckoning: Privacy, Power, and Protection in Health Technology with Soribel Feliz

    JAN 29

    Femtech's Reckoning: Privacy, Power, and Protection in Health Technology with Soribel Feliz

    In this episode of Terms of Service, host Mary Camacho speaks with Soribel Feliz—AI governance and tech policy advisor—about the dangerous gaps between what health technology promises and what privacy law actually protects. Drawing from her experience advising on AI and emerging tech across the U.S. Senate, federal government, and Big Tech, Soribel examines how femtech and wellness apps claim to empower women while selling their most intimate data and leaving them vulnerable to law enforcement. This conversation starts with a moment at a pitch competition: a pregnancy app founder dismissed a question about law enforcement access with "we're HIPAA compliant" and turned away. That turning away from hard questions reveals the problem. As 210 pregnant women face criminal charges built on data from apps that promised empowerment, this episode asks: what would it take to build health technology that actually protects the people who use it?

    Key Takeaways
    - HIPAA compliance doesn't mean privacy protection. Most consumer health apps aren't covered entities and can sell your data freely.
    - Your health data is being sold without meaningful consent. Period tracking apps sold location data to anti-abortion organizations to target women visiting Planned Parenthood.
    - Pregnancy loss can become criminal evidence. 210 pregnant women faced criminal charges involving app data in the first year after Dobbs—HIPAA offered zero protection.
    - Compliance ≠ actual protection. Checking regulatory boxes doesn't mean users are safe. Founders and investors must ask harder questions.
    - Algorithms are personal. From hiring discrimination to insurance denials, AI systems make intimate decisions about people's lives with little transparency.

    Topics Covered / Timestamped Sections
    02:40 – The pitch competition moment: when "HIPAA compliant" became a shield against accountability
    04:31 – What HIPAA actually does and doesn't do
    08:49 – Period tracker data sold to Wisconsin Right to Life for anti-abortion targeting
    16:30 – Why consumer health apps aren't covered by HIPAA
    20:15 – Law enforcement access and pregnancy loss as criminal evidence
    32:18 – ChatGPT Health and the risks of sharing complete medical records
    37:50 – What Soribel learned in the Senate, at Meta, and at Microsoft
    43:45 – Algorithms are personal: Workday's hiring discrimination lawsuit
    46:23 – Advice for founders: put your money where your mouth is
    50:26 – Follow Soribel's work on LinkedIn, Substack, and YouTube

    Guest Bio and Links
    Soribel Feliz – AI governance and tech policy advisor with experience advising on AI and emerging tech across the U.S. Senate, federal government, and Big Tech. She focuses on how AI systems create legal, ethical, and operational risk, especially in health tech and femtech, and how organizations can govern AI responsibly at scale. Her work highlights where privacy law and AI governance fall short and why robust governance frameworks matter now.
    LinkedIn: https://www.linkedin.com/in/soribelfeliz/
    Substack: https://soribelfeliz.substack.com/
    YouTube: Algorithms are Personal - https://studio.youtube.com/channel/UC4gbKlwc5VQ6AeCq7kH0ZHA

    Resources Mentioned
    - "So You Got the Privacy Officer Title - Now What?" by Teresa "T" Froester-Falk
    - Near Intelligence / Wisconsin Right to Life – Senator Wyden's investigation into location data sales
    - Mobley v. Workday – AI hiring discrimination lawsuit
    - Pregnancy Justice Report – 210 criminal charges post-Dobbs

    Further Reading / Related Episodes
    - Soribel's "Femtech Reckoning" series on Substack

    Call to Action
    What does it mean to build health technology that actually protects the people who use it? Soribel Feliz offers a clear-eyed examination of where femtech is failing—and what it would take for founders, investors, and policymakers to ask the hard questions.

    Credits
    Host: Mary Camacho
    Guest: Soribel Feliz
    Produced by Terms of Service Podcast
    Sound Design: Arthur Vincent and Sonor Lab
    Co-Producers: Nicole Klau Ibarra & Mary Camacho

    52 min
  3. Changing Minds and Making Space: Curiosity, Emotion, and Democracy with Dr. Sarah Stein Lubrano

    11/26/2025

    Changing Minds and Making Space: Curiosity, Emotion, and Democracy with Dr. Sarah Stein Lubrano

    In this episode of Terms of Service, host Mary Camacho speaks with Dr. Sarah Stein Lubrano—author of Don’t Talk About Politics: How to Change 21st Century Minds—about what it takes to think, connect, and persuade in a time of rapid technological and cultural disruption. Drawing from her background in philosophy, psychology, and political theory, Sarah explores how emotions shape our cognition, why curiosity is a democratic virtue, and how design and technology can either open or close off possibilities for shared understanding. Together, they examine how modern systems—from social media to AI agents—can reduce nuance, flatten emotional range, and reward performance over reflection. This conversation invites us to think more deeply about how we encounter difference—and what it takes to stay open when the world feels overwhelming.

    Key Takeaways
    - Changing minds isn’t about winning arguments. It starts with curiosity, emotional intelligence, and building the cognitive space for reflection.
    - Democracy requires mental infrastructure. That means not just freedom of speech, but the psychological and social capacity to listen, consider, and evolve.
    - AI and social platforms risk “flattening” cognition. Speed and frictionless interaction can reduce the emotional and epistemic range of public discourse.
    - Design can support or inhibit dignity. How we architect systems of learning, debate, or health shapes what kinds of people and conversations they enable.
    - We don’t need agreement to coexist. But we do need structures that protect space for difference—both in ideas and identities.

    Topics Covered / Timestamped Sections
    02:10 – Sarah’s intellectual path: from Oxford and Harvard to emotional epistemology and political learning
    04:24 – Why she wrote Don’t Talk About Politics and what “changing minds” really involves
    13:30 – How certain academic and tech cultures mistake argument for insight, and why more discussion doesn’t necessarily lead to understanding or change
    17:50 – The tension between emotional speed and civic depth: what technology amplifies, and what it erodes
    24:06 – Designing for reflection: what it takes to build platforms that support empathy, not outrage
    39:11 – Bringing emotional education into institutions, policymaking, and design
    46:01 – Reflections on where we go from here: cultivating the emotional capacity democracy requires

    Guest Bio and Links
    Dr. Sarah Stein Lubrano – Researcher, educator, and author focused on the psychology of political learning and epistemic humility. She holds a doctorate from Oxford and is the author of Don’t Talk About Politics: How to Change 21st Century Minds.
    Sarah’s Website
    Don’t Talk About Politics – Book Link

    Resources Mentioned
    - The School of Life – Where Sarah developed emotional learning content
    - Trauma-informed pedagogy – Educational design that recognizes emotional safety and regulation
    - Patient experience research – How listening and context shape clinical outcomes
    - AI as cognitive scaffolding – The potential and risks of AI agents in deliberative thinking

    Further Reading / Related Episodes
    - Episode 6: "Emotional Intelligence in the Age of AI: A Conversation with Marisa Zalabak"
    - Episode 7: "Who Watches the Watchers? Privacy Law, AI, and Power with William McGeveran"

    Call to Action
    How do we create room for real thought—and for each other—in an age of constant noise? Dr. Sarah Stein Lubrano offers a thoughtful and hopeful path forward, grounded in emotion, curiosity, and civic design.
    🎧 Listen now: Episode Link

    Credits
    Host: Mary Camacho
    Guest: Dr. Sarah Stein Lubrano
    Produced by Terms of Service Podcast
    Sound Design: Arthur Vincent and Sonor Lab
    Co-Producers: Nicole Klau Ibarra & Mary Camacho

    59 min
  4. You Don’t Own It If You Can’t Fix It: The Fight for the Right to Repair

    09/19/2025

    You Don’t Own It If You Can’t Fix It: The Fight for the Right to Repair

    Episode Summary
    In this episode of Terms of Service, host Mary Camacho speaks with Gay Gordon-Byrne, Executive Director of the Digital Right to Repair Coalition, about how manufacturers are rewriting the rules of ownership in the digital age. Drawing on decades of experience in enterprise computing and leasing, Gay shares how restrictive repair policies—hidden behind software locks, proprietary tools, and legal fine print—are quietly eroding our rights as consumers. From absurd real-world examples to legislative progress across the U.S., this conversation reveals what’s at stake when we lose the ability to fix the things we own—and how the Right to Repair movement is pushing back.

    Key Takeaways
    - Repair is a right, not a loophole. Companies have used copyright law, contracts, and DRM to block basic repairs—redefining ownership in the process.
    - You don’t void your warranty by repairing your own device. Under U.S. law, that protection has been in place since the 1970s.
    - Tractors, phones, and dishwashers now run on software. That means repair is increasingly a legal and digital issue, not just a mechanical one.
    - Fixing things is a cultural practice. It's being squeezed out by design, but it offers economic, environmental, and emotional benefits.
    - State-level legislation is gaining traction. While federal regulators stall, local organizing and public pressure are driving change.

    Topics Covered / Timestamped Sections
    04:30 – Understanding Right to Repair: from leasing and enterprise sales to grassroots repair advocacy
    08:10 – The slow erosion of repair rights through software and service bundling
    10:50 – What “Right to Repair” actually means—and what it doesn’t
    12:52 – The shift in consumer expectations
    13:50 – The economics of repairability
    15:40 – Legal implications of ownership
    19:00 – Tractors, cars, and consumer electronics: software as the new lock
    24:00 – The global perspective on repair culture
    27:26 – The Magnuson-Moss Warranty Act and the myth of “voided” warranties
    28:00 – Legislative changes and consumer power
    31:25 – Antitrust and tying agreements: the legal dimension of forced service
    35:05 – The role of consumers in advocacy: France’s repairability index and global momentum for consumer rights
    45:38 – Stories from the field: absurd repair scenarios and growing public awareness

    Guest Bio and Links
    Gay Gordon-Byrne – Executive Director of the Digital Right to Repair Coalition (Repair.org). With decades of experience in the computer leasing industry, Gay has spent the past decade fighting to restore ownership and repair rights for consumers and independent businesses across the U.S.
    Repair.org
    Gay Gordon-Byrne on LinkedIn

    Resources Mentioned
    - Magnuson-Moss Warranty Act (FTC.gov) – Protecting U.S. consumers from deceptive warranty practices
    - France’s Repairability Index – Labeling systems that inform buyers on repair potential
    - iFixit – Repair guides, community support, and advocacy

    Further Reading / Related Episodes
    - Episode 3: “Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business”

    Call to Action
    What if you couldn’t fix your own tools, car, or phone—even when it’s a simple repair? Listen to Gay Gordon-Byrne explain why the right to repair is about more than gadgets—it’s about autonomy, sustainability, and democratic accountability.
    🎧 Listen now: Episode Link

    Credits
    Host: Mary Camacho
    Guest: Gay Gordon-Byrne
    Produced by Terms of Service Podcast
    Sound Design: Arthur Vincent and Sonor Lab
    Co-Producers: Nicole Klau Ibarra & Mary Camacho

    55 min
  5. Designing Privacy You Can Feel: Smooth, Supportive, Empowering

    08/06/2025

    Designing Privacy You Can Feel: Smooth, Supportive, Empowering

    Episode Summary
    In this episode of Terms of Service, host Mary Camacho speaks with Molly Willson and Eriol Fox from Superbloom, a nonprofit design and technology studio working at the intersection of open-source software, privacy, and human rights. Together, they unpack the Privacy Experience Heuristics—a framework designed to help teams build more intuitive, trust-centered experiences around privacy. They explore why legal compliance isn’t enough, how tools like password managers and secure messaging apps can feel intimidating or unsafe, and why it’s crucial to center marginalized users in privacy and security design. From “personas non grata” to designing for digital dignity, this conversation explores how we can bridge the gap between secure systems and the real people who need them most.

    Key Takeaways
    - Privacy isn’t just technical—it’s emotional and relational. Smooth, supportive, and empowering experiences help users trust and engage with privacy-respecting tools.
    - The Privacy Experience Heuristics were created to guide open-source and nonprofit teams in building better UX for privacy without requiring specialized expertise.
    - Designers have a critical role in shaping security culture and making privacy feel accessible, not punitive.
    - Marginalized communities often bear the brunt of poor defaults and unsafe assumptions. Designing with their safety in mind improves tools for everyone.
    - Security isn’t one-size-fits-all. Empowerment means giving users choices without overwhelming them with complexity.

    Topics Covered / Timestamped Sections
    03:30 – How Molly and Eriol came to focus on privacy-centered design
    08:56 – Why compliance frameworks (like GDPR) don’t ensure a good user experience
    10:22 – Introducing the Privacy Experience Heuristics: smooth, supportive, empowering
    16:08 – The difference between supportive and empowering
    21:00 – Human-centered design doesn't start and end with the users
    23:27 – Designing for safety: why privacy must serve people on the margins
    27:38 – Should people have to worry about privacy?
    31:30 – Personas non grata: preparing for misuse and unexpected users
    36:21 – Real-world examples where privacy or security is being built into design
    43:38 – Why you can't split the world into “people who need privacy” and “people who don’t”
    44:30 – WhatsApp, Signal, and the difference between them
    56:00 – Hope for the future: reframing privacy as a shared cultural value

    Guest Bio and Links
    Molly Willson – Molly has been at Superbloom since 2018, where she leads design and research projects around a variety of open-source and public interest technology. She has worked with teams on projects around privacy, security, transparency, open data, and internet governance, and has also conducted research projects together with funders and communities working in these areas. She also leads Superbloom's coaching program, pairing experts with teams for high-impact design, community, and fundraising mentoring. Her background is in both design and education, making her particularly passionate about making design useful to everyone looking to create rights-friendly alternatives to big tech platforms. Before she joined Superbloom, she taught design at the Stanford d.school and the Hasso-Plattner-Institut at the University of Potsdam. She is originally from the US but has lived in Berlin, Germany since 2015, where she lives with her husband and her two daughters.
    Eriol Fox – Eriol has worked as a designer for 15+ years, first in for-profits and then in NGOs and open-source software organisations, on complex problems like sustainable food systems, peace-building, and crisis response technology. Eriol now works at Superbloom on design, research, open-source, and technology projects. They are also part of the core teams at Open Source Design (http://opensourcedesign.net/), the Human Rights Centred Design working group (https://hrcd.pubpub.org/), and the Sustain UX & Design working group (https://sustainoss.org/working-groups/design-and-ux/), and they help host a podcast about open source and design (https://sosdesign.sustainoss.org/). Eriol is a non-binary, queer person who uses they/them pronouns.
    Superbloom Website
    Privacy Experience Heuristics
    https://github.com/sprblm/The-Design-We-Open

    Resources Mentioned
    - Signal – Encrypted messaging with strong privacy defaults
    - Tor Browser – Privacy-first web browsing
    - GDPR – European data protection law, often insufficiently implemented in UX
    - Firefox, Proton, KeePassXC – Examples discussed throughout
    - Personas non grata

    Further Reading / Related Episodes
    - Episode 2: "Beyond Honeypots: Privacy, Security, and the Future of Distributed Webs"
    - Episode 8: "The Great Disruption: Building Human-Centered Digital Futures"

    Call to Action
    How does privacy feel when you use your favorite app? Is it smooth? Supportive? Empowering? Molly and Eriol challenge us to design not just for policy, but for people. Listen to this episode and explore how design can help us reclaim digital agency.
    🎧 Listen now: Episode Link

    Credits
    Host: Mary Camacho
    Guests: Molly Willson & Eriol Fox
    Produced by Terms of Service Podcast
    Sound Design: Arthur Vincent and Sonor Lab
    Co-Producers: Nicole Klau Ibarra & Mary Camacho

    1h 6m
  6. Mission, Complexity, and Crisis: Leading in a Rapidly Changing World

    07/02/2025

    Mission, Complexity, and Crisis: Leading in a Rapidly Changing World

    Episode Summary
    In this episode of Terms of Service, host Mary Camacho speaks with Dr. David Bray, a seasoned leader who has served in senior roles across the U.S. government, tech, and civil society. From bioterrorism response at the CDC to digital transformation efforts in national intelligence, Bray brings a unique perspective on leadership in complexity. They explore how institutions can adapt in times of disruption, why trust is a critical infrastructure, and how positive change agents can build bridges across sectors—even in polarized environments. With a deep systems lens, Bray challenges us to align technological innovation with human values and long-term mission.

    Key Takeaways
    - Mission-driven leadership matters most in times of complexity and crisis. Leaders must be able to hold contradictions, listen deeply, and navigate uncertainty with clarity of purpose.
    - Trust is infrastructure. Societal systems—especially in democracies—depend on mutual trust, and technology can either degrade or strengthen that foundation.
    - The U.S. is structured for stalemate, not for rapid transformation. But transformation is still possible—especially in crises—through coalitions and adaptive strategies.
    - Cross-sector collaboration is essential. Government, civil society, and private enterprise must learn to speak a shared language of values and resilience.
    - We must redesign metrics for success. Quarterly profits aren’t the only or best measure; we need frameworks that value long-term human and ecological well-being.

    Topics Covered / Timestamped Sections
    02:52 – The neutrality of technology and its implications
    06:30 – Agency in the age of AI and information
    09:32 – Policy evolution in the face of rapid technological change
    13:43 – Building trust across divided sectors
    14:25 – When institutions break down: adaptive leadership and finding windows of possibility
    18:10 – Personal journeys and motivations in leadership
    22:48 – Advice for leaders amidst polarization
    23:20 – Navigating polarized environments with shared values and pluralist frames
    28:10 – Decision-making frameworks for leaders
    30:15 – Fostering healthy tension in leadership
    32:09 – Empowering others and agency in leadership
    39:18 – The ethics of power and responsibility

    Guest Bio and Links
    Dr. David Bray is a strategist and transformation leader working at the intersection of technology, policy, and complex change. Currently Distinguished Chair of the Accelerator at the Stimson Center and Principal at LeadDoAdapt Ventures, he’s led efforts ranging from bioterrorism preparedness to countering disinformation for U.S. Special Operations. A former FCC CIO and Executive Director for bipartisan national commissions, David has advised 12 startups, worked globally on the future of tech and data, and earned honors including the National Intelligence Exceptional Achievement Medal and CIO 100 Awards. He’s also served as Executive-in-Residence at Harvard and was named one of Business Insider’s “24 Americans Changing the World.”
    David Bray on LinkedIn
    CXO TALK
    Stimson Center – Bray’s Profile

    Resources Mentioned
    - People-Centered Internet Coalition – Dr. Bray served as Executive Director for this initiative co-founded by Vint Cerf. It promotes digital infrastructure that empowers people.
    - Edelman Trust Barometer – He references the 2025 edition, particularly noting statistics on global grievance and willingness to justify violence.
    - Rousseau’s Theory of Pluralities – Bray refers to Rousseau’s idea that democracies require civic responsibility from at least 20% of people to function well—a power-law principle still relevant today.

    Further Reading / Related Episodes
    - Episode 11: "Who Watches the Watchers? Privacy Law, AI, and Power"
    - Episode 8: "The Great Disruption: Building Human-Centered Digital Futures"
    - Episode 5: "Regenerating Social Fabric & Innovating Governance"
    - Episode 4: "Dynamics of Digital Spaces: Rethinking Democracy Online"

    Call to Action
    How do we lead with courage and clarity when everything is changing? This conversation with Dr. David Bray offers a roadmap for leadership in uncertain times—grounded in systems thinking, public service, and a deep respect for human agency.
    🎧 Listen now: Episode Link

    Credits
    Host: Mary Camacho
    Guest: Dr. David Bray
    Produced by Terms of Service Podcast
    Sound Design: Arthur Vincent and SonorLab
    Co-Producers: Nicole Klau Ibarra & Mary Camacho

    46 min
  7. Who Watches the Watchers? Privacy Law, AI, and Power

    06/03/2025

    Who Watches the Watchers? Privacy Law, AI, and Power

    Episode SummaryIn this episode of Terms of Service, Mary Camacho sits down with William McGeveran—Dean of the University of Minnesota Law School and author of a leading privacy law casebook—to explore the evolving landscape of data protection, surveillance, and individual rights. With deep insights into both U.S. and European frameworks, McGeveran breaks down where current laws fall short, why consent alone doesn’t protect privacy, and how legal systems can (and should) evolve to meet the challenges posed by AI, big tech, and systemic data collection. Key TakeawaysMost of the world—including the EU—follows a “data protection” model that assumes personal data must be protected on behalf of individuals. This gives people broad rights to know, limit, and contest how their data is collected and used. In contrast, the U.S. lacks a unified data protection framework. Instead, companies are largely free to collect and use personal data unless a specific law prohibits it—prioritizing institutional autonomy over individual rights.Consent is an inadequate foundation for privacy protection. Relying on individuals to understand and agree to complex data practices shifts responsibility away from those in power and undermines meaningful control.Legal design matters. Structural choices—like creating intentional silos for data—can strengthen protections rather than limit innovation.Data breaches are no longer unusual—they’re inevitable. But legal standards still play a critical role in enforcing accountability and incentivizing better security practices.Younger generations see privacy not as a personal failure but as a systemic issue. 
And they're looking for collective, enforceable solutions—not just more terms of service.Topics Covered / Timestamped Sections01:39 – From Capitol Hill to privacy casebooks: McGeveran’s path into data law.02:48 – The wild west of the early internet and Lessig’s “Code”.04:32 – Silos in surveillance and the importance of intentional data separation.08:00 – Privacy law vs. data protection law: U.S. and EU’s contrasting assumptions.11:04 – Why California's privacy laws are stronger—but still fundamentally U.S. in approach.14:11 – Why it’s not “all over”: What legal protections still matter.17:33 – Aggregation harms and why individuals can’t foresee long-term data consequences.24:03 – How digital-native students view privacy today—and what gives them hope.27:00 – Why privacy policies can’t be read, and how AI can help interpret them.35:30 – GDPR’s global ripple effects and Max Schrems' legal victories.40:00 – Casebooks, case studies, and how law students are shaping future data policy.41:45 – Data breaches, legal gaps, and the human side of cybersecurity.50:35 – AI is both revolutionary and familiar—and requires caution, not panic.Guest Bio and LinksWilliam McGeveran – William McGeveran was named the twelfth dean of the University of Minnesota Law School in 2024. He originally joined the faculty of Minnesota Law in 2006 and previously served as the interim dean and the associate dean for academic affairs. Dean McGeveran’s research focuses on information law, with particular focus on data privacy and trademark law. His scholarship in trademark law considers the balance between prevention of harmful consumer confusion and protection of valuable speech including parody, commentary, and comparative advertising. McGeveran is also the sole author of a casebook, Privacy and Data Protection Law, used by instructors at dozens of U.S. law schools. 
Dean McGeveran has been a resident fellow at the University of Minnesota Institute for Advanced Study, a visiting professor at University College Dublin School of Law, and an instructor in the Notre Dame Law School London Programme. He frequently speaks to the media, submits amicus briefs, works with policymakers, and teaches continuing legal education courses in his specialty areas. Dean McGeveran earned a J.D., magna cum laude, from New York University and a B.A., magna cum laude, in political science from Carleton College. While an undergraduate he spent one year as a nonmatriculated visiting student at Worcester College, Oxford. Prior to joining Minnesota Law, he was a resident fellow at the Berkman Center for Internet and Society at Harvard Law School. He previously clerked for Judge Sandra Lynch on the United States Court of Appeals for the First Circuit and practiced as an intellectual property litigator at Foley Hoag LLP in Boston. Before law school, Dean McGeveran worked in national politics for seven years, primarily as a senior legislative aide to then-Rep. Charles Schumer.

Follow William McGeveran on LinkedIn
Faculty Profile – University of Minnesota Law

Resources Mentioned
GDPR (General Data Protection Regulation) – Europe’s landmark data privacy law.
California Consumer Privacy Act (CCPA) – A leading example of enhanced U.S. state-level regulation.
Max Schrems and NOYB – Strategic litigation challenging EU–U.S. data-sharing agreements.
Carnegie Mellon, “The Cost of Reading Privacy Policies” study – Analysis of the time required to read all privacy policies.
Privacy and Data Protection Law (University Casebook Series)

Further Reading / Related Episodes
Episode 1: “From AI Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World”
Episode 3: “Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business”

Call to Action
Privacy isn’t dead—but it is under pressure.
If you’re tired of shrugging at every “accept cookies” pop-up, this episode will help you rethink what’s possible through law, accountability, and systemic reform. Listen to Dean William McGeveran on how to reclaim digital dignity. 🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: William McGeveran
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent at Sonorlab
Co-Producers: Nicole Klau Ibarra & Mary Camacho

    54 min
  8. When Alexa Says Sorry: What We Risk When AI Sounds Human

    05/13/2025

    Episode Summary
In this episode of Terms of Service, host Mary Camacho speaks with Marisa Zalabak, an AI ethicist and psychologist who explores how our relationships with artificial intelligence affect emotional intelligence, learning, communication, and mental health. With a rich background in education, social justice, psychology, and theater arts, Marisa offers deep insights into the emotional and ethical implications of anthropomorphizing AI, the risks of synthetic empathy, and the importance of slowing down to ask better questions. Together, they unpack how emotional and cognitive habits are being shaped by our daily interactions with machines—and what it means for our shared future.

Key Takeaways
- Anthropomorphizing AI—treating machines as if they are human—is natural but dangerous, especially when synthetic empathy (like chatbots saying “I’m sorry”) reinforces emotional trust in non-human systems.
- Marisa emphasizes the importance of asking better questions about the tools we use, why we use them, and what long-term effects they may have.
- Research shows people increasingly treat AI systems as coworkers or even confidants, which can affect trust, mental health, and social connection.
- Systems like Alexa and humanoid AIs often reinforce gender bias, particularly when they default to women’s voices.
- Encouraging digital literacy, slow learning, and psychological grounding helps individuals—and especially children—build healthy habits with technology.

Topics Covered / Timestamped Sections
01:55 – Marisa’s unconventional journey from performing arts to educational psychology to AI ethics.
05:48 – Discovering AI and contributing to one of the first IEEE standards on human well-being in AI design.
08:27 – First deep AI encounter: conversing with the humanoid BINA48 and the psychology of human-machine interaction.
13:22 – Synthetic empathy and the blurry boundaries of trust in conversational AI.
18:10 – How politeness and pronouns affect human habits and communication patterns.
21:45 – Designing meaningful research on the emotional and psychological effects of AI.
23:14 – Children and AI: the real impacts of early and normalized interaction with synthetic personalities.
33:31 – Gendered AI voice assistants and their unintended social consequences.
37:40 – Why education should be an invitation to inquiry, not a race toward certainty.
42:05 – Breaking down complexity through “Aunt Dorothy” explanations and slow, focused inquiry.

Guest Bio and Links
Marisa Zalabak is an AI ethicist, psychologist, and thought leader specializing in responsible AI, education, sustainability, and human well-being. Her talks emphasize adaptive leadership, ethical innovation, and climate action through sustainable practices. A two-time TEDx and international keynote speaker, Marisa has contributed to global forums such as Stratcom, the UN Summit of the Future, and AI House in Davos during the World Economic Forum. As Co-Founder of GADES (Global Alliance for Digital Education and Sustainability), Resident Fellow with The Digital Economist Center of Excellence, and faculty member at the Trocadéro Forum Institute, Marisa champions education aligning responsible technology with regenerative design for human and planetary flourishing. Chairing IEEE’s AI Ethics Education and Planet Positive 2030 initiatives, Marisa has co-authored ethical AI standards for human well-being with AI technologies. Collaborating across sectors with organizations like Microsoft, SAP, and Stanford University, Marisa addresses emerging issues in AI for a sustainable future.
Marisa’s Website
Marisa’s LinkedIn
Marisa’s Instagram
Marisa’s Facebook
Marisa’s TEDx Talk
IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems

Resources Mentioned
BINA48 – One of the first advanced humanoids trained for human interaction and space exploration.
Synthetic Emotion in AI – IEEE working group focused on standards for AI that emulates human emotions.
Digital Assistants & Bias – Ongoing research into how voice assistants perpetuate societal norms and stereotypes.

Further Reading / Related Episodes
Episode 5: “Regenerating Social Fabric & Innovating Governance”

Call to Action
How are your emotional habits being shaped by the tools you use every day? Marisa Zalabak invites us to slow down, ask better questions, and reimagine AI as a tool for well-being—not just productivity. Listen now and rethink the terms of service we accept in our digital lives. 🎧 Listen now: Episode Link

Credits
Host: Mary Camacho
Guest: Marisa Zalabak
Produced by Terms of Service Podcast
Sound Design: Arthur Vincent at Sonorlab
Co-Producers: Nicole Klau Ibarra & Mary Camacho

    47 min

Ratings & Reviews
5 out of 5 (2 ratings)

About

On “Terms of Service,” host Mary Camacho invites listeners to rethink these everyday interactions and the broader implications of AI, distributed tech, and legal frameworks on our digital lives, advocating for a future where individuals have greater control over their data and decisions. https://www.linkedin.com/in/maryfcamacho/

Nicole is a visionary entrepreneur with a diverse background. Passionate about social system design, she has helped build multiple ventures, including the IKIGAI Project, her non-profit helping people develop essential 21st-century skills, many of which intersect with the topics discussed in Terms of Service. https://www.linkedin.com/in/nicole-klau-ibarra-b26818137/

Arthur Vincent is a seasoned music and audio producer with a passion for pushing the boundaries of music technology. As a music producer and sound designer, he has crafted innovative audio experiences for global brands like Heineken, Philips, and Cupra. Alongside his creative work, Arthur is an expert in audio technology, mastering both hardware and software tools to deliver high-quality, immersive sound. https://www.linkedin.com/in/arthur-vincent/