In this podcast from the Center for Humane Technology, co-hosts Tristan Harris and Aza Raskin expose how social media’s race for attention manipulates our choices, breaks down truth, and destabilizes our real-world communities.
Tristan and Aza also explore solutions, examining what it means to become sophisticated about human nature through interviews with anthropologists, researchers, cultural and faith-based leaders, activists, and experts on everything from conspiracy theories to existential global threats.
Mr. Harris Zooms to Washington
Back in January 2020, Tristan Harris went to Washington, D.C. to testify before the U.S. Congress on the harms of social media. A few weeks ago, he returned — virtually — for another hearing, Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds. He testified alongside Dr. Joan Donovan, Research Director at the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, and the heads of policy from Facebook, YouTube, and Twitter. The senators’ animated questioning demonstrated a deeper understanding of how these companies’ fundamental business models and design properties fuel hate and misinformation, and many of the lawmakers expressed a willingness to take regulatory action. But there’s still room for a more focused conversation. “It’s not about whether they filter out bad content,” says Tristan, “but really whether the entire business model of capturing human attention is a good way to organize society.” In this episode, a follow-up to last year’s “Mr. Harris Goes to Washington,” Tristan and Aza Raskin debrief about what was different this time, and what work lies ahead to pave the way for effective policy.
Can Your Reality Turn on a Word?
Can hypnosis be a tool to help us see how our minds are being shaped and manipulated more than we realize? Guest Anthony Jacquin is a hypnotist and hypnotherapist with over 20 years of experience, author of "Reality is Plastic," and co-runner of the Jacquin Hypnosis Academy. He uses his practice to help his clients change their behavior and improve their lives. In this episode, he breaks down common misconceptions about hypnosis and reveals that despite the influence of hypnotizing forces like social media, we all still have the ability to get in touch with our subconscious selves. “What can I say with certainty is true about me — what is good, true and real about me?” Anthony asks. “Much of what we’ve invested in is actually transient. It will change. What is unchanging?” Anthony draws connections between hypnosis and technology and the impacts of both on our subconscious minds, but identifies a key difference: technology is exploiting us. Yet a little more insight into this dimension of how our minds work under the hood may help us build better, more humane, and more conscious technology.
The Stubborn Optimist's Guide Revisited
[This episode originally aired May 21, 2020] Internationally recognized climate leader Christiana Figueres argues that the battle against global threats like climate change begins in our own heads. She became the United Nations’ top climate official after watching the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, Christiana began performing an act of emotional Aikido on herself, her team, and eventually delegates from 196 nations. She called it “stubborn optimism.” It requires a clear and alluring vision of the future that can supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first global climate framework, the Paris Agreement. In this episode, we explore how a similar shift in Silicon Valley’s vision could lead 3 billion people to take action for the planet.
Mind the (Perception) Gap
What do you think the other side thinks? Guest Dan Vallone is the Director of More in Common U.S.A., an organization that’s been asking Democrats and Republicans that critical question. Their work has uncovered countless “perception gaps” in our understanding of each other. For example, Democrats think that about 30 percent of Republicans support "reasonable gun control," but in reality, it’s about 70 percent. Both Republicans and Democrats think that about 50 percent of the other side would feel that physical violence is justified in some situations, but the actual number for each is only about five percent. “Both sides are convinced that the majority of their political opponents are extremists,” says Dan. “And yet, that's just not true.” Social media encourages the most extreme views to speak the loudest and rise to the top—and it’s hard to start a conversation and work together when we’re all arguing with mirages. But Dan’s insights and the work of More in Common provide a hopeful guide to unraveling the distortions we’ve come to accept and correcting our foggy vision.
The film Coded Bias follows MIT Media Lab researcher Joy Buolamwini through her investigation of algorithmic discrimination, after she accidentally discovers that facial recognition technologies fail to detect darker-skinned faces. Joy is joined on screen by experts in the field, researchers, activists, and involuntary victims of algorithmic injustice. Coded Bias premiered at the Sundance Film Festival last year, was released on Netflix April 5, 2021, and has been called “‘An Inconvenient Truth’ for Big Tech algorithms” by Fast Company magazine. We talk to director Shalini Kantayya about the impetus for the film and how to tackle the threats algorithmic bias poses to civil rights while working toward more humane technology for all.
Come Together Right Now
How many technologists have traveled to Niger, or the Balkans, or Rwanda, to learn the lessons of peacebuilding? Technology and social media are creating patterns and pathways of conflict that few people anticipated or even imagined just a decade ago. And we need to act quickly to contain the effects, but we don't have to reinvent the wheel. There are people, such as this episode’s guest, Shamil Idriss, CEO of the organization Search for Common Ground, who have been training for years to understand human beings and learn how to help them connect and begin healing processes. These experts can share their insights and help us figure out how to apply them to our new digital habitats. “Peace moves at the speed of trust, and trust can’t be fast-tracked,” says Shamil. Real change is possible, but as he explains, it takes patience, care, and creativity to get there.