“In Reality” debunks fake news and elevates the innovative researchers, entrepreneurs, journalists and policymakers who are fighting back against toxic misinformation. Co-hosts Joan Donovan, research director of the Harvard Kennedy School’s Shorenstein Center on Media and Public Policy, and Eric Schurenberg, an award-winning journalist and former CEO of Fast Company, engage guests in enlightening conversations about solutions to this scourge and the path back to a shared reality.
What if Polarization is Actually an Illusion?
If you are a Democrat, have you ever espoused the slogan “Defund the Police”? If you’re a Republican, do you agree with politicians who claim that the 2020 presidential election was stolen? If you said yes, you may well be operating under a “collective illusion,” a widespread mental phenomenon in which people publicly take positions they don’t privately believe, because they think that everyone else in their group does believe them. The implications for the spread of disinformation these days are obvious. In this episode of In Reality, host Eric Schurenberg talks with Todd Rose, co-founder of the think tank Populace and author of a fascinating book called “Collective Illusions.” The conversation covers a mind-boggling range of common public beliefs that almost no one privately believes (who knew?). Todd also explains why it’s so important for your own mental health and the health of democracy to speak your own authentic truth – and how to do that without getting yourself shunned by your in-group.
Dismantling the Disinformation Economy
Check My Ads Institute takes aim at purveyors of conspiracy theories, hate speech and disinformation. The Institute describes itself as “an independent watchdog” whose goal is to keep digital advertisers from inadvertently funding the spread of falsehoods.
In this episode of In Reality, host Eric Schurenberg sits down with the co-founder of Check My Ads Institute, Claire Atkin, to unpack how the digital advertising industry works to support disinformation and perpetuate ad fraud despite its claims to do the opposite. Claire delves into programmatic advertising and explains how third-party ad-serving companies keep brands unaware of where their digital ads are being placed, allowing propagandists to earn revenue from advertisers who would never intentionally support them. Finally, she specifies the steps that Check My Ads Institute is taking to hold the digital ad industry to account, as well as who the company is targeting next.
How Cambridge Analytica Opened the Pandora’s Box of Disinformation
In this episode of In Reality, Eric Schurenberg hosts Brittany Kaiser, best known as one of the whistleblowers at Cambridge Analytica, the British political consulting firm that worked on the disinformation-laden 2016 campaigns behind Brexit and the election of Donald Trump. Having disavowed her former employer, she is now a much sought-after expert on data privacy, blockchain technology, and legislative reform meant to counter disinformation campaigns.
Much of the conversation focuses on Brittany’s tenure as director of business development at Cambridge Analytica. Brittany describes the firm’s techniques of creating psychological profiles of voters and then micro-targeting false or misleading messages to them. She explains how her former employer’s voter suppression strategies were categorically different–morally, legally and tactically–from commercial targeted advertising campaigns.
Finally, they delve into Brittany’s Own Your Data Foundation, a not-for-profit dedicated to raising the DQ (Digital Intelligence) of lawmakers, students, parents and voters and minimizing the existential risks of fake news, cyber attacks, disinformation and polarization–the demons that Cambridge Analytica helped unleash to the detriment of democracy in 2016.
Combatting Domestic Terrorism with Melanie Smith
In this episode of In Reality, recorded at the Collision conference in Toronto, host Eric Schurenberg joins Melanie Smith, Head of the Digital Analysis Unit at the London-based Institute for Strategic Dialogue–an independent non-profit dedicated to reversing the tide of polarization, extremism, and disinformation worldwide.
The topics in this episode: how the threat of radicalized violence has shifted from foreign actors to domestic ones; why (at least before January 6th) it was so difficult to convince policymakers that domestic extremism was the more serious threat; how domestic extremists prey on the same human insecurities as Islamic extremists to radicalize their targets; why Instagram is a favorite tool of disinformation promoters and Pinterest isn’t; and which demographic groups are most likely to spread harmful false information unwittingly.
From Smith: “I am optimistic that we can contain disinformation over a 10-year time frame, but I am concerned that things will get worse in the next five years. Elections tend to inflame disinformation, and that, in some places, can easily lead to violence. You have to realize that there are interests that want to seize the opportunity to deepen the divisions in our society.”
The Elite’s Blind Spots and the Illusion of Truth with Gillian Tett
In this episode of In Reality, host Eric Schurenberg sits down with Gillian Tett, Chair of the Editorial Board and Editor-at-Large for the Financial Times, US. Gillian is also trained as an anthropologist, which gives her a unique perspective on the tribal divides within American society. If you believe that your grasp of reality is the only legitimate one, prepare to be challenged.
Anthropologists, Gillian explains, view sub-cultures as self-contained systems of meaning. The belief in conspiracies may seem incomprehensible to most In Reality listeners, but it makes sense to groups who feel abandoned and belittled by elites. All of us have trouble seeing our biases as anything other than ground truths. For example, elites in media, government, entertainment, academe, and so on, regard command of language as an indisputable sign of seriousness and status. For other tribes in America, articulateness is irrelevant. What matters instead is loyal adherence to the tribe’s fears and grievances. For members of those groups, the facts presented by institutions like the media and legal system are suspect on their face. The only information that is really trustworthy is what’s conveyed by other members of the tribe.
Gillian and Eric take the anthropologist’s view of a wide range of contemporary news events: Why the best way to understand Trump supporters is to attend professional wrestling; what Trump’s use of the neologism “bigly” reveals about professional media’s blind spots; and why whistleblowers are disproportionately women. Listen, and prepare to confront your own blind spots.
On the Front Lines of the Disinformation Fight with Áine Kerr
In the fight against disinformation, the last line of defense between audiences and malicious falsehoods consists of “trust and safety” teams, also known as content moderators. Some of them are employed by social media platforms like Facebook and Spotify, but increasingly the platforms outsource the work of identifying and countering dangerous lies to fact-checking organizations like the fast-growing Irish company Kinzen.
In this episode of In Reality, host Eric Schurenberg sits down with Áine Kerr, co-founder and COO of Kinzen. Áine is a serial risk-taker with extensive experience at the intersection of journalism and technology, most recently as the global head of journalism partnerships at Facebook.
Kinzen helps platforms, policymakers, and other defenders “get ahead and stay ahead” of false and hateful content on video, podcast, and text platforms. The company uses artificial intelligence to sniff out objectionable content and then, when needed, invites human readers to judge for context and nuance. What Kinzen calls “human-in-the-loop technology” minimizes errors while still allowing for fact-checking at social media scale.
Áine explains that, in the recent Brazilian elections, disinformation actors came to realize that phrases like “election fraud” and “rigged election” were alerting content moderators, who could take down their false claims. So, the actors began substituting seemingly innocuous phrases like “we are campaigning for clean elections.” Kinzen’s human moderators spotted the changes and helped authorities intercept the false messages.
Áine and Eric also dive into the many reasons someone may share harmful content online, ranging from sheer amoral greed to ideological commitment. She ends with a warning that the spreaders of disinformation currently have the upper hand: it is always easier to spread lies than to counteract them. The allies of truth–researchers, social media platforms, entrepreneurs, and fact-checking organizations like hers–need to get better at coordinating their efforts to fight back, or democracies around the world will remain at existential risk.