6 episodes

Bad Faith Cycles in Algorithmic Cultivation is an interview-based podcast series that explores Identity, defined as who we are and what we do, and Agency, defined as the sum total range of potential actions or our ability to make a difference, in our contemporary digitally and algorithmically mediated lives.

The goal of the podcast is to provide listeners with an understanding of surveillance in digital space that allows them to recognize issues of “exploitation, commodification, and degradation” as related to commercial data extraction.

Bad Faith Cycles in Algorithmic Cultivation
Calvin H

    • Society & Culture


    Cory Doctorow

    Cory Doctorow is a digital rights activist, a podcaster, and a writer. Cory speaks with conviction against commercial data practices, which he views as opaque and untrustworthy. Cory recently wrote an article on consent theater, [1] a concept that describes the strategies data-based companies use to obfuscate the depth of their surveillance practices and thereby acquire the unwitting consent of their users.

    [1] Cory Doctorow, “Consent Theater,” Medium, 2021, https://onezero.medium.com/consent-theater-a32b98cd8d96.

    • 57 min
    Dr Darin Barney

    Dr Darin Barney is a professor at McGill University in Montreal, Canada. His work examines the future of digital technologies in democratic life, [1] the state of citizenship in a digitally integrated society, [2] and the infrastructure of network societies. [3] Our discussion revolved around concerns of digital governance over social and political life, [4] the algorithmic fragmentation of social reality, [5] and the commercialization of data, which treats users as standing-reserve. [6]

    [1] Darin David Barney, Prometheus Wired: The Hope for Democracy in the Age of Network Technology (Vancouver: UBC Press, 2000).

    [2] Darin David Barney, One Nation under Google: Citizenship in the Technological Republic (Toronto: Hart House Lecture Committee, 2007).

    [3] Darin David Barney, The Network Society, Key Concepts (Cambridge, UK: Polity, 2010).

    [4] Yu-Che Chen, Managing Digital Governance: Issues, Challenges, and Solutions (Boca Raton: Taylor and Francis, 2017), https://public.ebookcentral.proquest.com/choice/publicfullrecord.aspx?p=4921790; Just and Latzer, 245.

    [5] Dean DeChiaro, “Social Media Algorithms Threaten Democracy, Experts Tell Senators,” Roll Call, April 21, 2021, https://www.rollcall.com/2021/04/27/social-media-algorithms-threaten-democracy-experts-tell-senators/; Susan Morgan, “Fake News, Disinformation, Manipulation and Online Tactics to Undermine Democracy,” Journal of Cyber Policy 3, no. 1 (January 2, 2018): 39–43, https://doi.org/10.1080/23738871.2018.1462395; Ünver, 127–46.

    [6] Martin Heidegger and William Lovitt, The Question Concerning Technology and Other Essays (New York: Harper & Row, 1977).

    • 38 min
    Dr John Cheney-Lippold

    Dr John Cheney-Lippold is an assistant professor at the University of Michigan, Ann Arbor, USA. His work uses a variety of philosophical concepts to provide an ontological review of the intersections between commercial and domestic surveillance, identity profiling, cultural participation, and the processes of becoming. [1] Dr Cheney-Lippold’s concept of Algorithmic Identity illustrates how the intensity of identity profiling in commercial surveillance practices curates an identity-based sense of reality for digital technology users. [2] Dr Cheney-Lippold argues that this new mode of media distribution, which uses data to target identity categories, deserves significant ontological consideration.

    [1] John Cheney-Lippold, We Are Data: Algorithms and the Making of Our Digital Selves (New York: New York University Press, 2017).

    [2] John Cheney-Lippold, 5; Natascha Just and Michael Latzer, “Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet,” Media, Culture & Society 39, no. 2 (March 2017): 238–58, https://doi.org/10.1177/0163443716643157; Smith, “On You: Networks, Subjectivity and Algorithmic Identity,” 2018; Cornelius Schubert, “The Social Life of Computer Simulations: On the Social Construction of Algorithms and the Algorithmic Construction of the Social,” in Simulieren und Entscheiden, ed. Nicole J. Saam, Michael Resch, and Andreas Kaminski, Sozialwissenschaftliche Simulationen und die Soziologie der Simulation (Wiesbaden: Springer Fachmedien Wiesbaden, 2019), 145–69, https://doi.org/10.1007/978-3-658-26042-2_6.

    • 24 min
    Hannah Mieczkowski

    Hannah Mieczkowski is a PhD candidate in Psychology at Stanford University, USA. Her research epistemologically challenges knowledge claims made by social media researchers. [1] Hannah suggests that problematic smartphone use may be symptomatic of pre-existing mental health conditions that an individual pacifies through smartphone use.

    [1] Hannah Mieczkowski, Angela Lee, and Jeffrey Hancock, “Priming Effects of Social Media Use Scales on Well-Being Outcomes: The Influence of Intensity and Addiction Scales on Self-Reported Depression,” November 25, 2020, https://doi.org/10.1177/2056305120961784.

    • 17 min
    Dr Alan Sears

    Dr Alan Sears is a Professor of Sociology at X University, Toronto, Canada. He can speak with authority on sociological concepts that aid in illustrating the Cycles in Algorithmic Cultivation concept, such as agency and symbolic interactionism. Both of these concepts are important in aiding considerations about algorithmic cultivation.

    Symbolic interactionism is a theoretical framework that explains how cultures are formed through groups of values and behaviours that constitute a symbolic world. [1] Symbolic interactionism is also concerned with how the range of symbols within a culture demands performative behaviours for cultural inclusion, and how these demands influence the behaviour of individuals. [2] Symbolic interactionism can be a useful tool for highlighting the vital structural characteristics of a culture and the range of behaviours required for inclusion.

    [1] Peter M. Hall, “Symbolic Interaction,” in The Blackwell Encyclopedia of Sociology, ed. George Ritzer (Oxford, UK: John Wiley & Sons, Ltd, 2016), 1–5, https://doi.org/10.1002/9781405165518.wbeoss310.pub2.

    [2] Richard L. West and Lynn H. Turner, Introducing Communication Theory: Analysis and Application, 6th ed. (New York, NY: McGraw-Hill Education, 2018).

    • 35 min
    The Introduction Episode

    Bad Faith Cycles in Algorithmic Cultivation is an interview-based podcast series that explores identity, [1] defined as who we are and what we do, [2] and agency, defined as the sum total range of potential actions or our ability to make a difference, [3] in our contemporary digitally and algorithmically mediated lives.

    The datafication of consumers, and subsequent tailoring of a digital experience, has developed digital echo chambers. Echo chambers refer to situations where an individual is repeatedly exposed to the same perspective of knowledge. [4]

    I apply cultivation theory, a media effects model, to the digital echo chambers prompted by personalized content algorithms, exploring how data- and identity-based content targeting may impact an individual’s agency by pre-determining the digital content that influences their framing of reality and beliefs. This process, Algorithmic Cultivation, raises important considerations about who we are and how we come to be in data-based systems.

    Personalized content algorithms operate in a cyclical manner to provide users with targeted content. They start with identity factors, then draw on previous browsing habits to predict behaviour and make content suggestions. Every interaction, or lack of interaction, with a piece of suggested content provides feedback to the algorithm, which uses this new information to make further suggestions, and the cycle goes on and on. French philosopher Gilles Deleuze’s concept of the Control Society is useful for exploring the contemporary digital networks and data systems that are attributed a form of power over social life. [5] The Control Society tells us that surveillance and data systems self-regulate the power dynamics of social life through the algorithmic profiling of identities, maintaining the societal patterns to which individual identities are subject. [6] Through this lens, personalized content algorithms should be understood as an expression of the commercial interests of datafication; although they are distinct from the algorithms used in civic or political life, such as sentencing algorithms, they also carry important implications for the regulation of social life.
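    The cyclical feedback loop described above can be sketched in a few lines of code. This is a minimal illustration only: the content categories, starting weights, and update rule are invented for the example and do not represent any platform’s actual algorithm.

```python
# Minimal sketch of a personalized-content feedback loop.
# Categories, weights, and the learning rate are illustrative assumptions.

def recommend(weights):
    """Suggest the content category with the highest predicted interest."""
    return max(weights, key=weights.get)

def update(weights, category, engaged, rate=0.2):
    """Feed the user's reaction (or lack of one) back into the profile."""
    weights[category] += rate if engaged else -rate
    return weights

# Seed the profile from identity factors and prior browsing habits.
profile = {"politics": 0.5, "sports": 0.3, "cooking": 0.2}

# Each cycle: suggest content, observe engagement, refine, repeat.
for _ in range(5):
    pick = recommend(profile)
    engaged = pick == "politics"  # this user only engages with one category
    profile = update(profile, pick, engaged)
```

    After only a few cycles, the profile has drifted further toward the single category the user engages with, which is the narrowing dynamic that the echo-chamber and Algorithmic Cultivation arguments describe.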

    [1] Anthony Giddens, Modernity and Self-Identity: Self and Society in the Late Modern Age (Stanford, CA: Stanford University Press, 1997).

    [2] Jonathan M. Cheek, “Identity Orientations and Self-Interpretation,” in Personality Psychology, ed. David M. Buss and Nancy Cantor (New York, NY: Springer US, 1989), 275–85, https://doi.org/10.1007/978-1-4684-0634-4_21.

    [3] Bruno Latour, Reassembling the Social: An Introduction to Actor-Network Theory, 2005.

    [4] Pablo Barberá et al., “Tweeting From Left to Right: Is Online Political Communication More Than an Echo Chamber?,” Psychological Science 26, no. 10 (October 2015): 1531–42, https://doi.org/10.1177/0956797615594620.

    [5] Gilles Deleuze, “Postscript on the Societies of Control,” October 59 (1992): 3–7, http://www.jstor.org/stable/778828.

    [6] Kurt Iveson and Sophia Maalsen, “Social Control in the Networked City: Datafied Dividuals, Disciplined Individuals and Powers of Assembly,” Environment and Planning D: Society and Space 37, no. 2 (April 2019): 331–49, https://doi.org/10.1177/0263775818812084.

    • 19 min
