The Tech Against Terrorism podcast takes a deep dive into terrorist and far-right abuse of the internet. Whether it’s radicalising people through posts on the decentralised web, or uploading manifestos and attack videos to social media, extremist activity online is on the rise. The Tech Against Terrorism team is committed to supporting the tech industry in its fight against terrorist exploitation of the internet, whilst respecting human rights. In this podcast they speak to various experts in the field to find out how these insidious messages are being spread – and what we’re doing to stop them.
Incels, online misogyny and gender-based terrorism
Incels are members of an online subculture who define themselves as unable to find a romantic or sexual partner despite desiring one, and are characterised by their hatred of women. Over the last 10 years, attacks claimed by individuals propagating incel ideology have taken the lives of almost 50 people, an average of 8 per attack. Whilst incel attacks often attract a great deal of attention, incels concentrate the majority of their activity online, where they interact with other misogynistic communities in the so-called online “manosphere”. In this episode, we discuss the roots of the incel movement and the contradictions baked into misogynistic incel theories and manifestos: from how the self-deprecation found on incel forums masks a male supremacist ideology, to how ‘Chads’ (the name given to men perceived as genetically attractive) are both admired and attacked. We also explore how the conceptualisation of incels – whether as hate speech, violent extremism or, in some cases, terrorism – affects tech companies’ online regulation as well as wider counter-terrorism policies.
Anne Craanen and Jacob Berntsson discuss the nuances of misogynist incel ideology. They are joined by two of the foremost voices in this space: Dr. Debbie Ging, an associate professor in the School of Communications at Dublin City University, whose research focuses on digital hate, online anti-feminist men’s rights organisations and the incel phenomenon; and Alex DiBranco, co-founder and executive director of the Institute for Research on Male Supremacism, whose research focuses on the development of right-wing and contemporary misogynist movements. Together, they consider what measures technology companies can take to deal with incel groups – such as partnering with entities that have expertise in countering these forms of extremism. They argue that incels and wider misogyny are a problem both offline and online, and that countering these issues requires collective action from both spheres. They also highlight the importance of education, particularly progressive sex education and lessons in media literacy and digital ethics. Finally, both agree that some forms of incel violence should be seen as gender-based terrorism.
Trend alert: accelerationism
In this episode, we discuss why accelerationism has become a flagship doctrine of far-right violent extremism. To help us understand what accelerationism is and how it manifests in the online sphere, Maygane Janin and Adam Hadley are joined by Professor Matthew Feldman, Director of the Centre for Analysis of the Radical Right (CARR) and an expert on fascist ideology, neo-Nazism and “lone actor” terrorism; and by Ashton Kingdon, a PhD student at the University of Southampton and a fellow at CARR, whose research focuses on how far-right extremists use technology for recruitment and radicalisation. They are also joined by Ben Makuch, a national security reporter with Vice News who investigates far-right violent extremism, particularly neo-Nazism.
Together, they consider how propaganda is repurposed on forums and mainstream platforms to coincide with particular events, distorting narratives and stoking political tension. We also discuss the emergence of accelerationist subcultures, how they are using the pandemic to “initiate the collapse of society”, and the rise in US media attention on accelerationism.
A Gender Approach to Women's Role in the Online Extremist Sphere
Across the ideological spectrum, there are misconceptions and oversimplifications when it comes to discussing the role of women in terrorist organisations: from the perception that women are groomed into joining violent extremist groups and can therefore be presumed innocent, to the notion that a woman’s role in a terrorist organisation is secondary simply because she is less likely to be the one picking up a weapon to carry out an attack. In this episode, we debunk many of these myths and explain why this issue has far more depth to it than the media conveys. We explore the misleading ‘jihadi bride’ stories perpetuated by the media, we examine women’s roles in online propaganda and recruitment, and we discuss the nuances of the “push and pull” factors behind why women join terrorist groups – including far-right groups. Drawing upon all of this, we provide recommendations on how the tech sector should respond to women’s roles in online extremism and terrorism.
Maygane Janin and Anne Craanen discuss the complexities at the intersection of gender and terrorism. They are joined by two of the foremost voices in this space: Dr. Joana Cook, an Assistant Professor of Terrorism and Political Violence at Leiden University and Senior Project Manager and Editor-in-Chief at the International Centre for Counter-Terrorism, who recently published a book on gender and counterterrorism titled “A Woman’s Place: U.S. Counterterrorism Since 9/11”; and Dr. Elisabeth Pearson, a lecturer at the Cyber Threats Research Centre at Swansea University who specialises in gender, extremism, and counter-extremism. Together, they consider the broader socio-cultural context of how gender shapes participation in extremist ideologies – especially how understandings of gender identity, individual experiences, age, and social class also affect the reasons someone might join an extremist group.
Regulating the online sphere
How best to regulate the online sphere will be amongst the most important questions of the coming decade. Until recently, laws largely shielded digital intermediaries from liability for third-party illegal content on their platforms. Since 2016, however, in response to mounting concerns over the criminal misuse of the internet and a surge in noxious content online, the regulatory landscape has begun to change. Governments around the world have started to impose laws and regulatory frameworks that oblige online platforms to expediently and proactively address illegal or harmful content on their sites. Increasingly, however, platforms have also developed their own modes of self-regulation, endeavouring to incorporate new structures of responsibility and accountability into their business models.
Join Flora Deverell and Jacob Berntsson as they discuss the ways in which online regulation is being pursued by companies, governments, and multilateral organisations, such as with the upcoming EU-wide law on the dissemination of terrorist content. They are joined by two of the foremost voices in this space: Evelyn Douek, a lecturer in law and SJD candidate at Harvard Law School, and affiliate at the Berkman Klein Center for Internet & Society, who studies international and transnational regulation of online speech; and Daphne Keller, Director of Platform Regulation at Stanford’s Cyber Policy Center – formerly Assistant General Counsel at Google and Director of Intermediary Liability at Stanford’s Center for Internet and Society – who has worked on groundbreaking intermediary liability litigation and legislation around the world. They also explore the implications of Facebook’s new Oversight Board and what it really means for governance and accountability processes, whether we should use international human rights law as a framework for governing the internet, and why terrorist content is such an important topic in regulatory discourse.
Full list of resources on our website: https://www.techagainstterrorism.fm/regulating-the-online-sphere/
How are terrorists and violent extremists using gamification?
“You can sit at home and play Call of Duty or you can come and respond to the real Call of Duty…the choice is yours.” This was tweeted by a well-known ISIS hacker and propagandist. Gaming culture and popular video games, such as Call of Duty and World of Warcraft, have been exploited by terrorist and violent extremist actors for propaganda and radicalisation purposes.
Join Maygane Janin and Flora Deverell as they discuss how terrorists and violent extremists exploit gaming culture for their own ends. They are joined by Linda Schlegel, a senior editor at The Counterterrorism Group and a regular contributor to the European Eye on Radicalization, where she recently published a number of articles on the exploitation of gaming culture; and Dr. Nick Robinson, an associate professor in politics and international studies at the University of Leeds, who has been researching the links between video games, social media, militarism, and terrorism for over a decade. They address in particular the “gamification of radicalisation” and the exploitation of gaming platforms, as well as why it is now less common than it once was for terrorist organisations to develop their own games in service of their ideologies and purposes.
Full list of resources on our website: https://www.techagainstterrorism.fm/how-are-terrorists-and-violent-extremists-using-gamification/
Linda Schlegel (@LiSchlegel)
Dr. Nick Robinson
Far-right violent extremists and meme culture
During the recent protests against the coronavirus lockdown in the US, a protester was spotted with a flyer referring to “Boogaloo”, a far-right violent extremist slang term calling for a new civil war, which has spawned a meme culture of its own amongst violent extremists. One year earlier, before attacking two mosques and killing 51 people, the Christchurch shooter posted on the message board 8chan, encouraging readers to continue making memes.
Join Maygane Janin and Jacob Berntsson as they discuss how memes have become an unconventional strategy for violent extremists to easily spread their ideologies. They are joined by Maik Fielitz, a researcher at the Jena Institute for Democracy and Civil Society and a fellow at the Centre for Analysis of the Radical Right, specialising in far-right extremism in Germany; and Lisa Bogerts, an expert in visual communication. Both are contributors to the 2019 book ‘Post-Digital Cultures of the Far Right’. They discuss how far-right violent extremists take advantage of the intrinsic virality of seemingly harmless online jokes to reach new audiences and penetrate mainstream culture.
The visual culture of far-right terrorism (Bogerts & Fielitz, 2020)
“Do you want meme war?” (Bogerts & Fielitz, 2018)
Digital fascism: challenges for the open society in times of social media (Fielitz & Marcks, 2019)
How memes are becoming the new frontier of information warfare (Ascott, 2020)
Cyber swarming, memetic warfare and viral insurgency (Goldenberg & Finkelstein, 2020)
We Analyzed How the “Great Replacement” and Far Right Ideas Spread Online. The Trends Reveal Deep Concerns (Ebner & Davey, 2019)
The far-right is weaponizing Instagram to recruit Gen Z (Bateman, 2019)
How the radical right weaponizes memes (Liyanage, 2020)
Meme warfare in the Swedish context (Davey, 2018)
Full list of resources on our website: https://www.techagainstterrorism.fm/far-right-violent-extremists-and-meme-culture/
Maik Fielitz (@maik_fielitz)