Death by Algorithm

Sune With
A series on autonomous weapons systems, drones and AI in the military domain. Experts from various disciplines share their research and discuss the black box, responsibility, human-machine interaction, and the future of legal and ethical frameworks for AI in war. How is war regulated? Can the ethics of war be programmed into machines? Does it change how we fight? Can war be cleaned up by technology? How can soldiers understand the systems? Will AI systems be the commanders of tomorrow? Why not just let the robots fight? Episodes combine narration and interviews and are not in chronological order.

Episodes

  1. From Practice to (Auto)Norms feat. Ingvild Bode

    MAY 20

    Practices create norms, and words can shape reality. This applies to the debate on autonomous weapons and AI in the military domain. It matters whether the technology precedes public deliberation and regulation, and it matters whether we refer to AI in the military as "decision support systems", "emerging technology", "autonomous weapons", or "killer robots". Professor Ingvild Bode, recipient of the Danish Elite Research Prize 2025, describes her research project, AutoNorms, and how it tracks discourse and development on autonomous weapons. She also shares her perspectives on the black box, meaningful human control, ethical machines, and the future of regulation.

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover art: Sebastian Gram

    References and literature:
    - AutoNorms, PI Ingvild Bode (Accessed May 13, 2025). https://www.autonorms.eu/
    - Arai, Koki; Matsumoto, Masakazu, 2023, "Public perception of autonomous lethal weapons systems", AI and Ethics (2024) 4:451-462. https://link.springer.com/article/10.1007/s43681-023-00282-9
    - Bode, Ingvild, 2024, "Emergent Normativity: Communities of Practice, Technology, and Lethal Autonomous Weapons Systems", Global Studies Quarterly 4(1). https://doi.org/10.1093/isagsq/ksad073
    - Bode, Ingvild; Nadibaidze, Anna, 2024, "Autonomous Drones". In J. P. Rogers (ed.), De Gruyter Handbook on Drone Warfare, pp. 369-384, De Gruyter.
    - Bode, Ingvild; Bhila, Ishmael, September 3, 2024, "The problem of algorithmic bias in AI-based military decision support", Humanitarian Law and Policy, ICRC. https://blogs.icrc.org/law-and-policy/2024/09/03/the-problem-of-algorithmic-bias-in-ai-based-military-decision-support-systems/
    - Bode, Ingvild, 2023, "Practice-Based and Public-Deliberative Normativity: Retaining Human Control over the Use of Force", European Journal of International Relations 29(4), 990-1016. https://doi.org/10.1177/13540661231163392
    - Bode, Ingvild; Watts, Tom, 2023, Loitering Munitions and Unpredictability: Autonomy in Weapon Systems and Challenges to Human Control. Odense, London: SDU Center for War Studies, Royal Holloway Centre for International Security.
    - Campaign to Stop Killer Robots, 2021, "Killer Robots: Survey Shows Opposition Remains Strong", Human Rights Watch (Accessed May 14, 2025). https://www.hrw.org/news/2021/02/02/killer-robots-survey-shows-opposition-remains-strong
    - Deeney, Chris, 2019, "Six in Ten (61%) Respondents Across 26 Countries Oppose the Use of Lethal Autonomous Weapons Systems", Ipsos (Accessed May 14, 2025). https://www.ipsos.com/en-us/news-polls/human-rights-watch-six-in-ten-oppose-autonomous-weapons
    - HuMach, PI Ingvild Bode (Accessed May 13, 2025). https://www.sdu.dk/en/forskning/forskningsenheder/samf/cws/cws-activities/projects/humach
    - IEEE SA Research Group on Issues of Autonomy and AI in Defense Systems, 2024, "A Framework for Human Decision Making Through the Lifecycle of Autonomous and Intelligent Systems in Defense Applications". New York, NY: IEEE Standards Association (Accessed April 2, 2025). https://ieeexplore.ieee.org/document/10707139
    - IEEE Standards Association (Accessed April 2, 2025). https://standards.ieee.org/
    - Nadibaidze, Anna; Bode, Ingvild; Zhang, Qiaochu, 2024, "AI in Military Decision Support Systems: A Review of Developments and Debates", Center for War Studies, SDU.
    - Overton Window, Wikipedia (Accessed May 13, 2025). https://en.wikipedia.org/wiki/Overton_window
    - Renic, Neil; Christenson, Johan, 2024, "Drones, the Russo-Ukrainian War, and the Future of Armed Conflict", CMS Report. https://cms.polsci.ku.dk/english/publications/drones-the-russo-ukrainian-war-and-the-future-of-armed-conflict/
    - The Overton Window, Mackinac Center for Public Policy (Accessed May 13, 2025). https://www.mackinac.org/OvertonWindow

    Music: Sofus Forsberg

    56 min
  2. The Analytical Engine feat. Lise Bach Lystlund, Jonas Nygreen, Lauritz Munch & Joshua Hatherley

    MAY 16

    So, why are autonomous weapons systems such a big deal? Aren't they just weapons like the rest of them? Well, the black box problem in algorithmically controlled systems raises challenges different from those of "fire and forget" munitions. Four AI experts explain. The first part of this episode clarifies what an algorithm is, when the black box appears, and why it matters. The second part explains how the data that algorithms rely on can be biased, and why constant maintenance and updates do not fix the problem.

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover art: Sebastian Gram

    References and literature:
    - Algorithmic bias, Wikipedia (Accessed April 8, 2025). https://en.wikipedia.org/wiki/Algorithmic_bias
    - Black Box, Wikipedia (Accessed April 8, 2025). https://en.wikipedia.org/wiki/Black_box
    - Blouin, Lou; Rawashdeh, Samir, March 2023, "AI's mysterious 'black box' problem, explained", News, University of Michigan-Dearborn (Accessed April 8, 2025). https://umdearborn.edu/news/ais-mysterious-black-box-problem-explained
    - Co-Coders (Accessed April 8, 2025). https://cocoders.dk/
    - ExekTek (Accessed April 8, 2025). https://exektek.com/
    - Hatherley, J. J., 2020, "Limits of Trust in Medical AI", Journal of Medical Ethics, 46(7), 478-481.
    - Hatherley, J.; Sparrow, R.; Howard, M., 2024, "The Virtues of Interpretable Medical AI", Cambridge Quarterly of Healthcare Ethics, 33(3), 323-332.
    - Hatherley, J., 2025, "A Moving Target in AI-Assisted Decision-Making: Dataset Shift, Model Updating, and the Problem of Update Opacity", Ethics and Information Technology, 27, 20. https://link.springer.com/article/10.1007/s10676-025-09829-2
    - Hyperight, "The Black Box: What We're Still Getting Wrong about Trusting Machine Learning Models" (Accessed April 8, 2025). https://hyperight.com/ai-black-box-what-were-still-getting-wrong-about-trusting-machine-learning-models/
    - Lystlund, Lise Bach (Accessed April 8, 2025). https://cocoders.dk/om-os/
    - Nygreen, Jonas (Accessed April 8, 2025). https://www.linkedin.com/in/jonasnygreen/

    Music: Sofus Forsberg

    1h 15m
  3. Minotaur Warfare feat. Robert Sparrow

    MAY 9

    Robert Sparrow is a philosophical pioneer in the field of autonomous weapons. We discuss the current debate and the developments since his famous articles "Killer Robots" and "Robots and Respect" were published. We explore the notion of mala in se, evil in itself, and Rob presents his idea of Minotaur warfighting, or AI commanders. Rob also gives his perspectives on the black box, meaningful human control, programming ethics into machines, and the value of fundamental human respect and recognition in war.

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover art: Sebastian Gram

    References and literature:
    - Dige, Morten, 2012, "Explaining the Principle of Mala in Se", Journal of Military Ethics, 11:4, 318-332.
    - Orend, Brian, 2016, "War", The Stanford Encyclopedia of Philosophy (Spring 2016 Edition), Edward N. Zalta (ed.). https://plato.stanford.edu/archives/spr2016/entries/war/ - 2.2
    - Scharre, Paul, 2016, "Centaur warfighting: the false choice of humans vs. automation", Temp. Int'l & Comp. LJ, 30, 151-165.
    - Scharre, Paul, 2018, "Army of None: Autonomous Weapons and the Future of War", W. W. Norton & Company.
    - SHAPE, 2025, "NATO Acquires AI-Enabled Warfighting System", NATO (Accessed April 14, 2025). https://shape.nato.int/news-releases/nato-acquires-aienabled-warfighting-system-
    - Sparrow, Robert, 2016, "Robots and Respect: Assessing the Case Against Autonomous Weapon Systems", Ethics & International Affairs, 30, no. 1, pp. 93-116.
    - Sparrow, Robert, 2007, "Killer Robots", Journal of Applied Philosophy, Vol. 24, No. 1, pp. 62-77.
    - Sparrow, Robert, 2021, "Why machines cannot be moral", AI & Society: Journal of Knowledge, Culture and Communication. https://doi.org/10.1007/s00146-020-01132-6
    - Sparrow, Robert; Henschke, Adam, 2023, "Minotaurs, Not Centaurs: The Future of Manned-Unmanned", Parameters 53(1), The US Army War College Quarterly, pp. 115-130.
    - Sparrow, Robert, 2012, "ONE, Riskless Warfare Revisited: Drones, Asymmetry and the Just Use of Force". In Ethics of Drone Strikes: Restraining Remote-Control Killing, Edinburgh: Edinburgh University Press, pp. 10-30. https://doi.org/10.1515/9781474483599-004
    - Strawser, Bradley (ed.), 2013, "Killing by Remote Control: The Ethics of an Unmanned Military", Oxford University Press.
    - TERMA, 2025, "Multi-Domain" (Accessed April 14, 2025). https://www.terma.com/products/multi-domain/

    Music: Sofus Forsberg

    55 min
  4. Social War-bots? feat. Johanna Seibt

    MAY 5

    What is a social robot, and are autonomous drones social agents? What should a robot do or not do, and how is that connected to autonomous weapons? Johanna Seibt, professor of philosophy and co-founder of the field of robophilosophy, gives her insights on how interactions with robots that can act like agents impact our lives. What does it mean for our mutual respect on the battlefield if we let robots kill? What is phronesis, and in what way can an AI or a robot be programmed ethically? And what is the Collingridge dilemma? Besides offering her expertise on human-robot interaction, Johanna Seibt reflects on the black box problem, meaningful human control, existential risk and the future of regulating AI.

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover art: Sebastian Gram

    References and literature:
    - Artificial Intelligence Act, Wikipedia (Accessed April 28, 2025). https://en.wikipedia.org/wiki/Artificial_Intelligence_Act
    - Collingridge dilemma, Wikipedia (Accessed April 27, 2025). https://en.wikipedia.org/wiki/Collingridge_dilemma
    - "EU AI Act: first regulation on artificial intelligence", February 19, 2025, European Parliament (Accessed April 28, 2025). https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence
    - The Research Unit for Robophilosophy (Accessed April 27, 2025). https://cas.au.dk/en/robophilosophy
    - Sullins, John P., 2026, "Automated Ethical Practical Reasoning: The Problem of Artificial Phronesis", Chapter 10 in J. Seibt, R. Hakli, M. Nørskov (eds.), Robophilosophy—Philosophy of, for and by Social Robotics, MIT Press (not yet published).

    Music: Sofus Forsberg

    57 min
  5. Teaching Hearts and Minds feat. Lasse Kronborg

    APR 24

    Why would a military use an autonomous weapons system? What do they bring to the battlefield in Ukraine? Is AI in the military a revolution? And what, and why, should we teach future officers about tomorrow's weapons systems? In addition to giving a hands-on account from within the military, Lasse Kronborg, major at the Royal Danish Defence College, offers his perspectives on the black box, meaningful human control, how machines could be programmed, and what we can expect in the future.

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover art: Sebastian Gram

    References and literature:
    - Department of Defense, 2023, "DoD Directive 3000.09, Autonomy in Weapon Systems" (Accessed April 2, 2025). https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf
    - General Assembly, United Nations, December 28, 2023, Resolution 78/241, Lethal Autonomous Weapons Systems (Accessed April 2, 2025). https://docs.un.org/en/a/res/78/241
    - IEEE SA Research Group on Issues of Autonomy and AI in Defense Systems, 2024, "A Framework for Human Decision Making Through the Lifecycle of Autonomous and Intelligent Systems in Defense Applications". New York, NY: IEEE SA (Accessed April 2, 2025). https://ieeexplore.ieee.org/document/10707139
    - IEEE Standards Association (Accessed April 2, 2025). https://standards.ieee.org/
    - Submission of the United States of America, 2023, Resolution 78/241 "Lethal Autonomous Weapons Systems", adopted by the United Nations General Assembly on December 22, 2023 (Accessed April 2, 2025). https://docs-library.unoda.org/General_Assembly_First_Committee_-Seventy-Ninth_session_(2024)/78-241-US-EN.pdf

    Music: Sofus Forsberg

    50 min
  6. A Taste of Tragedy feat. Neil Renic

    APR 10

    How does technology change the way we fight or think about war? Are autonomous weapons systems the silver bullet that will "solve war" and make it nice and clean? What impact have drones had on the war in Ukraine? Are autonomous weapons already in use? What can we learn from tragedy when it comes to taking responsibility for how we fight? Why don't we just let the robots do the fighting? In addition to presenting his research, Neil Renic, lecturer in military ethics at the University of New South Wales, offers his perspectives on the black box problem, the idea of meaningful human control over weapons systems, programming ethics into machines, and the future development and regulation of autonomous weapons.

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover art: Sebastian Gram

    References and literature:
    - Dovidka.info, 2025, "Attack by drones dropping explosives: how to protect yourself" (Accessed March 12, 2025). https://dovidka.info/en/in-the-combat-area/
    - Renic, Neil, 2024, "The Cost of Atrocity: Strategic Implications of Russian Battlefield Misconduct in Ukraine", Ethics and International Affairs. doi:10.1017/S0892679424000054
    - Renic, Neil; Pelopidas, Benoit, 2024, "The Tragicized Politics of Nuclear Weapons and Armed Drones and the Making of Unaccountability", Ethics and International Affairs. http://doi.org/10.1017/S0892679424000145
    - Renic, Neil; Christenson, Johan, 2024, "Drones, the Russo-Ukrainian War, and the Future of Armed Conflict", CMS Report. https://cms.polsci.ku.dk/english/publications/drones-the-russo-ukrainian-war-and-the-future-of-armed-conflict/
    - Renic, Neil, 2024, "Tragic Reflection, Political Wisdom, and the Future of Algorithmic War", Australian Journal of International Affairs. http://doi.org/10.1080/10357718.2024.2328299
    - Renic, Neil; Schwarz, Elke, 2023, "Crimes of Dispassion: Autonomous Weapons and the Moral Challenge of Systematic Killing", Ethics and International Affairs. http://doi.org/10.1017/S0892679423000291
    - Renic, Neil, 2023, "Remote Warfare: Trends, Drivers, Limits". In A. Gruszczak, S. Kaempf (eds.), Routledge Handbook of the Future of Warfare.
    - Renic, Neil, 2022, "Superweapons and the Myth of Technological Peace", European Journal of International Relations. https://doi.org/10.1177/13540661221136764
    - Renic, Neil, 2020, "Asymmetric Killing: Risk Avoidance, Just War, and the Warrior Ethos", Oxford University Press. https://global.oup.com/academic/product/asymmetric-killing-9780198851462?cc=de&lang=en&
    - Star Trek, 1967, "A Taste of Armageddon". https://en.wikipedia.org/wiki/A_Taste_of_Armageddon
    - United Nations, February 11, 2025, "Short-range drones: The deadliest threat to civilians in Ukraine" (Accessed March 12, 2025). https://news.un.org/en/story/2025/02/1160016

    Music: Sofus Forsberg

    1h 5m
  7. Just War feat. Iben Yde and Claus Borg Reinholdt

    APR 10

    Just war means a justified war. But what justifies a war? Can letting machines kill be justified? How and where is war regulated? And how are the philosophical and legal frameworks of war connected? Iben Yde, lawyer and consultant at Rethink Advisory, explains and gives an account of her role as a military lawyer and advisor. Claus Borg Reinholdt, correspondent for TV2 Denmark, tells a horrifying story of a drone siege of Kherson, Ukraine. Besides explaining how war is regulated, Iben Yde provides her perspectives on the black box problem, the notion of meaningful human control, programming ethics into machines, and the future of regulating autonomous weapons through the UN's Convention on Certain Conventional Weapons (CCW) and its Group of Governmental Experts (GGE).

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover artwork: Sebastian Gram

    References and literature:
    - Arkin, Ronald C., 2009, "Governing Lethal Behavior in Autonomous Robots", CRC Press.
    - Arkin, Ronald C., 2010, "The Case for Ethical Autonomy in Unmanned Systems", Journal of Military Ethics, 9:4, 332-341.
    - Convention on Certain Conventional Weapons (CCW) (Accessed February 25, 2025). https://disarmament.unoda.org/the-convention-on-certain-conventional-weapons/
    - Geneva Conventions and the rules of war with commentaries, ICRC (Accessed February 25, 2025). https://www.icrc.org/en/law-and-policy/geneva-conventions-and-their-commentaries
    - Group of Governmental Experts, UN (Accessed February 25, 2025). https://disarmament.unoda.org/group-of-governmental-experts/
    - Orend, Brian, 2016, "War", The Stanford Encyclopedia of Philosophy (Spring 2016 Edition), Edward N. Zalta (ed.) (Accessed February 25, 2025). https://plato.stanford.edu/archives/spr2016/entries/war/ - 2.2
    - REAIM Blueprint for Action, 2024 (Accessed February 25, 2025). https://www.reaim2024.kr/home/reaimeng/board/bbsDetail.do?encMenuId=4e57325766362f626e5179454e6d6e4d4a4d33507a773d3d&encBbsMngNo=366e794c7a644d756342425668444f393053755142673d3d&encBbsNo=6f784e4542386f7735767465766a6531556f4b6149413d3d&ctlPageNow=1&schKind=bbsTtlCn&schWord=%23this
    - Reinholdt, Claus Borg, November 11, 2024, news story on drone attacks in Kherson, Ukraine (Accessed February 28, 2025). https://nyheder.tv2.dk/video/2024-11-11-russerne-terroriserer-lokalbefolkningen-med-droneangreb-fortaeller-tv-2-korrespondent-6364543133112
    - Responsibility to Protect (R2P), UN, 2005 (Accessed February 25, 2025). https://www.un.org/en/genocide-prevention/responsibility-protect/about
    - Rethink Advisory, 2025 (Accessed February 25, 2025). https://www.rethinkadvisory.dk/
    - United Nations Charter, June 26, 1945 (Accessed February 25, 2025). https://www.un.org/en/about-us/un-charter
    - United Nations, February 11, 2025, "Short-range drones: The deadliest threat to civilians in Ukraine" (Accessed March 12, 2025). https://news.un.org/en/story/2025/02/1160016
    - Walzer, Michael, 1977/2015, "Just and Unjust Wars: A Moral Argument with Historical Illustrations", Basic Books, fifth edition.
    - Yde, Iben; Galasz; Dahlberg, 2021, "Smart krig – Militær anvendelse af kunstig intelligens" [Smart War – Military Use of Artificial Intelligence], DJØF's forlag.

    Music: Sofus Forsberg

    57 min
  8. Stop the Killer Robots feat. Charlotte Akin

    APR 10

    Who thinks autonomous weapons systems are a good idea? Well, 250+ NGOs don't. Charlotte Akin, Projects & Logistics Officer at the coalition Campaign to Stop Killer Robots, explains how the coalition works and its main objections to autonomous weapons. Is there any chance for regulation? Besides presenting the campaign's work, Charlotte gives her perspectives on the black box, meaningful human control, programming ethics into robots, and the prospects for regulation. António Guterres and His Holiness Pope Francis have a few words of their own to say on the matter.

    Shownotes:
    Producer and host: Sune With, sunewith@cas.au.dk
    Cover artwork: Sebastian Gram

    References and literature:
    - António Guterres's speech at the UN Security Council, December 2024. https://www.youtube.com/watch?v=3YuOC5wQk80
    - Automated Decision Research. https://automatedresearch.org/
    - Charlotte Akin, Campaign to Stop Killer Robots. https://www.stopkillerrobots.org/our-team/
    - Mares, Courtney, July 10, 2024, "Pope Francis tells AI leaders: No machine should ever choose to take human life", EWTN Vatican. https://www.ewtnvatican.com/articles/pope-francis-tells-ai-leaders-no-machine-should-ever-choose-to-take-human-life-3065
    - Stop Killer Robots. https://www.stopkillerrobots.org/
    - Stop Killer Robots member organisations. https://www.stopkillerrobots.org/a-global-push/member-organisations/
    - The Manila Times, June 16, 2024, "Pope Francis calls for ban on 'lethal autonomous weapons' at G7 meeting". https://www.youtube.com/shorts/OEKRNA6OMAo
    - UN, António Guterres's stance on autonomous weapons. https://disarmament.unoda.org/the-convention-on-certain-conventional-weapons/background-on-laws-in-the-ccw/
    - UN, 2023, different countries' definitions of autonomous weapons. https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-Group_of_Governmental_Experts_on_Lethal_Autonomous_Weapons_Systems_(2023)/CCW_GGE1_2023_CRP.1_0.pdf
    - UN, resolution A/C.1/79/L.77. https://reachingcriticalwill.org/images/documents/Disarmament-fora/1com/1com24/resolutions/L77.pdf

    Music: Sofus Forsberg

    1h 2m
  9. The Frontier of Automated Killing feat. Elke Schwarz

    2D AGO

    Is AI just a tool? What is the difference between human cognition and systems logic? How do the automation and routinisation of war change how we fight and think of our enemy? Do Israel's target-generating AI systems work in the war in Gaza, and what are the results so far? Professor of Political Theory Elke Schwarz discusses her research, the black box, meaningful human control, programming ethics into machines, and the future of regulating autonomous weapons systems.

    Shownotes:
    Producer: Sune With, sunewith@cas.au.dk
    Cover artwork: Sebastian Gram

    References and literature:
    - Abraham, Yuval, November 30, 2023, "'A mass assassination factory': Inside Israel's calculated bombing of Gaza", +972 Magazine. https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/
    - Abraham, Yuval, April 3, 2024, "'Lavender': The AI machine directing Israel's bombing spree in Gaza", +972 Magazine. https://www.972mag.com/lavender-ai-israeli-army-gaza/
    - Davies, Harry; McKernan, Bethan; Sabbagh, Dan, December 1, 2023, "'The Gospel': how Israel uses AI to select bombing targets in Gaza", The Guardian. https://www.theguardian.com/world/2023/dec/01/the-gospel-how-israel-uses-ai-to-select-bombing-targets
    - Frankel Pratt, Simon, May 2, 2024, "When AI Decides Who Lives and Dies", Foreign Policy. https://foreignpolicy.com/2024/05/02/israel-military-artificial-intelligence-targeting-hamas-gaza-deaths-lavender/
    - Gritten, David, October 9, 2023, "Israel's military says it fully controls communities on Gaza border", BBC. https://www.bbc.com/news/world-middle-east-67050127
    - The Guardian, April 3, 2024, "Israel Defence Forces' response to claims about the use of 'Lavender' AI database in Gaza". https://www.theguardian.com/world/2024/apr/03/israel-defence-forces-response-to-claims-about-use-of-lavender-ai-database-in-gaza
    - Krauss, Joseph, February 5, 2025, "Gaza is in ruins, and it's unclear how it will be rebuilt", Associated Press. https://apnews.com/article/israel-hamas-war-gaza-strip-reconstruction-trump-d6a6ff45583b7959403a8615469866d5
    - McKernan, Bethan; Davies, Harry, April 3, 2024, "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets", The Guardian. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes
    - No Comment TV, October 9, 2023, "'We are fighting human animals', said Israeli Defence Minister Yoav Gallant" (Accessed March 27, 2025). https://www.youtube.com/watch?v=ZbPdR3E4hCk
    - Schwarz, Elke, March 14, 2024, "Devalued Humanity: The Status of Human Life in Times of Nihilistic War", Opinio Juris. https://opiniojuris.org/2024/03/14/devalued-humanity-the-status-of-human-life-in-times-of-nihilistic-war/
    - Schwarz, Elke, 2021, "Autonomous Weapons Systems, Artificial Intelligence, and the Problem of Meaningful Human Control", The Philosophical Journal of Conflict and Violence, Vol. V, Issue 1.
    - Schwarz, Elke; Renic, Neil, 2023, "Crimes of Dispassion: Autonomous Weapons and the Moral Challenge of Systematic Killing", Ethics and International Affairs, 37, no. 3, pp. 321-343.
    - Schwarz, Elke, 2018, "Death Machines: The Ethics of Violent Technologies", Manchester University Press.
    - Schwarz, Elke, April 12, 2024, "Gaza war: Israel using AI to identify human targets raising fears that innocents are being caught in the net", The Conversation. https://theconversation.com/gaza-war-israel-using-ai-to-identify-human-targets-raising-fears-that-innocents-are-being-caught-in-the-net-227422
    - UN, OHCHR, November 14, 2024, "UN Special Committee finds Israel's warfare methods in Gaza consistent with genocide, including use of starvation as weapon of war". https://www.ohchr.org/en/press-releases/2024/11/un-special-committee-finds-israels-warfare-methods-gaza-consistent-genocide
    - UN News, November 22, 2024, "2024 deadliest year ever for aid workers, UN humanitarian office reports". https://news.un.org/en/story/2024/11/1157371

    Music: Sofus Forsberg

    1h 14m

