8 episodes

Welcome to SafeGround, the small organisation with big ideas working in disarmament, human security, climate change and victims of war. In our series 'Stay in Command', we talk about lethal autonomous weapons, the Australian context and why we mustn’t delegate decision making from humans to machines. We are part of the International Campaign to Stop Killer Robots.

If you want to know more look for us on Facebook, Twitter and Instagram - Australia Campaign to Stop Killer Robots. Or use the hashtag #AusBanKillerRobots.

Become part of the movement so we Stay in Command!

Stay in Command John Rodsted

    • Business

    Limits on Autonomous Weapons - ICRC Perspective

    In this episode of Stay in Command we showcase the ICRC's perspective on autonomous weapons and the need for limits. Our guest is weapons and disarmament expert Neil Davison from the ICRC's Department of Law and Policy at its Geneva headquarters, who brings both his own insight and the ICRC's view.
    Content in this Episode:
    ICRC mandate and approach to weapons issues [1:14]
    Humanitarian concerns of autonomous weapons [2:27]
    Legal Issues [5:40]
    Human Control as a core concept [8:14]
    Ethical perspective [10:36]
    Limits on autonomy [12:28]
    International talks and 'crunch-time' [16:59]
    If you have questions or concerns please contact us via info@safeground.org.au
    If you want to know more look for us on Facebook, Twitter and Instagram - Australia Campaign to Stop Killer Robots or use the hashtag #AusBanKillerRobots.
    Become part of the movement so we Stay in Command!
    For access to this and other episodes, along with the full transcription and relevant links and information, head to https://safeground.org.au/podcasts/ (safeground.org.au/podcasts).
    Our podcasts come to you from all around Australia and we would like to acknowledge the Traditional Owners throughout and their continuing connection to country, land, waters and culture.
    Stock audio provided by Videvo, downloaded from http://www.videvo.net/ (www.videvo.net)

    • 23 min
    Diplomatic Process and Progress on Killer Robots

    In this episode of Stay in Command we discuss the diplomatic process and the progress being made on lethal autonomous weapons systems. The episode features Elizabeth Minor from UK-based NGO Article 36, which works to prevent harm from weapons through stronger international standards and sits on the steering committee of the Campaign to Stop Killer Robots. We look at how this issue has progressed, unpack key themes in the current debate, and consider where the process must go.
    Content in this episode:
    Overview of this issue on the diplomatic stage to date [00:01:56]
    Human Control Unpacked [00:09:05]
    Human control in international debates [00:12:57]
    Notion of the so-called 'entire life cycle' of the weapon [00:14:50]
    False solution of proposed techno-fixes [00:18:35]
    Limitations of Article 36 Weapons Reviews [00:20:50]
    Debates around definition [00:25:58]
    Going beyond the guiding principles [00:29:28]
    Arriving at ‘Consensus recommendations’ [00:30:31]
    How a Legally Binding Instrument might look [00:33:26]
    How we get there - diplomatic avenues [00:37:06]
    The need for decisive action and leadership [00:38:42]
    If you have questions or concerns please contact us via info@safeground.org.au
    If you want to know more look for us on Facebook, Twitter and Instagram - Australia Campaign to Stop Killer Robots or use the hashtag #AusBanKillerRobots.
    Become part of the movement so we Stay in Command!
    For access to this and other episodes, along with the full transcription and relevant links and information, head to https://safeground.org.au/podcasts/ (safeground.org.au/podcasts).
    Our podcasts come to you from all around Australia and we would like to acknowledge the Traditional Owners throughout and their continuing connection to country, land, waters and culture.
    Stock audio provided by Videvo, downloaded from http://www.videvo.net (www.videvo.net)

    • 41 min
    Who is in Command?

    In today's episode, we speak with Paul Barratt AO.
    Paul Barratt was an Australian Government insider for well over 30 years. He entered the Australian Public Service in 1966, as Australia was committing troops to the war in Vietnam. Originally trained in physics, he spent his early career examining issues arising from China’s emergence as a nuclear power, work that brought him into the intelligence community during those years.
    As his career progressed, he held senior positions in the Department of Trade, the Department of Primary Industries and Energy and the Business Council of Australia. Through 1998 and 1999 he was Secretary of the Department of Defence. It was this role that put him at odds with the government, leading him to leave the public service and become a vocal critic of how Australia goes to war.
    He is the co-founder and current President of Australians for War Powers Reform, whose aim is to change the legal decision-making path that allows the serving Prime Minister to commit Australia to war without any debate or oversight.
    He has been an insider at the highest levels of the public service and government and has seen the power entrusted to leaders misused. This he aims to change.
    This discussion centres on reforming that power, moving from the ‘Captain’s Call’ to parliamentary oversight, and then turns to the development of Lethal Autonomous Weapons and how the decision to use them could be made.

    • 33 min
    A Commander’s View on Lethal Autonomous Weapons

    Stay in Command. Banning Lethal Autonomous Weapons.
    An interview with Major General Michael Smith AO (ret).
    A commander’s perspective on Lethal Autonomous Weapons.
    Interviewed by John Rodsted.
    In today’s ‘Stay in Command’ episode, Maj. Gen. Mike Smith (Ret.) and John Rodsted from SafeGround explore the issues surrounding the development of Lethal Autonomous Weapons with Artificial Intelligence. The mechanics, ethics and application of this new technology paint a disturbing picture of a world where machines decide who will live and who will die.
    Hear directly from a commander’s point of view.

    Major General Michael Smith (ret) has spent his life leading others. He graduated from the Royal Military College, Duntroon, as Dux of his year in 1971 and went on to command everything from a platoon to a brigade. His 34 years in the Australian Army placed him in some complicated situations.
    He served as Australia’s Defence Adviser in Cambodia in 1994, and throughout 1999 was Director-General for East Timor. He was appointed the first Deputy Force Commander of the United Nations Transitional Administration in East Timor (UNTAET) in 2000-2001, and in recognition of this service was awarded the Order of Australia.
    After the army he was CEO of the Australian refugee agency Austcare from 2002 until 2008, then set up and ran the Australian Civil-Military Centre from 2008 until 2011. He is the last President of the United Nations Association of Australia and is the current chair of the Gallipoli Scholarship.
    Michael holds a Master’s degree in International Relations from the Australian National University and a Bachelor of Arts in History from the University of New South Wales, and is a Fellow of the Australian College of Defence and Strategic Studies. He is also a graduate of the Cranlana leadership program and the Company Directors Course at the University of New England.
    Today we talk about leadership, both civil and military, and the complexities of command responsibility in regard to Lethal Autonomous Weapons.

    Content in this episode:
    "The Buck Stops Here" [00:02:33]
    Legal Framework for Commanding in Conflict [00:05:10]
    Introducing Lethal Autonomous Weapons to the battlefields [00:08:54]
    The Nature of Wars [00:16:12]
    A Possible Arms Race? [00:21:51]
    Technology Development [00:25:05]
    The Fog of War Continues [00:32:43]
    Making The Decision To Go To War [00:38:27]
    Banning these Lethal Autonomous Weapons? [00:41:44]

    • 47 min
    The Tech Perspective with Lizzie Silver

    Technological Aspects and Tech Industry
    This episode of Stay in Command focuses on the technological dimensions and concerns of lethal autonomous weapons, as well as their implications for the tech industry. Our guest, Dr Lizzie Silver, is a Senior Data Scientist at Melbourne-based AI company Silverpond.
    Content in this episode:
    Troubling reality of these weapons [1:49]
    Problems with fully autonomous weapons - explainability [3:49]
    Facial recognition and bias [7:11]
    Military benefits from technical point of view [11:36]
    Machines and the kill decision [15:01]
    Hacking [16:30]
    Positive uses of AI and funding battle [17:10]
    Challenge of Dual Use [20:45]
    Regulation: Treaty, Company Policy, Individual Actions [22:16]
    If you have questions or concerns please contact us via info@safeground.org.au
    If you want to know more look for us on Facebook, Twitter and Instagram - Australia Campaign to Stop Killer Robots or use the hashtag #AusBanKillerRobots.
    Become part of the movement so we Stay in Command!
    For access to this and other episodes, along with the full transcription and relevant links and information, head to https://safeground.org.au/podcasts/ (safeground.org.au/podcasts).
    Transcript:
    Welcome to SafeGround, the small organisation with big ideas working in disarmament, human security, climate change and refugees. I’m Matilda Byrne.
    Thank you for tuning in to our series Stay in Command where we talk about lethal autonomous weapons, the Australian context and why we mustn’t delegate decision making from humans to machines.
    This episode we’re looking at the “Tech Perspective”. We are going to discuss the technological concerns of lethal autonomous weapons and their implications for the tech industry.
    And so today I have a great guest with me in Dr Lizzie Silver. Lizzie is a Senior Data Scientist at Silverpond, an AI company based in Melbourne, which is also where I am coming to you from - so welcome Lizzie, thanks so much for joining us today.
    Lizzie Silver [00:00:52]: Thanks for having me.
    Matilda Byrne: Before we jump in, I’m just going to talk a bit about the definition of killer robots in case any of our listeners are unfamiliar with exactly what it is we’re talking about.
    So killer robots, or fully autonomous weapons, are weapons that have no human control over the decision making: when they select a target and engage it - that is, decide to deploy lethal force on that target - there is no human involved in that process; it is based entirely on AI and algorithms. With these fully autonomous weapons there are lots of concerns spanning a number of different areas. Today we are going to go into the technological concerns in particular, because we have Lizzie and her expertise, but there are also moral, ethical, legal and global security concerns - a whole host of concerns really.
    What is the most concerning thing about killer robots? [00:01:49]
    Matilda Byrne: And what I’m interested in, Lizzie, just to start off with, is if you could tell us what it is about fully autonomous weapons that you find the most worrying - what about them makes you driven to oppose their development?
    Lizzie Silver: The really fundamental issue with these systems is that you can’t give a guarantee on how they’re going to behave. With humans we can’t give a guarantee on how they’re going to behave either, but that’s why we have all these mechanisms for holding a human accountable. You can’t hold an algorithm accountable in any meaningful way. So what you would like to do is find a way to characterise how it’s going to behave in every situation, but the thing is a conflict situation is just too complex. There are too many potential inputs and outputs, different scenarios that could confront the AI. You’re never going to get through all of them. You’re never going to be able to fully characterise the space. So what you’d like to say is ‘Ok, on thi

    • 27 min
    Student views on Killer Robots

    Yennie Sayle is completing a Bachelor of International Studies at RMIT University and is the Youth Engagement intern with SafeGround for the Campaign to Stop Killer Robots Australia.
    She sits down with three other students from different areas of study and with different experiences to talk about killer robots: the level of exposure to the topic in their degrees, their views and concerns, their thoughts on university involvement in the development of these weapons, raising awareness among students, and more.

    • 18 min
