The Foil

The Foil Podcast

The Foil podcast discusses the data age—what it means for you and what it could mean for us all. Kristi Mansfield and Adam Peaston talk with leaders in the fields of data science, machine learning, artificial intelligence, public policy and social change about the power of data as a tool to benefit society. We also discuss the opportunities and risks, and how we can deal with them. www.thefoil.ai www.seerdata.ai Hosted on Acast. See acast.com/privacy for more information.

  1. Regulation to defend democracy from Big Tech

    31/05/2022

    Regulation to defend democracy from Big Tech

    Chris Cooper is a cultural anthropologist and Executive Director at Reset Australia, the Australian affiliate of the international Reset network and think tank working to drive public policy that tackles digital threats to democracy. Chris is also Senior Campaign Director at Purpose, an international social impact agency supporting leading activists and companies to develop strategies that can shift policies and change public narratives.

    Chris comments on the current state of Australian tech regulation. We discuss how to identify bad actors and bad content online. Chris shares his definitions of "mis" and "dis" information, a key focus for Reset: both are false information that is shared, but misinformation is shared without the sharer knowing it is false, whereas disinformation is shared by a sharer who knows it is false.

    Chris describes efforts at Reset to build on the UK's "age-appropriate design code" and its "best interests principle", which requires that digital platforms children are likely to use prove they are designed and operated with the best interests of those children in mind.

    Chris relates Reset's key objectives for policy change:

      1. Regulation on digital platform accountability and responsibility.
      2. Regulation to eliminate risks from systems and processes, giving regulators more oversight of the design of systems in use by companies.
      3. Regulation to address community and societal risks; one person misinformed is not so problematic, but a fragmented society consuming two different versions of the truth is a problem for democracy.
      4. Establishment of regulatory responsibility in these areas, with a new regulator or an existing agency.
      5. Equipping regulators with powers to enforce regulation, with penalties proportionate to the scale of harms caused.

    We ask Chris for his thoughts on the issue of foreign interference in Australia's democratic system. Chris makes the case for increased transparency from digital platforms that are a significant source of information for the Australian citizenry. Chris asserts that polarisation of public opinion on critical issues, as well as the proliferation of hate speech and racism, is exacerbated by social media, and that regulation is required to address this. https://au.reset.tech/ https://www.purpose.com/ https://seerdata.ai

    59 min
  2. Disinformation, Democracy & Elections

    20/05/2022

    Disinformation, Democracy & Elections

    Katie Harbath is a global leader at the intersection of elections, democracy, civics and tech. Katie was a public policy director at Facebook for 10 years and is credited with building out and leading a global team responsible for managing elections. She played a significant role in getting governments and elected officials around the world - at the local, regional and national levels - to use Facebook and Instagram to connect and engage with constituents. In this Episode, we delve into what you need to know on the eve of the Australian Federal Election. Katie helps us understand the dilemmas, hard trade-offs and decisions behind the social media platform products and policies that set the rules for managing the spread of misinformation, disinformation and mal-information. She talks us through the impact of data and digital on elections and democracy. We explore Elon Musk's announcement of his purchase of Twitter. Katie calls for action and plans to build guardrails for social media platforms that protect integrity and reduce harm to democracy. We discuss the need for leaders, product owners and campaigners to admit what has and hasn't worked to reduce bad outcomes that degrade democracy. Katie discusses product features and legislation for protection. She gives advice on the behaviours we can all adopt to reduce the spread of misinformation, and talks about what we can expect in the future. www.anchorchange.com www.seerdata.ai

    47 min
  3. Responsible Tech & Human Rights in AI

    27/04/2022

    Responsible Tech & Human Rights in AI

    In the first Episode of our Responsible Tech Series in the lead-up to the Australian Federal Election, we speak with Edward Santow, Industry Professor for Responsible Tech at the University of Technology Sydney. Prior to his current role, Ed was the Australian Human Rights Commissioner. During his tenure, he led the world's largest public consultation on human rights and technology, and published a public report with recommendations for the development of responsible tech. In this Episode, we talk with Ed about his early experiences working as a lawyer in community legal services, where he saw first-hand the impact of tech applications gone wrong in policing. We discuss the pivotal moment when public attitudes shifted from complacency to real concern about the responsible use of data and tech: when Cambridge Analytica used personal data belonging to millions of Facebook users, collected without their consent, to provide analytical assistance to the 2016 presidential campaign of Donald Trump. Ed outlines the three key vectors for responsible tech: the law, training and design. We explore regulation and legislation as it currently exists, and his view that through enforcement of the current law, "80% of problems would go away." Ed presents the recommendation of an impact assessment before AI is used for automated decision-making. We discuss future public expectations and the challenge for policy makers. Ed highlights the application of AI in today's business and public sector context, noting that 85% of AI projects fail, and why this is the case. We discuss facial recognition technology and its risks, and the need to build data capabilities across society in the data and digital age. https://profiles.uts.edu.au/Edward.Santow seerdata.ai

    47 min
  4. Quorum Breaker

    19/04/2022

    Quorum Breaker

    State Rep. Claudia Ordaz Perez represents Texas House District 76 in El Paso County. She is a former Mayor Pro Tempore and City Councilwoman for the City of El Paso, where she was an advocate for working parents and family caregivers. At City Council, she was successful in creating local policies on living wages for workers, local park enhancements for children, funding for new municipal police and fire department infrastructure, local animal shelter improvements, and investment opportunities to expand job growth in the Borderplex region. In 2021, Rep. Ordaz Perez was among a group of Texas Democrats who broke quorum to halt a legislative session in Texas and fight a controversial voting bill. The bill added new identification requirements for voting by mail, banned 24-hour voting and drive-through voting, and established uniform voting hours in the state. Republicans argued it was needed to ensure election integrity. Democrats said the proposed rules disproportionately affected minority voters, and they fled Texas to break quorum as a result. Busting the quorum isn't unheard of; in fact, it has happened at least two other times in Texas political history. But it is considered a nuclear option, a last resort when debate has shut down and one side believes it's being railroaded. As their departure captured attention around the world, the Texas Democrats' drastic move to break quorum was hard to ignore. And while they may not have spurred immediate federal change in their favour, this dramatic walkout halfway across the country marked a new inflection point in the national voting rights debate and shaped Texas politics forever. In this Episode, Rep. Ordaz Perez shares the needs of the Borderplex community in El Paso, the changes in legislation that drove her to work with fellow Representatives to break quorum, the development of the "black and brown" movement led by women, the reception in Washington D.C., and the importance of data-informed discussion on critical legislation to protect the democratic process in the United States. https://house.texas.gov/members/member-page/?district=76 www.seerdata.ai

    1h 2m
  5. Data is Power

    16/03/2022

    Data is Power

    Stefaan Verhulst is Co-Founder and Chief of Research and Development at The GovLab, New York University. Stefaan co-founded The GovLab with the goal of strengthening the ability of institutions and people to work more openly, collaboratively, effectively, and legitimately to make better decisions and solve public problems. Stefaan says the COVID-19 pandemic has been a watershed moment in which we've realised that we don't have access to much of the data we need, and that we need to unlock data assets that could be used to save lives. Stefaan advocates for more institutions to "publish [data] with purpose" by identifying a public interest benefit for which the data is required. Stefaan describes advances in the disciplines of purpose specification, problem specification, and question definition, which require a skillset that many policy professionals assume they have but often don't. Stefaan emphasises the importance of inclusivity in question formulation, and admonishes us to pursue not just data equity but also question equity, so that the questions for which answers are sought and metrics developed are those that really matter to society. Stefaan observes that power dynamics are determined by asymmetries, such as that between the data "haves" and the data "have nots". Stefaan quotes Sir Francis Bacon, who said "knowledge is power", asserting that in the 21st century "data is power". Stefaan describes a variety of data asymmetries: between consumers and corporations, between citizens and government, and between business and government. Stefaan argues that addressing these asymmetries is essential for achieving "digital self-determination" for individuals and groups. Stefaan acknowledges some tension between the ideal of data sharing and reuse for public benefit and that of digital self-determination, principles that interface at the concept of privacy. Stefaan says this balance will not be easy to find, but argues that with data we need to go beyond consent and aim to avoid not just misuses but also missed uses. Stefaan believes legislation will be inadequate for arbitrating all specific circumstances, and that Data Stewards as a profession will need to be skilled in evaluating the appropriateness of the purpose and the fitness of the data for sharing, and empowered to do so. www.seerdata.ai www.thefoil.ai

    38 min
  6. Virtual Reality Sexual Assault, AI Risks to Women & Bias

    07/03/2022

    Virtual Reality Sexual Assault, AI Risks to Women & Bias

    On International Women’s Day we celebrate by speaking with Dr Catriona Wallace, a mother and a global leader in AI ethics. She sits on numerous boards and educates leaders around the world on mitigating unintended harms from AI. Catriona discusses the recent emergence of metaverses: immersive virtual worlds where users can interact in new and creative ways. Catriona relates a recent incident in which a woman, Nina Jane Patel, was virtually sexually assaulted within Horizon, a metaverse created by Meta, and another incident in which the owner of a virtual residence found that their virtual dwelling was being squatted in, with no clear recourse to justice. We discuss the risks of bias in AI algorithms and how women have historically been under-valued by AI systems tasked with recommending job candidates for Amazon or estimating customer creditworthiness for Goldman Sachs and Apple. Catriona argues that this bias stems from inadequate representation of women in the data used to train the AI systems, and from under-representation of women in the field of Data Science. Catriona observes that 85 million jobs will be replaced by AI systems, and that 90% of these jobs are held by women and minorities. Catriona argues that responsibility for AI-enhanced real-world decisions should remain with business owners, not the technical teams who develop the AI systems. Catriona relates her experience as Executive Director at the Gradient Institute, training boards and executives who have very little understanding of AI. Catriona describes how it is predominantly young men who create datasets, for example by manually labelling images, and how this is one way bias is introduced into AI systems. Catriona talks about the Gradient Institute’s work training Data Scientists to code ethically and teaching them about tools available for assessing whether their work is having unintended consequences. Catriona advocates for regular assessments of AI systems by external assessors to give Data Scientists feedback on how they can be more responsible. Catriona shares the recent release of Australia’s first Responsible AI Index by Fifth Quadrant, Ethical AI Advisory, and the Gradient Institute. The research found that only 8% of organisations had any type of Responsible AI maturity. Organisations can measure their own Responsible AI maturity using the Responsible AI Self-Assessment Tool (fifthquadrant.com.au). Catriona observes that many of the entry-level, administrative, and customer service jobs that will be automated by AI systems in the coming years are typically held by women and minorities, and that Australia needs another 160,000 Data Scientists to keep pace with global industry. www.seerdata.ai www.thefoil.ai

    41 min
5 out of 5, 8 Ratings
