Why we need a culture of accountability around algorithms


In the sixth episode of Actually Interesting, The Spinoff’s monthly podcast exploring the effect Artificial Intelligence has on our lives, Russell Brown looks at the draft algorithm charter, the government's commitment to transparent and accountable use of AI.


In the Star Trek Voyager episode 'Critical Care', the Doctor – well, actually, the mobile emitter that produces him as a hologram – is stolen and sold to an alien hospital ship. There, he discovers that the complex computer algorithm that determines treatment, the Allocator, dishes out lifesaving care according to each patient's Treatment Coefficient – which measures not need, but an individual's value to society. To lower-value patients, the computer just says no, without explanation. They die.


Mandy Henk was home sick herself recently when she watched the episode. And she swiftly recognised that this was a dystopian sci-fi story about a real thing used in the government sector here on Earth, in New Zealand: an operational algorithm.
We do not dispense medical treatment on the basis of individuals' deemed social value. But we do use algorithms to make a bunch of other decisions: provisioning school bus routes, predicting which young school-leavers are in danger of falling through the cracks, triaging visa applications.


Henk, the CEO of Tohatoha, the organisation formerly known as Creative Commons NZ, is one of a number of people looking closely at the draft algorithm charter recently published by Statistics NZ. It's the government's most concrete commitment yet to transparent and accountable use of algorithms, AI and other sophisticated data techniques. It's timely.


"I think it's probably past time," says Henk. "Given the number of algorithms currently used throughout government, we're probably overdue for a commitment on the part of government to use them in ways that ensure equity and fairness."
"We have passed the point where we need to have this conversation," agrees data scientist Harkanwal Singh. "It's urgently needed. We need a robust conversation and real action."


Both Henk and Singh welcome the draft charter as a useful statement of principles – and both believe it needs to be clarified and strengthened. For instance, it commits public entities to "upon request, offer technical information about algorithms and the data they use" – which implies there needs to be someone doing the requesting. But who, and how?
"That is not clear at the outset," says Singh. "It would be better if the language made it clear. Also, why 'upon request'? Being open by default is much better and creates a culture of accountability. We do not want a repeat of the OIA experience."




"In my first career, I was a librarian and I spent 20 years doing that," adds Henk. "The ability of people to understand their own information needs is actually fairly limited. People have to know what they don't know in order to make good requests for information. I would prefer to see government being more proactive about this, and providing that information to the people who are going to be impacted at the front end."

Part of the issue with transparency and explanation is that there's only a small subset of society that currently understands how algorithms work.

"I don't know that I understand how algorithms work," says Henk. "And I don't think anyone outside of a fairly narrow group of people has a very good understanding of how algorithms work. Which is one reason why transparency isn't particularly helpful. If you need to get a degree in computer science in order to understand it, that's an awful lot to ask of the public."
"It might be difficult to explain intern
