Why we need a culture of accountability around algorithms
In the Star Trek Voyager episode 'Critical Care', the Doctor – well, actually, the mobile emitter that produces him as a hologram – is stolen and sold to an alien hospital ship. There, he discovers that the complex computer algorithm that determines treatment, the Allocator, dishes out lifesaving care according to each patient's Treatment Coefficient – which measures not need, but an individual's value to society. To lower-value patients, the computer just says no, without explanation. They die.
Mandy Henk was home sick herself recently when she watched the episode. And she swiftly recognised that this was a dystopian sci-fi story about a real thing used in the government sector here on Earth, in New Zealand: an operational algorithm.
We do not dispense medical treatment on the basis of individuals' deemed social value. But the government does use algorithms to make all kinds of other decisions, from provisioning school bus routes and predicting which young school-leavers are in danger of falling through the cracks to triaging visa applications.
Henk, the CEO of Tohatoha, the organisation formerly known as Creative Commons NZ, is one of a number of people looking closely at the draft algorithm charter published recently by Statistics NZ. It's the government's most concrete commitment yet to transparent and accountable use of algorithms, AI and other sophisticated data techniques. It's timely.
"I think it's probably past time," says Henk. "Given the amount of algorithms currently used throughout government, we're probably overdue for a commitment on the part of government to use them in ways that ensure equity and fairness."
"We have passed the point where we need to have this conversation," agrees data scientist Harkanwal Singh. "It's urgently needed. We need a robust conversation and real action."
Both Henk and Singh welcome the draft charter as a useful statement of principles – and both believe it needs to be clarified and strengthened. For instance, it commits public entities to "upon request, offer technical information about algorithms and the data they use" – which implies there needs to be someone doing the requesting. But who, and how?
"That is not clear at the outset," says Singh. "It would be better if the language made it clear. Also, why 'upon request'? Being open by default is much better and creates a culture of accountability. We do not want a repeat of the OIA experience."