Daily Bulletin

We need human oversight of machine decisions to stop robo-debt drama

  • Written by Dr Anna Huggins, Senior Lecturer in Law, Queensland University of Technology

Federal MP Amanda Rishworth raised concerns over the weekend that Australia could be headed for another robo-debt ordeal after the government reportedly confirmed the Australian Taxation Office (ATO) will use data matching to audit childcare rebates.

Government agencies increasingly use automated tools to make or facilitate decisions that affect citizens’ lives, but it’s not always appropriate for important decisions to be made by a computer.

In the European Union, the General Data Protection Regulation (GDPR) prohibits certain types of decisions from being solely automated. It also creates rights for individuals who are affected by automated processing.

We need similar safeguards in Australia for high-stakes automated decisions made by government agencies.

Read more: Algorithms have already taken over human decision making

The rise of robotic decisions

The trend toward automation of government processes is accelerating in line with the government’s commitment to digital transformation.

Automated tools are now used to make or facilitate decisions in a range of government agencies, including decisions about welfare, tax, health, visas and veterans’ affairs. Centrelink’s employment income confirmation system, known as “robo-debt”, is a high profile example of what can go wrong with automated decision making.

Automation can improve the consistency and efficiency of government processes. But if there is bias or error in the computer program or data set, a flawed decision-making logic will be applied systematically, meaning large numbers of people could be affected.

Guidelines aren’t enforceable

The government has previously published guidelines on automated government decision making, including Best Practice Principles in 2004, and the Better Practice Guide in 2007. Both reports provide important advice about how to design automated systems to align with the values of public law.

But the recommendations in these reports aren’t enforceable. They also fail to create legal protections for those affected by automated decisions.

In May, there was public consultation about an artificial intelligence (AI) ethics framework for Australia. It highlighted the need for updated ethical principles to apply to new AI technologies. It also recommended a range of tools for improving the design of AI systems, including impact and risk assessments.

But, again, these recommendations will not be enforceable, even if they are included in the final framework. The current draft stops short of restricting the use of AI for certain types of decisions.

Read more: We need to know the algorithms the government uses to make important decisions about us

A new legal framework is needed

In contrast to Australia’s non-restrictive approach, legislative controls on data protection and automated decision making included in the GDPR are an example of best practice.

Article 22 of the GDPR is of particular interest for Australia. Unless specified exemptions apply, it prohibits the use of solely automated processing for decisions that produce legal or other significant effects for individuals.

To fall outside this prohibition, a decision-making process must include meaningful human involvement and oversight. Having a human merely "rubber stamp" an automated output is insufficient.

Similar protections are needed in Australia, particularly for government decisions that affect individual rights and interests. Such safeguards would limit the types of government processes that can be fully automated.

‘Robo-debt’ would require meaningful human involvement under the GDPR

Let’s take a closer look at “robo-debt” to see how a prohibition on solely automated decision making might work.

The robo-debt system uses automated data matching and assessment to raise welfare debts against people whom the system flags as having been overpaid. Someone who receives a debt discrepancy notice can respond by providing income evidence to Centrelink. If no evidence is provided, an algorithm generates a fortnightly income figure by averaging annual income data from the ATO.

Of course, many welfare recipients have variable income as they are engaged in casual, part-time or seasonal work. It’s not surprising that the reliance on averaged data has led to a high number of reported errors. Receiving incorrect robo-debt notices has contributed to stress, anxiety and depression for many people.
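The flaw in the averaging logic can be sketched in a few lines of code. The figures and the income test below are hypothetical and simplified, not the actual Centrelink parameters; the point is only to show how spreading a yearly total evenly across fortnights attributes income to periods when none was earned.

```python
# Simplified, illustrative sketch of the averaging problem (hypothetical
# figures; the threshold and taper rate are NOT the real income test).

FORTNIGHTS = 26
INCOME_FREE_AREA = 437  # illustrative fortnightly threshold
TAPER_RATE = 0.50       # illustrative: payment reduced 50c per dollar above it

def entitlement_reduction(fortnightly_income):
    """Payment reduction for one fortnight under a simple income test."""
    return max(0, fortnightly_income - INCOME_FREE_AREA) * TAPER_RATE

# A seasonal worker: earns $2,600 per fortnight for 6 fortnights while off
# welfare, then receives payments for 20 fortnights while earning nothing.
annual_total = 2600 * 6                 # $15,600 reported to the ATO
averaged = annual_total / FORTNIGHTS    # $600 attributed to EVERY fortnight

on_payment = [0] * 20  # income actually earned while receiving welfare

debt_actual = sum(entitlement_reduction(x) for x in on_payment)
debt_averaged = entitlement_reduction(averaged) * len(on_payment)

print(f"Debt using actual fortnightly income: ${debt_actual:,.2f}")
print(f"Debt using averaged income:           ${debt_averaged:,.2f}")
```

Using the real fortnightly figures, no overpayment exists; using the averaged figure, the same person appears to owe a substantial debt, because income earned while off welfare is smeared across the fortnights they were on it.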

One former member of Australia’s government review tribunal has described the system as a form of “extortion”.

If Australia had GDPR-type protections, meaningful human involvement would be required before an automated debt notice was sent. Manual review by human decision makers is important to ensure that a welfare debt is in fact owed.

There should also be restrictions on fully automating other high-stakes decisions by government agencies. Decisions about visas and tax debts, for example, ought to be overseen by humans.

Read more: The new digital divide is between people who opt out of algorithms and people who don't

The private sector needs regulating too

Automated decisions made by private bodies that have significant impacts on individuals require legal safeguards too. Such protections are already included under the GDPR.

Similarly, in the United States, a bill for an Algorithmic Accountability Act has been proposed. If passed, the bill would require certain companies that use “high-risk automated decision systems” to conduct algorithmic impact assessments.

Australia’s non-binding guidance on automated decision making is a step in the right direction, but it needs to be bolstered by legislation that restricts the types of decisions that can be fully automated. This is particularly important for government decisions with serious consequences for individuals, like robo-debt and auditing of childcare rebates.


Read more http://theconversation.com/we-need-human-oversight-of-machine-decisions-to-stop-robo-debt-drama-118691
