The Perils of Automated Discrimination in the Workplace

Category Business

tldr #

Amazon tried using an AI algorithm to expedite its recruitment process, but the algorithm soon replicated existing biases, despite attempts to make it gender-neutral. Australia's privacy and data protection laws lag behind those of the UK and the EU, and most cases of AI-based discrimination will not be detected. Discrimination law in Australia relies mostly on individuals making a complaint, and few people do so.


content #

Amazon thought it had found an efficient way to find the best workers. Recruitment is time-consuming and expensive, so why not outsource it to artificial intelligence (AI)? Its team built an AI-based algorithm—a series of instructions telling a computer how to analyze data—that would give each candidate a score from one to five stars. Recruiters could then simply choose the candidates with five stars.

But there was a problem. It turned out that women didn't score well for software and tech jobs. What was going on?

Well, the algorithm was trained on CVs submitted to Amazon over the previous 10 years, and most came from men. The algorithm had "learned" that men were to be preferred. It awarded more stars for masculine language in a CV and took off stars for anyone who went to a women's college.

The algorithm had been taught to discriminate, copying human bias.

Other studies have found that AI can pick up gender signals in a CV, even when a name and pronouns are removed. And, even if AI is trained to be gender-neutral, it might still discriminate against parents or other vulnerable employee groups, like those who are racially or culturally diverse or LGBTQI+.

But most cases of AI-based discrimination won't be reported. Or maybe even noticed. And that is a big problem.

In a detailed analysis of Australian workplace laws, published in the Melbourne University Law Review, I found there is little known about how Australian employers are using AI.

There are many software tools that use AI to streamline human resource functions—from recruitment to performance management and even to dismissal. But how these are being used is often only revealed when things go really wrong.

For example, the Australian Public Service tried using AI-assisted technology to manage promotions. Many of these promotions were later overturned for not being based on merit, but this was only revealed because the Public Service has a dedicated Merit Protection Commissioner.

What happens in the private sector, where most people work?

Europe has strong privacy and data protection laws—the General Data Protection Regulation (GDPR)—that demand a human decision-maker have the final say in any automated process that significantly affects people's lives. In the EU, gig workers have used this to challenge Uber and Ola when they were automatically terminated as drivers.

But Australia has no equivalent.

Australian privacy law significantly lags behind that of the UK and the European Union. Incredibly, it contains a blanket exemption for "employee records"—while your employer needs your consent to collect your data in the first place, there are no limits placed on that data once it is held.

And the federal Privacy Act 1988 (Cth) does not apply to small businesses, which employ most Australian workers.

Discrimination law might fill this gap if it can be shown that an AI algorithm discriminates against certain people or groups. But if we don't know that an algorithm is being used, how do we challenge it?

Discrimination law mostly relies on individuals making a complaint—and few people do, even when they know they have been discriminated against. With automated discrimination, those affected might not even know it occurred.

hashtags #
worddensity #
