
When algorithms go rogue the havoc is all too human

From recommending a movie on Netflix to processing a job application, algorithms are increasingly part of the decision-making processes of our everyday lives.

But alongside this has come a growing awareness of the negative effects of algorithmic decision-making, with individuals denied access to social security or health insurance, or even spending longer in jail, all without any human intervention.

We have coined the term 'algorithmic pollution' to describe this phenomenon of unjustified, unfair, discriminatory or other harmful consequences of autonomous algorithmic decision-making. 

Managers and politicians typically get excited about new technologies, especially artificial intelligence (AI). They believe that we are developing powerful algorithms that are non-biased, efficient and better decision-makers than humans, which is a myth. Algorithms learn from human beings, and they learn their biases. 

Lack of understanding

Hiring algorithms, for example, draw on historical data to identify characteristics that predict job performance. As they learn from past decisions, algorithms reinforce historical biases, despite the best intentions of designers. 

And algorithms often rely on unchecked and potentially inaccurate data sets, which increases the risk of wrong and unfair decisions. 
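The bias-reinforcement mechanism described above can be made concrete with a toy sketch. All the data here is invented and the 'model' is deliberately naive: a screening rule fitted to past hiring decisions simply reproduces whatever disparity those decisions contained.

```python
# Toy illustration of bias reinforcement in a hiring algorithm.
# The historical records are entirely made up; the point is the
# mechanism: a rule learnt from past decisions inherits their bias.

# (group, years_experience, was_hired) - in this invented history,
# group B candidates were hired at a lower rate than group A.
history = [
    ("A", 2, True), ("A", 1, False), ("A", 3, True), ("A", 2, True),
    ("B", 2, False), ("B", 3, True), ("B", 2, False), ("B", 3, False),
]

def hire_rate(records, group):
    """Fraction of past candidates in `group` who were hired."""
    outcomes = [hired for g, _, hired in records if g == group]
    return sum(outcomes) / len(outcomes)

# A naive 'model' that scores a new candidate by how often similar
# past candidates (same group) were hired - i.e. it learns the bias.
def score(candidate_group):
    return hire_rate(history, candidate_group)

print(score("A"))  # 0.75 - group A favoured, exactly as in the past
print(score("B"))  # 0.25 - group B penalised, bias reproduced
```

Real hiring models use far richer features, but the same dynamic applies whenever group membership correlates with the features the model does see.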

Consider the case of a highly trained and experienced job candidate who was not short-listed. An investigation found that the algorithm rejected the candidate based on data from pharmacies where the candidate was once prescribed anti-depressants. 

Perhaps most bizarre of all is predictive policing. Almost 400,000 Chicago residents now have an official police ‘risk score’ calculated by an algorithm. While the scores remain secret and publicly unaccountable, they are used for decision-making and shape policing strategy.

There’s a worrying lack of understanding of the inherent limitations and dangers of algorithmic decision-making among politicians, managers and other professionals. While algorithms can be very useful in complex calculations and support decision-making, they cannot replace human judgment.

Unfair impacts

Naz Guler, a director at PwC who works in the area of transformation delivery, says:

“Increasingly, both in business and in government, I’m seeing that decisions are being entrusted to technology, and there’s a growing belief that ‘tech is always right’. I see increasing investment into AI systems and a growing reliance on AI models, without having a clear understanding of their capabilities, knowledge or training processes. 

We need to slow down and think clearly about what we’re doing, and the potential unfair impacts we will have. I would like to see businesses doing more to establish the foundations of trust in their algorithms and models.”

Guler sees the drive towards algorithmic solutions as the result of the convergence of huge advances in technology (such as data storage and computing power) with the exponential growth in the volume of available data.

AI has enormous potential to improve public policy and services, as Guler believes data analytics can enable governments to develop policies to create a more equitable society, with better personalisation and customisation of services. She says:

“There’s also been a realisation that data itself is really valuable. When it’s augmenting human efficiencies, it contributes in a positive way. The danger comes when we rely on data-driven decisions with no human intervention or consideration.

In my view, if there’s a bias within the data, you will get inadvertent decisions. A good example here is facial recognition systems, where there are a lot of examples of racial bias. There’s a lack of transparency about what data we’re using and how that data is relevant in any given decision.”

Perpetuating prejudice

Perhaps the most unsettling area where algorithms are increasingly taking over from human decision-making is in law enforcement. Predictive policing systems such as Predpol, which use past data on crimes in order to focus policing resources into certain geographical areas, are used widely in the US.

“In a resource-limited world, the idea of reducing crime by spending less money is very attractive,” says Lyria Bennett Moses, a Professor and Director of the Allens Hub for Technology, Law and Innovation at UNSW Law. “And the words that get used to describe predictive policing – objective, scientific, data driven – all have a positive spin.”

But while Bennett Moses admits that these systems can be good at predicting location-based crime such as burglary, a predictive policing system inevitably focuses on where crime is reported, rather than where it happens. 

These tools are less useful at predicting crimes with little location-based correlation, such as domestic violence (which is generally under-reported), or crimes policed in a racially based way, such as those connected with the use of offensive language. She says:

“Offensive language occurs a lot at, say, sporting events or in pubs. But where it actually gets reported is in areas like Sydney’s Redfern, with its large Indigenous population. If you police a community a lot, then you notice more of the crime that happens there. And that’s what goes into the database.”
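The feedback loop Bennett Moses describes can be sketched in a few lines. This is a hypothetical simulation, not any real system's logic: two areas have identical underlying offending, but the area that starts with more patrols records more crime, and patrols then follow the records.

```python
# Hypothetical sketch of a predictive-policing feedback loop.
# Both areas have the SAME true offence rate; the only difference
# is that one area starts with more patrols, so more offences are
# noticed, recorded, and fed back into the next week's allocation.

TRUE_OFFENCES_PER_WEEK = 10  # identical in both areas

def simulate(initial_patrols, weeks=5):
    """Recorded crime when patrols are reallocated toward past records."""
    patrols = dict(initial_patrols)
    recorded = {area: 0.0 for area in patrols}
    for _ in range(weeks):
        for area, p in patrols.items():
            # more patrols -> a larger share of offences gets recorded
            detection_rate = min(1.0, 0.1 * p)
            recorded[area] += TRUE_OFFENCES_PER_WEEK * detection_rate
        # next week: the 10 patrol units follow recorded crime
        total = sum(recorded.values())
        patrols = {a: 10 * recorded[a] / total for a in recorded}
    return recorded

result = simulate({"heavily_policed": 8, "lightly_policed": 2})
print(result)  # the heavily policed area 'shows' several times more
               # crime, despite identical underlying offending
```

Note that the initial disparity never corrects itself: the data-driven allocation simply locks in where police looked first, which is the "perpetuator of prejudice" effect described below.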

Other algorithmic systems, such as COMPAS, now produce risk assessment scores that are used in decisions on criminal sentencing and parole. ProPublica has found that this tool has a higher false positive rate (falsely flagging danger) for African Americans, but the problem goes beyond racial bias.
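The 'false positive rate' at issue can be made concrete with invented numbers (these are not ProPublica's figures): it is the share of people who did not reoffend but were still flagged high-risk, computed separately for each group.

```python
# False positive rate by group, using made-up counts (not
# ProPublica's data). A false positive here is someone flagged
# high-risk who did NOT go on to reoffend.

# Counts of non-reoffenders per group, split by whether the
# risk tool flagged them as dangerous.
outcomes = {
    "group_1": {"flagged_no_reoffend": 45, "not_flagged_no_reoffend": 55},
    "group_2": {"flagged_no_reoffend": 23, "not_flagged_no_reoffend": 77},
}

def false_positive_rate(counts):
    """Among people who did not reoffend, the share wrongly flagged."""
    fp = counts["flagged_no_reoffend"]
    tn = counts["not_flagged_no_reoffend"]
    return fp / (fp + tn)

for group, counts in outcomes.items():
    print(group, false_positive_rate(counts))
# A tool can look accurate overall while its errors fall far more
# heavily on one group - which is the disparity reported for COMPAS.
```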

A typical question that such a system asks is: are your parents still married, and if not, how old were you when they divorced? If an offender provides the ‘wrong’ answer, they could be refused parole, because they fall into a group that’s more likely to reoffend if released early. Bennett Moses says:

“The basic idea here is wrong. There are some kinds of decisions where these factors should be ignored, even if they are statistically relevant.

“We’re relying on systems that affect people’s lives, so we need to take a step back. It’s not just the machine – racial skewing is not the machine. If the past data suggests that black people commit crimes in certain places (and disproportionately so due to targeted policing) then that’s where the machine will look. It becomes a perpetuator of prejudice.”

Human responsibility

So, if the problem is clear, what then of the solution? Can businesses and governments police this themselves?

We need more regulation, and it needs to be well-evidenced and based on research. The new General Data Protection Regulation (GDPR) in the EU is a good model to follow. Google and Facebook are already worried about these laws, so the EU is on the right track. Guler says:

“‘Responsible AI’ is a bit of a buzzword at the moment: AI that’s designed to draw in human values. But the problem with this is – which values? Current business leaders in this sector are all North American and, for the most part, white men.”

Guler sees a good parallel here with recent discussions over bio-ethics. 

“There’s an idea that we should be lining up AI in terms of human rights. We need to think about fairness, transparency, and integrity in decision making.”

For any decisions made by algorithms, there has to be a human responsibility identified. If algorithms are our future, then understanding, fighting against and preventing algorithmic pollution may save our collective dignity and humanity.


Dubravka Cecez-Kecmanovic is a Professor in the School of Information Systems and Technology Management at UNSW Business School. Her co-authors are Richard Vidgen, also a Professor at UNSW Business School, and Olivera Marjanovic, a Professor at UTS. This article was originally published on Business Think, an alliance partner of Firstlinks.
