Accessing personal or business finance could soon become harder for many Australian women as artificial intelligence is increasingly integrated into financial services, bringing with it a heightened risk of bias and discrimination.
Experts are sounding the alarm that the banking and finance sectors’ embrace of new automated decision-making technology could leave women vulnerable to discrimination and more likely to be denied vital loans, as AI amplifies already skewed criteria.
“The status quo is bad and AI is going to take it to the next level. Women are already starting on the back foot, but AI will pour fuel on the fire,” Amanda Rose, CEO of Entrepreneurial & Small Business Women Australia, says.
Leonora Risse, an economist focused on gender equality, says that, despite good intentions, AI models can ultimately become heavily skewed.
“These systems aren’t designed to discriminate on the basis of gender, they’re intended to be gender neutral. But once you put them into practice, they end up being gender biased because of the very different experiences and circumstances that, on the whole, on average, men and women tend to be in.”
As banks and other financial institutions ramp up their use of AI and automated decision making, those in the field are calling for regulatory safeguards and guardrails to make sure whole segments of society aren’t left behind.
Women have always faced biases and discrimination when it comes to accessing finance.
Kirsten Abernethy is the executive director of the Victorian Women’s Trust, an organisation that was instrumental in changing the Australian banking sector’s practice of requiring a male guarantor for women’s business loans in the 1980s.
These days, she says, the ways lenders assess women with small businesses and female entrepreneurs come with more implicit biases than overt ones.
“The financial systems are really based around male norms of full-time, uninterrupted work, linear income growth, and the accumulation of assets,” she says.
In contrast, women often have career breaks and might have periods of time where their work is part-time or flexible.
“The models used around risk, credit and investment systematically undervalue women, not because women are riskier propositions, but because of what the system assumes is normal,” Abernethy adds.
AI accelerating the trend
Michael Mehmet, a professor from the School of Business at the University of Wollongong, says banks and other financial institutions are quickly getting on the AI bandwagon, due to advantages in speed and reduced labour costs.
“Every day more and more banks are jumping onto it just purely based on its speed and efficiency and ability to process huge amounts of data without too many human resources,” he says.
“Particularly when it comes to understanding patterns and trends and then feeding that sort of information into lending approvals and helping with strategic decision making.”
Ramona Vijeyarasa, a professor at the University of Technology Sydney (UTS) who has researched AI and gender, says the risk to women lies within the datasets themselves.
“It’s not that the problem is necessarily the tech, but when the data itself reflects existing biases that we know are prevalent in society, the AI can amplify that,” she says.
In short, AI won’t just replicate pre-existing biases; it will accelerate them as a result of its speed and efficiency.
And it’s not just a problem in Australia. Norrin Halilem, professor in innovation and knowledge management at Canada’s Université Laval, says the concerns are worldwide.
“This is especially relevant because credit decisions sit at the intersection of financial risk and social life,” he says.
“Things like housing, mobility, entrepreneurship, education, and family stability can all hinge on access to credit.”
Jeannie Paterson, director of the University of Melbourne’s Centre for Artificial Intelligence and Digital Ethics, was previously on the federal government’s AI Expert Group. She says banks are being very opaque about how they are using the technology.
“It’s a real issue. It’s really unclear how they make the assessment. If they are making it a partially automated process, there is little that is clear about why people are being denied credit; they won’t know. In law they have little right to complain unless they can show discrimination, and it’s really hard to prove discrimination unless you have access to the data.”
Paterson says it’s important that any automated decision-making process used is audited for systemic discrimination, but it is largely unclear if this is happening.
“Banks won’t even say if they are using it, let alone if they are auditing them.”
Banks remain (mostly) silent
We contacted Commonwealth Bank, NAB, Westpac and ANZ, asking them what AI systems and automated models they were using to assist in decision making, whether these were designed in-house or by third parties, and what auditing and safeguards were being put in place to minimise gender biases.
Only the Commonwealth Bank of Australia responded on the record.
“All credit decisions continue to follow our established practices, risk frameworks, and responsible lending policies,” a CBA spokesperson says.
“Where appropriate, we use industry and internal data and automated processes to make it easier to assess our customers’ situations. We have strong controls and governance to manage model alignment.”
They added that models undergo independent assessments to support “fairness across customer groups” and were subject to ongoing monitoring and reviews.
Paterson says too much of the discussion surrounding AI focuses on so-called “generative AI”, while other applications and their impacts on society are often ignored.
“All our attention is on ChatGPT and we have been distracted from automated decision making. If a bank is using a credit tool to do automated credit assessments, but it’s not generative AI, it’s not on the agenda,” she says.
Professor Vijeyarasa from UTS says she would like to see Australia take a more stringent regulatory approach in this area and follow the example of the European Union, whose AI Act provides some protections from AI-based discrimination for its citizens.
“It’s fair to say that the regulation of this space hasn’t really kept up. At the moment it is up to the private sector to decide how much information they want to give to us and whether the risk of automated discrimination is too high,” she says.
Jarni Blakkarly is an award-winning investigative journalist at CHOICE. Jarni has worked for news organisations such as SBS, Reuters, Al Jazeera English, ABC 730, Radio National, BBC World Service and Deutsche Welle.
Jarni won the Walkley Foundation's young journalist of the year student category award in 2016 and was the recipient of a Melbourne Press Club Michael Gordon fellowship in 2022. In 2023 he was a highly commended finalist in the Quill Awards and a winner at the 2024 Excellence in Civil Liberties journalism awards. In 2024 he was elected to serve on the Federal Council (National Media Section) of the MEAA. Jarni has a Bachelor of Communications (Journalism) from the Royal Melbourne Institute of Technology (RMIT).