Threat of “postcode discrimination” as credit scores are skewed by where you live

Where you live could determine whether you can get a mortgage, with Australia’s biggest credit bureau now applying postcode data when assessing applications.

Equifax holds information on nearly 20 million Australians and the ‘credit scores’ it produces are used by banks to determine whether they will provide a home loan.

Confirmation of Equifax’s use of postcode data presents another barrier to social mobility, because it makes it harder for people in poor areas to access credit and improve their lives.

“It’s a bit like the rich getting richer and the poor getting poorer,” said Victoria Coster, founder of Credit Fix Solutions, a company that helps people get finance.

Victoria Coster, founder of Credit Fix Solutions, warns that residents of low-income postcodes may find it harder and more expensive to get loans. (ABC News: John Gunn)

Now or never

One of the elements now used to assess whether someone is a good credit risk is the repayment behaviour of their neighbours.

“Where you live now shouldn’t affect where you can buy a house in the future,” said Amy Pereira of consumer advocacy organization CHOICE.

Amy Pereira worries that people don’t understand the impact automated decision-making, or ADM, is having on their lives. (ABC News: Daniel Irvine)

She worries that using postcode data to determine whether people should get credit could lead to discrimination.

“We’re concerned that using postcode data to determine your creditworthiness really does present an invisible barrier.”

Equifax has only recently started using “geo-demographic data”, such as how people in a particular area behave with credit, to assess individuals’ credit scores.

In a statement, the company said the information is used “to form a small component of Equifax’s credit ratings in limited circumstances, as it has been found, based on statistical analysis, that it is a relevant factor in determining credit risk”.

Big data is influencing decisions

It’s part of a larger problem: the explosive growth in the use of automated decision-making, or ADM, where artificial intelligence (AI) algorithms and programs are used to make decisions instead of humans.

Tensions in this area occupy the mind of Lyria Bennett Moses, director of the UNSW Allens Hub for Technology, Law and Innovation.

Lyria Bennett Moses says there must be definitions of what fairness is in AI systems before they are deployed.

This is not a theoretical discussion.

By using loyalty cards, shopping online and taking on credit, consumers create vast amounts of linked data about their spending and lifestyle.

This information now goes into credit decisions.

A credit score over 700 is generally considered good, but mortgage brokers told the ABC that algorithmic customer scoring means a single transaction with a buy now, pay later (BNPL) service like Afterpay or Zip can cut 50 to 100 points at once.

“There is the potential to use AI [artificial intelligence] systems fairly according to criteria that we specify as fair,” said Professor Moses.

“We can actually specify what we mean by fair and what criteria the systems have to meet before deployment.”
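Professor Moses’s point — that fairness criteria can be specified and tested before a system is deployed — can be sketched in code. The example below is a toy illustration of one commonly used criterion, demographic parity; the data, group labels and tolerance are all invented for the purpose of the sketch, and real pre-deployment audits use more than one criterion.

```python
# Hypothetical sketch: testing a credit-approval model against one
# fairness criterion ("demographic parity") before deployment.
# All data below is invented for illustration.

def approval_rate(decisions, groups, label):
    """Share of applicants in the given group whose application was approved."""
    in_group = [d for d, g in zip(decisions, groups) if g == label]
    return sum(in_group) / len(in_group)

# 1 = approved, 0 = declined; "A" and "B" stand in for any
# protected attribute (e.g. a postcode-correlated demographic).
decisions = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
groups    = ["A", "A", "A", "B", "B", "A", "B", "B", "A", "B"]

rate_a = approval_rate(decisions, groups, "A")  # 0.8
rate_b = approval_rate(decisions, groups, "B")  # 0.4

# The pre-deployment test: the gap between group approval rates
# must stay within an agreed tolerance, or the system fails review.
TOLERANCE = 0.2
is_fair = abs(rate_a - rate_b) <= TOLERANCE
```

In this invented example the gap (0.4) exceeds the tolerance, so the system would fail the check — which is exactly the kind of criterion Professor Moses argues should be agreed before deployment, not discovered afterwards.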

But using postcode data adds another opaque layer, as the behavior of your neighbors now affects decisions about you.

“One of the arguments against [using postcode data is that] postcodes are often very correlated with things like ethnicity, or whether you were born in Australia or are an immigrant, and so on,” Professor Bennett Moses warned.

American experience

The use of automated decision making in finance has exploded due to its lower cost. But it’s not without problems.

In 2019, the US financial regulator investigated Apple over what was described as its “sexist” credit card, which used an algorithm that appeared to be biased against women.

Tech entrepreneur David Heinemeier Hansson complained that his Apple Card gave him a credit limit 20 times that of his wife. (Ironically, she had the better credit score.)

The issue was confirmed by Apple co-founder Steve Wozniak, who shares all bank accounts and assets with his wife, yet was given a card with a credit limit 10 times hers.


Automated decision making should, in theory, herald the end of discrimination in finance, because it has the potential to remove human biases.

Throughout history, different ethnic and disadvantaged groups have struggled to access funding.

But that assumes machines make better decisions and that’s not necessarily true.

Not always smarter

Computers and algorithms rely on two things, according to Jeannie Paterson of the University of Melbourne’s Centre for Artificial Intelligence and Digital Ethics.

Jeannie Paterson, from the Centre for AI and Digital Ethics at the University of Melbourne, worries about the explosion of ADM linked to financial decisions. (ABC News: Billy Draper)

There are the people who create them and the information that informs them.

“What we need to understand is that automated decision-making does not mean that there is a superhuman machine, a machine with superhuman powers that knows everything about us and can make an accurate prediction about each individual,” said Professor Paterson.

Humans use “rules of thumb” and a matrix of factors to make decisions, like deciding whether to lend people money.

Automated decision making is just a sophisticated statistical process whose quality depends on the information that is fed into it.

Equifax adds data on people’s postcodes because it believes this is a relevant factor in determining credit scores.
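To make the mechanics concrete, here is a toy points-based scorecard. Every factor name, weight and the size of the postcode adjustment below is invented (Equifax’s actual model is proprietary); the sketch only shows how a small geo-demographic adjustment can push two otherwise identical applicants to opposite sides of a lending threshold.

```python
# Toy points-based scorecard illustrating the statistical process the
# article describes. All weights and factor names are hypothetical.

BASE_SCORE = 660

def score(applicant):
    """Return an illustrative credit score for one applicant."""
    s = BASE_SCORE
    s += 80 if applicant["repayments_on_time"] else -120
    s -= 60 * applicant["recent_credit_enquiries"]
    s -= 50 if applicant["used_bnpl_recently"] else 0
    # The contested component: a small adjustment based on how the
    # applicant's postcode has historically performed on repayments.
    s += applicant["postcode_adjustment"]  # assumed range: -30 to +30
    return s

# Two applicants with identical behaviour, different postcodes.
profile = {"repayments_on_time": True,
           "recent_credit_enquiries": 1,
           "used_bnpl_recently": False}

good_area = score({**profile, "postcode_adjustment": 25})   # 705
poor_area = score({**profile, "postcode_adjustment": -25})  # 655
```

Under these invented numbers, only the applicant from the favoured postcode clears the roughly 700-point mark that brokers describe as the threshold for mainstream lenders — despite identical repayment behaviour.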

“There is no transparency for the consumer, but there is also potentially no transparency internally,” Professor Paterson noted.

“If we automate a process that people don’t really understand, then – because we don’t know what’s going on – certain cohorts, certain groups of people might be much worse off.”

Not the only factor

Equifax emphasizes that using postcode data helps make better decisions.

In a statement, the company said it was a factor in people’s credit scores, but far from the main one.

“Information such as an individual’s credit history and behaviours has a much greater impact on determining an individual’s Equifax credit score,” the company said.

These “behaviours” include the types of credit applied for, the types of lenders used, how frequently they apply, the number of open accounts and their limits, whether monthly repayments are made on time, and any previous defaults.

“It is important to note that Equifax credit scores are only part of the information a lender will use to assess a credit application,” the company’s statement continued.

“Each lender can apply their own lending criteria and policies, and in some cases their own scores, which is why some lenders may approve an application while others may not.”

Destiny by domicile

The explanation does not appease Victoria Coster. For customers with marginal or problematic credit scores, a difference of 20 to 30 points due to where they live could be the difference between getting a loan or not.

“And what the brokers who refer to us are telling us is that unless a consumer is sitting at (a credit score) around 700 – which is quite high – they cannot access the better interest rates and more traditional lenders,” she said.

“And when it comes to things like personal loans, for example, you actually don’t have a chance of getting finance down the line unless you have a really good credit score.

“These new AI systems that credit bureaus have put in place are unfair when it comes to judging people by demographics.”
