‘Alarming’: Postcodes being used to determine credit scores

Your neighbour's dodgy bill-paying habits could be dragging down your credit score. Photo: TND/Getty

Your current address could prevent you from buying your dream home after a major credit reporting agency started including postcode data in its application assessments.

Equifax produces credit scores, which are used by banks to help determine a loan applicant’s creditworthiness.

Applicants with high credit scores find it easier to access loans with lower interest rates, while people with low credit scores are slugged with higher interest rates or rejected outright.

Experity director Clint Howen said using someone’s postcode to determine whether they could afford a home in another area was a “horrible idea”, as it meant the behaviour of the applicant’s neighbours could affect their ability to get a loan.

Mr Howen said banks take the postcode of an applicant’s prospective home into account when determining the riskiness of a loan, so they can price in how future changes in the local property market, such as the simultaneous construction of many similar units, could affect the value of the home being bought.

But Mr Howen said he had never heard of an applicant’s current address affecting their credit score.

He said including an applicant’s postcode data in their credit score evaluation was a way for the credit reporting agency to assess their risk as borrowers based on the history of the people they live near.

This means that even if an applicant has a steady income and strong savings habits, their credit score could be tarnished if they live in a low socio-economic area alongside people who struggle to pay their bills on time.

“So they’re basically saying, ‘Look, if someone comes from this area, they’re more at risk of defaulting on their credit based on all the other people that live in this area that have defaulted on their credit’,” Mr Howen said.

“But it should be [based] on the individual and their circumstances.”

Local area ‘irrelevant’

Consumer Action Law Centre CEO Gerard Brody said information that speaks to your ability to make repayments should be available to lenders to assess loan applications – but your current address is irrelevant.

“It’s not a new thing for banks and finance companies to access personal and transaction data, and use complex systems, to predict an individual’s behaviour,” Mr Brody said.

“Location data or postcode does not tell a lender how likely the customer is to find loan repayments affordable.”

In a statement to The New Daily, an Equifax spokesperson said the postcode data is typically only used in the absence of more traditional credit reporting information and normally “has no impact or a positive impact on an Equifax Credit Score”.

They said the “geodemographic data … can form a small component of Equifax credit scores in circumstances where there is very limited credit information available”.

“This is relevant for people with thin credit files where there is an absence of credit reporting information, such as inquiries for credit or repayment history information,” the spokesperson said.

“This occurs where a credit file is created for the first time or if an individual’s credit file has been inactive for more than five years.”

The two biggest contributing factors to someone’s credit score are their repayment history information and their current and past applications for credit, the spokesperson added.

In only 1.4 per cent of cases does the “geodemographic data” appear as one of the top four factors influencing the score.

When tech backfires

The inclusion of postcode data in Equifax’s credit score evaluations relies on automated decision-making, using algorithms and artificial intelligence programs.

Automated decision-making is commonly used in the US to determine people’s access to housing, employment, education and health care.

Choice senior campaigns and policy adviser Amy Pereira said she was “alarmed” that it was now being used in Australia and putting people in arbitrary boxes.

“At the moment there are so many barriers to getting a home, particularly for first-home buyers,” Ms Pereira said.

“We know that postcode data is often linked to other characteristics about a person, like their income, their wealth, their education level – the list goes on.

“So it’s alarming that people are basically being put in these boxes by Equifax, who is using these automated decision-making tools to dictate where someone could live in the future.”

The main problem with automated decision-making is the lack of transparency.

When automated decision making is used, both lenders and applicants are left in the dark about how exactly a credit score has been determined, Ms Pereira said.

“It’s a huge issue where we really need more transparency and accountability from companies which use automated decision making, because it really can impact people’s lives, and they have no control or visibility over those decisions,” she said.

Ms Pereira said Australia needs stronger regulations on automated decision making to make sure companies are using it ethically and consumers can challenge the decisions based on it.

Relying too heavily on algorithms also risks “dehumanising” decision-making, Federal Court Justice Melissa Perry and associate Sonya Campbell said in a speech on artificial intelligence and automated decision-making in October.

“Human variables such as empathy, compassion, competing values and the availability of mercy cannot be replicated by machines,” Ms Perry and Ms Campbell said.

Ms Perry and Ms Campbell used Centrelink’s failed robodebt program as an example.

The automated program asked people to pay back debt they didn’t actually owe, leading to a $1.2 billion class-action lawsuit.

The program was ruled unlawful in 2019 after the Federal Court found Centrelink could not have been satisfied the debts calculated automatically were correct.