Thanks to free credit scores and widely available credit education tools, we know a lot these days about what goes into a traditional credit score, such as a FICO score or VantageScore.
The savviest borrowers know, for example, that limiting debt and maintaining a low credit utilization ratio is a key component of a good credit score. So is a lengthy and robust credit history.
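That utilization ratio is simple arithmetic: your total balances divided by your total credit limits. As a hypothetical illustration (the function and figures below are made up for the example, not drawn from any scoring model):

```python
def utilization(balances, limits):
    """Overall credit utilization as a percentage:
    total balances / total credit limits * 100."""
    return 100 * sum(balances) / sum(limits)

# A borrower carrying $500 and $1,000 balances on cards with
# $4,000 and $6,000 limits is using 15% of available credit:
print(round(utilization([500, 1000], [4000, 6000]), 1))  # 15.0
```

Experts commonly suggest keeping that overall figure low – often under 30 percent – though the exact thresholds each scoring model rewards aren't published.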
But despite increasing transparency in the credit scoring industry, we still don’t know much about what lenders are actually looking for – and, specifically, what they’re looking at – when they evaluate our credit card applications and set key terms, such as our APRs and credit limits.
As a result, two borrowers with near-identical traditional metrics – similarly high credit scores, the same perfect payment history, comparable incomes and housing expenses – could still be assigned vastly different credit card terms.
(Although traditional credit scores don’t take into account income or housing in their calculations, many credit card applications require you to disclose this information on your application.)
See related: My credit score’s 750! Why was I denied a card?
Apple Card’s alleged gender bias highlights lack of visibility
In fact, some borrowers with impressive credit scores may even be awarded worse terms – such as a lower credit limit or a higher APR – than another borrower with a lower score.
That’s what happened to the programmer David Heinemeier Hansson and his wife when they applied for brand-new Apple Cards. In a lengthy tweet thread that went viral in November 2019, Hansson claimed that his wife was given a substantially lower credit limit than he was, even though her credit score was higher and they shared the same assets and household income.
When Hansson’s wife followed up with Apple, customer service agents kept pointing to the company’s scoring algorithm. However, they were unable to share more details about what went into the algorithm or why it appeared to be giving her a lower limit than her husband’s.
When lenders or credit scoring companies create an algorithm to evaluate potential borrowers, they use a variety of data points, such as your payment history or the average age of your accounts. But not all scoring algorithms use the exact same data.
Some lenders, for example, will look at other, nontraditional data points if they’re available, such as public records information or cellphone payments. In addition, lenders and credit scoring companies use their own, proprietary techniques to crunch the data and generate a score.
Hansson alleged that Apple’s technique for scoring potential customers had somehow led to gender bias. A computing expert himself, he pointed out that Apple and issuing bank Goldman Sachs’ credit decisions were being made – at least in part – by a “black box algorithm that 6 different [representatives] across Apple and [Goldman Sachs] have no visibility into.”
The ‘black box’ problem
Algorithms that have been developed with machine learning methods – a popular computer science technique that can train a computer to independently evaluate data and make decisions – are often criticized for being “black boxes,” making it difficult to interpret the reasoning behind their conclusions.
According to my husband, who works in data science, the techniques for building this kind of algorithm are relatively easy to understand. But explaining to someone why it resulted in a particular score is difficult.
Creators may know what data goes into a computer model (such as a credit scoring model), but they don’t necessarily fully understand the algorithm’s complex decision-making process and why it (meaning the computer) thinks the score it generates will lead to a better outcome for the lender.
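A toy sketch can show why. The "model" below is invented for illustration – it has nothing to do with Apple's or anyone's real algorithm – but it mimics how ensemble models combine many small, interacting rules. Every rule is fully visible, yet no single one explains why two similar applicants end up with different scores:

```python
# Three arbitrary "mini-rules" vote on a score from two inputs:
# utilization percentage and average account age in years.
# All numbers are made up for illustration.

def rule_a(util, age):
    return 40 if util < 30 else 10

def rule_b(util, age):
    return 35 if age > 5 and util < 50 else 15

def rule_c(util, age):
    return 25 if util < 10 or age > 10 else 5

def score(util, age):
    # The final score is the sum of all the rules' votes.
    return rule_a(util, age) + rule_b(util, age) + rule_c(util, age)

# Two applicants, 28% vs. 31% utilization, same account age --
# a small input change crosses a hidden threshold in rule_a:
print(score(28, 6), score(31, 6))  # 80 50
```

Real machine-learned models combine hundreds or thousands of such interacting rules, which is why even their creators struggle to point to one human-readable reason for any individual score.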
Hansson’s complaint – and the public conversation it sparked – prompted the state of New York’s Department of Financial Services to open an investigation into Apple and Goldman Sachs’ scoring algorithm and suss out whether it’s violating anti-discrimination laws.
In a Medium post announcing the investigation, Superintendent Linda A. Lacewell noted that there’s a “black box problem” in consumer credit decisions that can make it tough to determine why some people are being awarded vastly different terms.
“Consumers have little visibility into why a decision is made, or why they have been rejected,” Lacewell wrote.
Alternative data used for more than just thin-credit-file applicants
When a consumer is rejected for credit, lenders are required to send adverse action notices, along with any credit scores that were used to make the decision and brief explanations of what negatively affected the scores.
However, lenders don’t have to send adverse action notices if you’ve simply been awarded the maximum APR listed on a credit card’s terms and conditions page or if you were given a lower credit limit than another customer.
Lenders don’t have to disclose all the information that they are using to evaluate you either. As a result, you could be surprised by the information you’re being judged on.
Take, for example, the case of Joseph, an entertainment executive from California with an 820 FICO score who was rejected for an Alaska Airlines credit card. Joseph was profiled in the Los Angeles Times in 2016 after he received an adverse action notice flagging a credit score he had never heard of (in this case, a Credit Optics score from ID Analytics).
According to ID Analytics, Credit Optics scores cull data from a variety of alternative sources, such as publicly available checking account information and everyday bill payments.
Credit scores that use alternative data are publicly marketed as a solution for evaluating potential borrowers who don’t have enough credit history to generate a traditional credit score. However, it appears that at least some lenders are using alternative data to evaluate many different types of applicants – not just those with thin credit histories.
According to a recent survey by Experian, 65 percent of lenders said they supplement traditional credit data with alternative data when evaluating potential borrowers. Meanwhile, many lenders told Experian that they plan to expand the types of data they evaluate.
What this means for you
It’s still crucially important to tend to your traditional credit scores, such as your VantageScore and FICO, and maintain healthy credit habits.
However, be aware that the information in your credit card applications and in your traditional credit reports from Experian, TransUnion and Equifax may not be the only bits of data you’re being evaluated on.
If you have the time and resources, you have the right to request copies of other consumer reports so you can gain more insight into what information is being collected and shared about you.
But fair warning: There are a lot of them and it can be a pain to request copies.
If you’d like to try, the CFPB has put together a recently updated list of the consumer reporting agencies that may be collecting your information and helping generate different consumer scores. It also includes details about how to contact the agencies and request your reports.
It’s not you, it’s the algorithm
Finally, if you do find out that someone with a similar income and credit profile is being offered better credit terms than you are, don’t take it too personally: It’s possible that even your lender doesn’t know exactly why you were given a higher APR or lower credit limit than someone else.