How companies can manage the risks in handling alternative credit data
July 27, 2022 - Is a consumer with a FICO score below 620 actually a greater risk? How would you know? Evidence suggests originators can no longer rely solely on FICO scores to identify subprime borrowers or those with weaker credit prospects.
Alternative data, such as bank account and cash flow information, rental payment history, and professional licensing or education records, combined with machine learning and artificial intelligence, can help gauge credit risk more accurately. But the use of alternative data carries particular legal risks. As credit performance softens across markets, particularly for non-prime auto loans, non-compliance with the law in the origination process could become the basis of claims by investors and other parties if they begin to incur losses.
Below are some key issues to address and practical tips for safely getting the most out of new data and technology.
<b>The credit risk riddle</b>
The subprime borrower fared well financially during the pandemic.
Government stimulus put an average of $5,000 in Americans' pockets. For those who qualified, there was, for a time, additional Federal Pandemic Unemployment Compensation. As a result, the personal savings rate climbed to 33.8% in April 2020.
But it all seems to be unwinding. Unemployment has recently returned to pre-pandemic levels of 3.6%, and the savings rate has plummeted to 4.4%. Inflation is eating into consumers' savings and eroding their confidence.
Rising interest rates are making both financed vehicles and homes, two items that already saw major price increases through the pandemic, even more expensive. Vehicle depreciation rates are expected to return to their normal pace as inventory builds, creating the potential for consumers to have negative equity in vehicles bought at the market peak, and making new purchases more expensive.
Will alternative data make it easier for originators to solve the credit risk riddle? Not so fast. Credit risk may be reduced as consumer profiles come into sharper focus, but legal risks, from Consumer Financial Protection Bureau (CFPB) enforcement to investor claims, must be addressed.
<b>Legal risks: discrimination</b>
Alternative data has been recognized as improving underwriting decisions. Lenders can get a fuller picture of a borrower to price risk more accurately and, in turn, borrowers may obtain a better deal.
In any credit transaction, fair lending is a central concern. The Equal Credit Opportunity Act (ECOA) and its implementing Regulation B protect against discrimination in credit transactions on the basis of race, color, religion, national origin, sex, marital status, age, public assistance status or exercising rights under consumer protection laws.
The definition of "credit transaction" under the statute is broad and includes every aspect of an applicant's dealings with a creditor regarding an application for credit or an existing extension of credit. This has been interpreted to capture fintechs that provide credit scores or credit assessment tools.
Credit transactions that result in disparate treatment of consumers and/or have a disparate impact are prohibited. Disparate treatment has been found where a creditor treats members of a protected class differently than other applicants. But even absent any intent to discriminate, a facially "neutral" policy or practice can be actionable if it disproportionately excludes or burdens certain groups without a justifiable business necessity.
Under the ECOA, creditors must provide "adverse action" notices to explain why a borrower received an unfavorable credit decision (including, for instance, where the borrower was denied credit, had credit revoked, or had the terms of an existing credit arrangement changed). These notification requirements are intended in part to prevent discrimination by forcing creditors to explain their decisions. The notices also give consumers some sense of how they could improve their credit situation by changing future behavior or habits.
The CFPB has taken a keen interest in the role of AI in adverse actions, issuing a circular in 2022 that made clear notice requirements "apply equally to all credit decisions, regardless of the technology used to make them…" The law does not permit creditors to use complex algorithms as a shield against providing detailed, accurate reasons for adverse actions.
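To make that obligation concrete, below is a minimal sketch of one way a lender might derive principal reasons from an interpretable, scorecard-style model by ranking the features that pulled an applicant's score down. The feature names, weights, and reason-code text are hypothetical, and any actual notice must reflect the model and factors the creditor actually used.

```python
# Illustrative sketch: deriving "principal reasons" for an adverse action from
# a simple linear (scorecard-style) model. Feature names, weights, and reason
# text are hypothetical; real notices must reflect the model actually used.

REASON_CODES = {
    "months_since_delinquency": "Time since most recent delinquency is too short",
    "on_time_rent_ratio": "Proportion of on-time rental payments is too low",
    "avg_bank_balance": "Average bank account balance is too low",
    "recent_inquiries": "Too many recent credit inquiries",
}

def principal_reasons(weights, applicant, baseline, top_n=4):
    """Rank features by how far they pulled this applicant's score below a
    baseline (e.g., the approved-population average), most harmful first."""
    contributions = {
        f: weights[f] * (applicant[f] - baseline[f]) for f in weights
    }
    # A negative contribution means the feature lowered the score vs. baseline.
    harmful = sorted(
        (f for f, c in contributions.items() if c < 0),
        key=lambda f: contributions[f],
    )
    return [REASON_CODES[f] for f in harmful[:top_n]]

# Example: an applicant hurt most by recent inquiries and a low balance.
weights = {"months_since_delinquency": 2.0, "on_time_rent_ratio": 50.0,
           "avg_bank_balance": 0.01, "recent_inquiries": -15.0}
applicant = {"months_since_delinquency": 30, "on_time_rent_ratio": 0.95,
             "avg_bank_balance": 200, "recent_inquiries": 6}
baseline = {"months_since_delinquency": 24, "on_time_rent_ratio": 0.90,
            "avg_bank_balance": 3000, "recent_inquiries": 1}
print(principal_reasons(weights, applicant, baseline))
```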
In 2021, the CFPB sued a fintech over allegations of fair lending violations and illegal and deceptive marketing practices. The CFPB accused the fintech, LendUp, of failing "to provide adverse-action notices within the 30 days" required by the ECOA for "over 7,400 loan applicants." The complaint also alleged LendUp "failed to accurately describe the principal reasons why LendUp denied the applications."
The CFPB settled the litigation; without admitting liability, LendUp agreed to cease loan operations and pay a penalty.
Going forward, as AI and machine learning become more prevalent, the CFPB is expected to take further action in this area, including through rulemaking.
Companies that may be viewed as creditors can avoid running afoul of fair lending laws' prohibitions on discrimination by:
• testing their systems for potential discriminatory classifications as well as discriminatory impact (see the sketch following this list);
• making sure they have the ability to explain adverse actions generated by AI systems; and
• providing adverse action notices in a timely fashion with these explanations when required.
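On the first point, the sketch below applies the "four-fifths" rule of thumb sometimes used as a first-pass disparate impact screen: it compares each group's approval rate to that of the most favored group and flags large gaps for further review. The group labels, counts, and threshold are illustrative assumptions, not a substitute for a rigorous, counsel-directed fair lending analysis.

```python
# Illustrative sketch of a disparate impact screen using the "four-fifths"
# rule of thumb: flag any group approved at less than 80% of the rate of the
# most favored group. Group labels, counts, and the threshold are assumptions.

def adverse_impact_ratios(outcomes_by_group, threshold=0.8):
    """outcomes_by_group maps a group label to (approved, total_applicants).
    Returns each group's approval rate and its ratio to the highest rate,
    flagging ratios below the threshold for further review."""
    rates = {g: approved / total
             for g, (approved, total) in outcomes_by_group.items()}
    best = max(rates.values())
    return {g: {"rate": round(r, 3),
                "ratio": round(r / best, 3),
                "flag": r / best < threshold}
            for g, r in rates.items()}

# Example: group_b is approved at 75% of group_a's rate and gets flagged.
print(adverse_impact_ratios({"group_a": (480, 600), "group_b": (300, 500)}))
```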
<b>Legal risks: credit information collection and use</b>
Regulators will also be vigilant around how credit information is collected, shared, and used. The Fair Credit Reporting Act (FCRA) and its implementing Regulation V regulate consumer reporting agencies (CRAs) and third-party furnishers of credit data. But even users of credit information have statutory obligations.
Entities collecting alternative data for credit purposes and furnishing it to others may fall under the broad definition of a consumer reporting agency. Credit information that is collected, furnished, or used may fall under the statute's similarly broad definition of a "consumer report."
Further, a company considered a consumer reporting agency may provide a consumer report to users only in limited circumstances. The user must intend to use the information in connection with a credit transaction, employment, insurance, or a consumer's eligibility for a license. Users must confirm they are requesting the information only for these limited purposes (and requests made under false pretenses are subject to fines and jail time).
Recently, the CFPB issued an advisory opinion on permissible purposes. The opinion illustrates the legal risks of CRAs employing insufficient name-matching procedures, for example, providing information on "possible matches" for multiple people when there was a permissible purpose for only one of them.
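Below is a minimal sketch of the kind of control these rules point toward: a CRA-side gate that releases a consumer file only when the user has certified a recognized permissible purpose and multiple identifiers, not a name alone, confirm the match. The field names, purpose labels, and matching rules are illustrative assumptions.

```python
# Illustrative sketch: release a consumer report only when (1) the user has
# certified one of the limited permissible purposes and (2) multiple
# identifiers confirm the match, rather than a name-only "possible match."
# Field names, purpose labels, and matching rules here are assumptions.

from dataclasses import dataclass

PERMISSIBLE_PURPOSES = {
    "credit transaction", "employment", "insurance", "license eligibility",
}

@dataclass
class Identity:
    full_name: str
    date_of_birth: str  # "YYYY-MM-DD"
    ssn_last4: str

def may_release(record: Identity, request: Identity, certified_purpose: str) -> bool:
    """Require a certified permissible purpose plus agreement on name, date
    of birth, and SSN digits before treating the file as a confirmed match."""
    if certified_purpose not in PERMISSIBLE_PURPOSES:
        return False
    return (
        record.full_name.strip().lower() == request.full_name.strip().lower()
        and record.date_of_birth == request.date_of_birth
        and record.ssn_last4 == request.ssn_last4
    )
```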
Furnishers of data to consumer reporting agencies have obligations under the FCRA to report accurate information and must correct and update information when they know it is incomplete or inaccurate.
Here, too, consumers must be notified when there is an "adverse action" related to a credit decision (using the same definition as the ECOA). Even where the adverse action is based on information obtained from a source other than a CRA, the user may have an obligation to notify the consumer if the information bears on the consumer's creditworthiness, credit standing, credit capacity, or other relevant characteristics.
Companies may safely deal with credit information by:
• understanding whether their handling of alternative data may result in them being considered a consumer reporting agency, a furnisher, or a user of a consumer report;
• confirming their use of credit information is for a permissible purpose;
• making sure they have the ability to explain adverse actions generated by AI systems; and
• providing adverse action notices in a timely fashion with these explanations when required.
<b>Legal risks: unfair and deceptive acts and practices</b>
As always, the FTC and the CFPB, under their respective authorities, will act where there are unfair, deceptive, or abusive acts and practices.
For example, in 2017, the FTC pursued a company for collecting data under the pretense that it would be used to match consumers with lenders and identify the lender with the lowest interest rate. In reality, the company was merely a lead-generation business selling consumer data to mainly non-lenders.
Companies can steer clear of claims of deceptive practices by:
• accurately representing to consumers the purposes for which they are collecting data;
• ensuring they are acting in accordance with their representations; and
• updating and modifying their representations to maintain their accuracy when business practices change.
<b>Looking ahead</b>
Alternative data should be a boon for consumer markets at this time of economic uncertainty, helping originators accurately assess credit risk and provide greater access to credit across consumer profiles. And yet, concerns that the CFPB and FTC will act with too heavy a hand may keep some from taking full advantage of its benefits.
Applied as part of an overall review of practices, policies, and systems, the practical tips above can help participants address heightened scrutiny of fair lending and the use of AI and ensure compliance with the law.
Adhering to these best practices may also help avoid violations of law that could otherwise become the basis of claims of misrepresentation by investors and counterparties.
Joseph Cioffi is a regular contributing columnist on consumer and commercial financing for Reuters Legal News and Westlaw Today.