By Kathleen Ryan
Artificial intelligence, machine learning and alternative data are increasingly touted for improving fraud detection, compliance, credit underwriting, and banking operations in general. For credit underwriting, AI and alternative data may result in more accurate underwriting decisions and may help to expand credit to consumers without full credit histories as measured by current practices.
As banks consider incorporating AI and alternative data into their operations and decision-making processes, they must weigh the anticipated benefits against the untested nature of these technologies and the fair lending risks of using complex “black box” algorithms and data that have not previously been used in credit decision-making.
Applying half-century-old laws like the Fair Housing Act and the Equal Credit Opportunity Act to new technologies can be challenging. When the laws and related guidance were written, consumers applied for credit at a local bank branch, and underwriting and pricing decisions were primarily based on manual processes. Then, the primary fair lending concern was discriminatory policies or bank employees who would let individual bias creep into decision-making processes. Today, regulators encourage innovation, but they express concerns that AI and machine learning could have hidden biases. Moreover, the sheer number of attributes considered by advanced systems might result in unintentional discrimination against protected classes.
Alternative data gets a few careful nods from regulators
Alternative data include rent and utility payment history, educational attainment, social media use and other behavioral information not traditionally factored into credit decisions. These data can help lenders assess the creditworthiness of consumers who may lack experience with traditional forms of credit, such as credit cards. However, uncertainty about how regulators view alternative data has made lenders hesitant to incorporate it into their decision-making.
While regulators clearly expect banks to use new technology without violating legal standards, it is less clear what level of scrutiny a bank must apply to emergent technologies and how it can feasibly review alternative data sets that may involve thousands of data elements. Adding complexity to a bank’s due diligence, AI models and alternative data sets may be the intellectual property of vendors or other third parties, so banks may not even have access to the black box. And even if a bank does have access to the underlying technology, it may lack the in-house expertise to fully analyze the models and data used.
Recently, however, regulators have offered a few guideposts that may help banks define an approach to putting some of the new solutions to work. In December 2019, the OCC, FDIC, Federal Reserve, NCUA and the CFPB issued an interagency statement…
Read More: The Big Brain in the Black Box