Fair Lending & Compliance

Three questions compliance pros have to ask before adopting machine learning in underwriting

Zest AI team
June 1, 2018

Compliance departments often get tagged with the (unfair) reputation that all they do is check boxes, but the reality is that compliance professionals play a critical role in delivering innovation.

They are responsible for clearing a path for new products and technologies to meet regulatory standards so those products can be responsibly introduced to the market. Compliance teams have always stepped up to solve banking’s big innovation challenges — from ATMs to automated models to depositing checks by phone.

Machine learning (ML) is one of today’s biggest compliance challenges. Because ML applies complex math to learn from and adapt to massive amounts of data, its potential impact in credit underwriting is profound. More accurate ML-driven models will expand access to credit in underserved markets by increasing approvals while reducing defaults. But ML does present meaningful concerns, specifically around Fair Credit Reporting Act (FCRA) and Fair Lending compliance.

Machine learning models do not have to be a black box. Today’s new responsibility for the compliance pro is to help businesses harness the power of ML in underwriting while making sure the new technology fully satisfies current regulatory and compliance frameworks. Compliance teams can lead their companies on the path to getting ML safely into the market if they ask the right questions to get up to speed on the technology and separate fact from fiction.

The three questions below are only a starting point, and each requires a deeper dive. Just as compliance officers weathered the UDAAP storm by drawing on diverse teams, we will need to lean on our internal modeling and validation teams for education while following the old “trust but verify” adage.

1. Are the models fully explainable to ensure compliance with existing regulations such as FCRA (Regulation V) and ECOA (Regulation B)?

Yes, the algorithms and approaches can be fully explainable. The math required to achieve full explainability is hard, but it can be done, and the results can be made detailed, accurate, and understandable enough to facilitate compliance with regulatory requirements such as adverse action notices and Fair Lending analysis.
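To make that concrete, here is a minimal sketch of one common explainability approach: per-applicant feature attributions, computed with the open-source shap package on synthetic data, which can feed the selection of adverse action reasons. The feature names, data, and model here are illustrative assumptions, not a description of any particular vendor’s method.

```python
# Minimal sketch: per-applicant feature attributions as the raw material
# for adverse action reasons. Assumes scikit-learn and the shap package;
# features and data are synthetic and illustrative only.
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

features = ["utilization", "inquiries_6mo", "months_on_file", "dti"]
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.random((500, len(features))), columns=features)
# Synthetic outcome: 1 means "declined" in this toy setup.
y = (X["utilization"] + X["dti"] + 0.1 * rng.standard_normal(500) > 1.0).astype(int)

model = GradientBoostingClassifier().fit(X, y)
explainer = shap.TreeExplainer(model)
attributions = explainer.shap_values(X)   # per-feature log-odds contributions

# For one applicant, the features pushing hardest toward decline become
# candidate adverse action reasons for the FCRA / Reg B notice.
applicant = 0
top_reasons = (
    pd.Series(attributions[applicant], index=features)
    .sort_values(ascending=False)         # largest push toward decline first
    .head(2)
)
print(top_reasons)
```

The point is not the specific library: any attribution method that is accurate, consistent, and documented can support both adverse action reporting and Fair Lending analysis.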

2. Are you using alternative modeling and/or alternative data?

Alternative data and alternative modeling are not one and the same. You can use alternative data with existing regression models. You can use alternative modeling on the same data you use today. The two are often lumped together because alternative modeling makes it easier to take advantage of large datasets that often include alternative data, but they do not have to come as a pair.
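A minimal sketch of the distinction, using synthetic data with illustrative column names: the same traditional-style dataset is scored with both a conventional logistic regression and a gradient boosted model. Neither step requires alternative data.

```python
# Minimal sketch: the same (traditional) dataset fed to a conventional
# regression model and to an "alternative" ML model, to show that model
# choice and data choice are independent decisions. Data is synthetic.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
traditional = pd.DataFrame(
    rng.random((1000, 3)), columns=["fico_band", "utilization", "months_on_file"]
)
default = (
    traditional["utilization"] - traditional["months_on_file"]
    + 0.2 * rng.standard_normal(1000) > 0
).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(traditional, default, random_state=1)

# Same data, two modeling approaches:
logit = LogisticRegression().fit(X_tr, y_tr)        # existing regression approach
gbm = GradientBoostingClassifier().fit(X_tr, y_tr)  # alternative modeling approach

for name, m in [("logistic regression", logit), ("gradient boosting", gbm)]:
    print(name, roc_auc_score(y_te, m.predict_proba(X_te)[:, 1]))

# Alternative *data* (e.g., rent or utility payment history) could just as
# easily be appended to `traditional` and scored with either model above.
```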

3. Do the machines learn on their own and update themselves, or are humans involved?

The industry may never welcome online learning, a term of art referring to machines learning autonomously. For the foreseeable future, human input and control are required to make sure that the models are built safely and responsibly. Left on their own, models can actually learn the wrong thing and create business risk. Humans are required to make sure that the models are learning the right things so that they can be deployed into production both confidently and safely.
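For reference, here is a minimal sketch of what online learning looks like in code (scikit-learn’s partial_fit on synthetic data): the model mutates itself on every new batch with no review gate in between. This is the pattern the paragraph above argues against for production underwriting, where retraining happens offline and is validated by humans before redeployment.

```python
# Minimal sketch of "online learning": the model updates itself on each
# new batch of data with no validation, fair lending review, or sign-off.
# Data is synthetic; loss="log_loss" assumes scikit-learn >= 1.1.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(2)
model = SGDClassifier(loss="log_loss")

for month in range(6):
    X_new = rng.random((200, 4))                 # this month's applications
    y_new = (X_new.sum(axis=1) > 2).astype(int)  # this month's observed outcomes
    # The model changes in production immediately, without human review.
    model.partial_fit(X_new, y_new, classes=[0, 1])
```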

Rather than looking skeptically at what machine learning can do or tagging it as black-box tech, compliance teams can embrace this powerful new technology to once again prove their innovation chops, while satisfying all existing compliance and regulatory requirements.

As compliance officers, your work may never be done, but the latest technologies that make your job dynamic can also make the work more efficient.
