May 26, 2022 – The Consumer Financial Protection Bureau published a circular addressing adverse action notification requirements for credit decisions based on artificial intelligence or machine learning models.

These complex algorithms and models are notoriously opaque. While the accuracy of these “black-box” models can often be measured, the machine’s reasoning is generally impossible to articulate. This makes it difficult to show that the model is not leading a creditor to discriminate unlawfully in violation of ECOA and Regulation B.

The CFPB makes clear that creditors are not prohibited from using these systems, but they must still be able to disclose the specific principal reasons for taking adverse action.

“Whether a creditor is using a sophisticated machine learning algorithm or more conventional methods to evaluate an application, the legal requirement is the same: Creditors must be able to provide applicants against whom adverse action is taken with an accurate statement of reasons. . . . A creditor’s lack of understanding of its own methods is therefore not a cognizable defense against liability for violating ECOA and Regulation B’s requirements.”
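For context on what a “specific principal reasons” disclosure can look like in practice, below is a minimal sketch of one common industry approach for scorecard-style models: rank each factor by how far the applicant’s contribution falls short of a best-case reference, then report the largest shortfalls as reason codes. This is illustrative only; the feature names, weights, and reason-code text are hypothetical and are not taken from the CFPB circular.

```python
# Hypothetical scorecard (higher score = lower credit risk).
# All weights, reference values, and reason codes below are illustrative.
WEIGHTS = {
    "payment_history":    0.45,
    "credit_utilization": -0.30,
    "account_age_years":  0.15,
    "recent_inquiries":   -0.10,
}

# "Best case" reference values used as the comparison point.
BEST_CASE = {
    "payment_history": 1.0,      # no missed payments
    "credit_utilization": 0.0,   # zero utilization
    "account_age_years": 20.0,
    "recent_inquiries": 0.0,
}

# Human-readable reason codes mapped to each factor.
REASON_CODES = {
    "payment_history": "Delinquent past or present credit obligations",
    "credit_utilization": "Proportion of balances to credit limits is too high",
    "account_age_years": "Length of credit history",
    "recent_inquiries": "Too many recent credit inquiries",
}


def principal_reasons(applicant: dict, top_n: int = 2) -> list[str]:
    """Rank factors by points lost versus the best case and return the top reasons."""
    shortfalls = {}
    for feature, weight in WEIGHTS.items():
        actual = weight * applicant[feature]
        best = weight * BEST_CASE[feature]
        shortfalls[feature] = best - actual  # points lost relative to best case
    ranked = sorted(shortfalls, key=shortfalls.get, reverse=True)
    return [REASON_CODES[f] for f in ranked[:top_n]]


if __name__ == "__main__":
    applicant = {
        "payment_history": 0.6,
        "credit_utilization": 0.9,
        "account_age_years": 3.0,
        "recent_inquiries": 5.0,
    }
    print(principal_reasons(applicant))
    # ['Length of credit history', 'Too many recent credit inquiries']
```

With a more complex machine learning model, the same idea would require a post-hoc explanation method to attribute the decision to specific factors; the point of the circular is that, whatever the method, the creditor must be able to produce accurate, specific reasons.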

Please click the links below for more information:

CFPB Press Release

Consumer Financial Protection Circular 2022-03