Companies using complex algorithms – sometimes called “black box” models – in credit decisions must provide specific and accurate explanations when denying applications for credit, the federal consumer financial protection agency said Thursday.
In a release, the Consumer Financial Protection Bureau (CFPB) said it published a circular reminding the public, “including those responsible for enforcing federal consumer financial protection law,” of creditors’ adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).
The agency said the circular confirms that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models using complex algorithms.
The agency said its circular makes clear that federal consumer financial protection laws and adverse action requirements should be enforced regardless of the technology used by creditors. It also clarifies, the agency said, that creditors cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new.
The agency noted that ECOA does not permit creditors to use technology that prevents them from providing specific and accurate reasons for adverse actions. “Creditors’ use of complex algorithms should not limit enforcement of ECOA or other federal consumer financial protection laws,” the agency stated.
Use of complex algorithms – including artificial intelligence or machine learning technologies – in credit decisions still requires disclosure of specific, principal reasons for taking adverse actions, the CFPB said. “There is no exception for violating the law because a creditor is using technology that has not been adequately designed, tested, or understood,” the agency said.