CFPB to keep close eye on algorithmic decision tools in lending to ensure religion, faith not part of outcome

Institutions will face consequences for robo-discrimination

The consumer financial protection agency is closely monitoring financial companies' use of algorithmic decision tools in lending, particularly with regard to the religion or faith practiced by a borrower, according to a blog entry posted Friday.

The post on the website of the Consumer Financial Protection Bureau (CFPB) described the decision tools as often being “black boxes” with little transparency. “Institutions will face consequences for this type of robo-discrimination,” stated the blog entry, which was posted by the bureau’s Acting Assistant Director of Supervision Examinations Lorelei Salas.

The blog post states that the bureau is concerned some financial companies are “unlawfully considering religion when making decisions on financial products,” and notes that the Equal Credit Opportunity Act covers small business loans in addition to financial products for family or household use.

“We recently reported that our examiners found that lenders violated fair lending law by improperly inquiring about small business applicants’ religious affiliation and by considering an applicant’s religious affiliation in the credit decision,” the blog post states. “For religious institutions applying for small business loans, lenders utilized questionnaires which contained explicit inquiries about the applicant’s religious affiliation. CFPB examiners determined that lenders also denied credit to applicants identified as a religious institution because the applicants did not respond to the questionnaire.”

The bureau said that, in response to the findings, lenders updated the questionnaires to ensure fair lending compliance. The lenders also identified affected applicants and offered each of them the opportunity to reapply for a small business loan, the agency said.

The blogger said the agency is particularly concerned about how financial institutions may be using artificial intelligence or other algorithmic decision tools. “For example, let’s say a lender uses third-party data to analyze geolocation data to power their credit decision tools. If the algorithm leads to an applicant getting penalized for attending religious services on a regular basis this could lead to sanctions under fair lending laws.”
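The scenario in that quote, where a feature derived from geolocation data effectively stands in for religion, can be made concrete with a short, purely illustrative sketch. The check below is not drawn from the CFPB post or from any lender’s actual system; the Applicant fields, the disparity_check helper, and the 0.8 cutoff are all hypothetical, loosely modeled on the familiar four-fifths comparison of approval rates that compliance teams sometimes use as a first screen.

```python
# Illustrative sketch only, not the CFPB's methodology or any lender's system.
# It tests whether a geolocation-derived attendance flag appears to be driving
# a decision tool's denials. All names, data, and thresholds are invented.

from dataclasses import dataclass
from typing import List


@dataclass
class Applicant:
    credit_score: int
    attends_services_regularly: bool  # inferred from third-party geolocation data
    approved: bool                    # outcome of the lender's decision tool


def approval_rate(applicants: List[Applicant]) -> float:
    """Share of applicants in the group that were approved."""
    if not applicants:
        return 0.0
    return sum(a.approved for a in applicants) / len(applicants)


def disparity_check(applicants: List[Applicant], threshold: float = 0.8) -> bool:
    """Flag the tool if approval rates differ sharply between otherwise similar
    applicants who do and do not show regular attendance. A ratio below
    `threshold` (an arbitrary cutoff for this sketch) triggers the flag."""
    attendees = [a for a in applicants if a.attends_services_regularly]
    others = [a for a in applicants if not a.attends_services_regularly]
    rate_attendees = approval_rate(attendees)
    rate_others = approval_rate(others)
    if rate_others == 0:
        return False
    return (rate_attendees / rate_others) < threshold


if __name__ == "__main__":
    # Invented sample: similar credit profiles, diverging outcomes.
    sample = [
        Applicant(700, True, False),
        Applicant(705, True, False),
        Applicant(710, True, True),
        Applicant(700, False, True),
        Applicant(705, False, True),
        Applicant(710, False, True),
    ]
    if disparity_check(sample):
        print("Potential proxy discrimination: review the geolocation feature.")
```

A screen like this only surfaces a disparity; under fair lending laws the substantive question remains why the model penalizes the flagged applicants, which is exactly the kind of black-box opacity the blog post warns about.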

CFPB blog post: It’s illegal to penalize borrowers for being religious