In May, we’ll spend a special themed month discussing where the mortgage market is headed and how technology and business strategies are evolving to meet the needs of today’s buyers and borrowers. A prestigious new set of awards, Best of Finance, also debuts this month to honor leaders in the field. And subscribe to Mortgage Brief for weekly updates year-round.
Advances in artificial intelligence could help mortgage lenders assess the creditworthiness of millions of “credit invisible” borrowers who previously could not be scored. But unless closely monitored, AI can also bake in bias, perpetuating redlining and other discriminatory lending practices.
That’s the view of experts featured in BAD INPUT, a new video series aimed at raising public awareness of the impact of emerging AI technologies.
The series, by filmmaker Alice Gu, explores how bias in algorithms and datasets can cause unintended harm in mortgage lending, healthcare, and facial recognition technology.
“Educating the public about these risks and their impact on communities of color is the first step in advocating for industry oversight, accountability, and the creation of more inclusive and equitable products,” the Kapor Foundation’s Lili Gangas said in announcing Tuesday’s release of BAD INPUT.
The Kapor Foundation backed the project and partnered with Consumer Reports to produce the series as part of its Equitable Technology Policy Initiative. Since launching in November, the initiative has contributed more than $5 million in funding to more than a dozen organizations, including the Algorithmic Justice League and the Distributed AI Research Institute (DAIR).
“We need a human being involved early on to make sure the data itself isn’t biased,” attorney Jason Downs says in the BAD INPUT segment on mortgages.

Downs, a partner at the law firm Brownstein who serves as lead counsel for clients facing law enforcement actions, said humans should also be involved in auditing algorithms on a regular basis.
“So I don’t think technology is necessarily the solution,” says Downs. “I actually think it’s human intervention.”
Kareem Saleh, founder and CEO of the fairness-testing firm FairPlay, also speaks in BAD INPUT.
“You can’t have underwriting in the digital age and fairness tools from the stone age,” Saleh says. “Bias detection answers the question, ‘Is my algorithm fair, and if not, why not?’ Bias correction answers the questions, ‘Could my algorithm be fairer? And what financial impact would being fairer have on my business?’”
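As an illustration of what “bias detection” can mean in practice, here is a minimal sketch (not taken from the series or from FairPlay’s tools) of one widely used check, the adverse impact ratio, computed on entirely hypothetical approval data:

```python
# Sketch of a common bias-detection check: the "adverse impact ratio,"
# i.e., the approval rate of a protected group divided by that of the
# reference group. Ratios below 0.8 are often flagged for fairness
# review (the "four-fifths rule"). All data below is hypothetical.

def approval_rate(decisions):
    """Fraction of applications approved (1 = approve, 0 = deny)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Ratio of approval rates; a value below 0.8 commonly triggers review."""
    return approval_rate(protected) / approval_rate(reference)

# Hypothetical lending outcomes for two applicant groups
group_a = [1, 0, 1, 0, 0, 1, 0, 0, 0, 0]  # 30% approved
group_b = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]  # 80% approved

ratio = adverse_impact_ratio(group_a, group_b)
print(f"Adverse impact ratio: {ratio:.2f}")  # well below 0.8 -> flag
```

A check like this only detects a disparity; diagnosing why it exists (the data, the model, or the market) and correcting it are the harder steps Saleh describes.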
Another important question for lenders, Saleh said, is, “Have you gone back and reviewed the people you turned down?”
The segment also features perspectives from Melissa Koide, CEO and director of FinRegLab, a nonprofit research center; Michael Akinwumi, head of the tech equity initiative at the National Fair Housing Alliance; Timnit Gebru, a former Google executive who founded and leads the Distributed AI Research Institute (DAIR); and Vinhcent Le, senior legal counsel at The Greenlining Institute.
The release of BAD INPUT’s mortgage segment is timely: last month, four federal agencies notified lenders that technology marketed as “artificial intelligence” and promising to remove bias from decision-making can still “result in unlawful discrimination.”
Last year, the Consumer Financial Protection Bureau warned lenders that if they can’t explain why they rejected a borrower because the technology they used was too complex, that complexity is not a defense against allegations of discrimination.
The CFPB is also working with federal regulators to develop rules aimed at protecting homebuyers and homeowners from algorithmic bias in automated home valuations and appraisals.
Email Matt Carter