Pub. 3 2023 Issue 5

If your financial institution wants to take advantage of the latest innovations in AI, what steps should it take to ensure there are no ECOA violations? The federal government has instructed designers, developers and deployers of these technologies to protect against algorithmic discrimination: “Designers, developers, and deployers of automated systems should take proactive and continuous measures to protect individuals and communities from algorithmic discrimination and to use and design systems in an equitable way. This protection should include proactive equity assessments as part of the system design, use of representative data and protection against proxies for demographic features, ensuring accessibility for people with disabilities in design and development, pre-deployment and ongoing disparity testing and mitigation, and clear organizational oversight. Independent evaluation and plain language reporting in the form of an algorithmic impact assessment, including disparity testing results and mitigation information, should be performed and made public whenever possible to confirm these protections.”3 As a financial institution utilizing these technologies, it will be crucial for your institution to conduct appropriate due diligence on the technology service provider, including a review of the third party’s algorithmic impact assessments, disparity testing results and mitigation information.

The federal regulatory agencies made clear in their June 9 Interagency Guidance on Third-Party Relationships: Risk Management that financial institutions have heightened responsibilities, especially when using new technologies, given the increased risk of such technologies and third-party relationships, to ensure that the technologies being provided comply with applicable laws and regulations. Failure to complete a thorough due diligence review will very likely result in serious negative consequences, especially if the technology is discovered to result in algorithmic discrimination. ■

Shelli J. Clarkston is an Of Counsel attorney in the Kansas City, Missouri office of Spencer Fane, LLP. She can be reached at (816) 292-8893 and sclarkston@spencerfane.com.

1. Consumer Financial Protection Bureau, CFPB and Federal Partners Confirm Automated Systems and Advanced Technology Not an Excuse for Lawbreaking Behavior, April 25, 2023.
2. See Joint Statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems.
3. The White House, Algorithmic Discrimination Protections, Blueprint for an AI Bill of Rights, August 22, 2023.

The Show-Me Banker Magazine | 15
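The Blueprint language quoted above calls for pre-deployment and ongoing disparity testing. Neither the article nor the cited guidance prescribes a particular test, but one widely used screen is the “four-fifths” (80%) rule of thumb from the EEOC’s Uniform Guidelines, which compares approval rates across applicant groups. The sketch below is purely illustrative; the group labels and approval data are hypothetical:

```python
# Illustrative disparity-testing sketch only. The four-fifths rule is one
# common screen for disparate impact; it is not mandated by the guidance
# discussed in the article. All data below is hypothetical.

def selection_rate(decisions):
    """Fraction of applicants approved (1 = approved, 0 = denied)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Ratio of the protected group's approval rate to the reference group's.

    A ratio below 0.8 is commonly treated as evidence of potential
    disparate impact warranting further review and mitigation.
    """
    return selection_rate(protected) / selection_rate(reference)

# Hypothetical model decisions for two applicant groups
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # reference group: 80% approved
group_b = [1, 0, 1, 0, 0, 1, 0, 1, 0, 0]   # protected group: 40% approved

ratio = adverse_impact_ratio(group_b, group_a)
print(f"Adverse impact ratio: {ratio:.2f}")  # prints 0.50, below the 0.8 threshold
if ratio < 0.8:
    print("Flag for disparity review and mitigation")
```

In practice, a vendor’s algorithmic impact assessment should document tests of this kind (and more rigorous statistical ones) both before deployment and on an ongoing basis, which is precisely the documentation a due diligence review should request.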
