Amanda Khatri
Editorial Manager
Financial services firms adopting emerging technologies like artificial intelligence (AI) must comply with existing consumer protection regulations, US regulators have warned.
The Consumer Financial Protection Bureau (CFPB), a powerful watchdog, is to monitor firms' use of AI to guard against risks associated with emerging technologies.
“Although institutions sometimes behave as if there are exceptions to the federal consumer financial protection laws for new technologies, that is not the case”, said the CFPB in a letter to Treasury Secretary Janet Yellen, outlining its position on AI use.
“There is no ‘fancy new technology’ carveout to existing laws”, added CFPB Director Rohit Chopra.
The CFPB routinely fines financial institutions billions of dollars for breaches of consumer protection laws, and it has warned that the misuse of AI technologies can fall into this category.
The rapid adoption of AI within banking has also raised transparency concerns, and the CFPB is investigating whether AI is being used lawfully for aspects of fraud detection, automated customer service and credit underwriting decisions.
As part of its focus on new technologies, the agency recently proposed subjecting large technology firms that provide digital payments to regulatory frameworks and oversight similar to those applied to banks.
No special treatment or exemptions
The agency’s message is clear: there will be no exceptions or special treatment. All regulated firms operating in financial services and using AI or machine learning technologies must abide by existing laws.
It also stressed that investments in new technologies and innovation must always be in the interests of consumers.
By establishing clear, straightforward regulations, the agency hopes to encourage firms to focus on innovation rather than on seeking “loopholes” or “special treatment” to circumvent regulatory standards.
AI use cases on the CFPB radar
Here are some of the AI use cases at financial institutions that the CFPB will be looking into:
Automated customer service technology (chatbots)
This includes technologies built using large language models like ChatGPT. These tools can provide incorrect answers and may be unable to resolve a dispute with a consumer, and their use can increase privacy and security risks.
As more firms adopt these technologies, the CFPB is keeping a close eye on whether they are complying with existing laws such as the Equal Credit Opportunity Act, which prevents discrimination in lending decisions, and the Consumer Financial Protection Act, which prohibits unfair, deceptive or abusive practices.
Firms using algorithms, machine learning or automated decision-making tools must ensure those tools are not biased, the regulator said.
Fraud screening
Some businesses use technologies to screen for fraud, and some rely on third-party vendors to assign risk scores to consumers.
These firms must ensure that consumer financial laws such as the Consumer Financial Protection Act and the Equal Credit Opportunity Act are being followed.
Where fraud screening is also used to assess a customer’s creditworthiness or to approve credit, firms using machine learning for these purposes must ensure compliance with the Fair Credit Reporting Act.
Lending and underwriting decisions
Organisations must comply with the Equal Credit Opportunity Act, regardless of the complexity of their technologies, to avoid unlawful discriminatory practices.
If firms using AI deny credit or take other adverse action against a consumer, they must provide accurate and specific reasons explaining why.
“The CFPB will continue to closely monitor and review the fair lending testing regimes of financial institutions, including reliance on complex models”, the regulator said.
CUBE comment
The US regulator has now set out its position on AI use within financial services. It’s clear that regulators worldwide are addressing the risks associated with AI and other emerging technologies to ensure fair, transparent markets and to protect consumers from harm.
If your business is using AI, it’s time to get your house in order.
The CFPB’s regulatory approach allows no special exemptions, and firms are expected to follow existing laws. As with most regulatory standards in the US, firms found to be non-compliant can expect heavy fines.
The saying “fight fire with fire” applies here: when harnessed correctly, AI-powered technologies can significantly reduce AI-related risks.
CUBE’s Automated Regulatory Intelligence seamlessly automates and streamlines the entire lifecycle of regulatory change.
It uses enhanced machine learning and natural language processing to identify gaps in your regulatory change management programme and alerts the relevant teams to help your business stay ahead of every AI regulatory development.
Get in touch below to find out how CUBE can help.