US Agency Watching AI in Business


FILE: Graphic representing the business-focused "Enterprise Generative AI Summit" in Silicon Valley in California. Uploaded April 20, 2023.

WASHINGTON - As concerns grow over increasingly powerful artificial intelligence systems like ChatGPT, the U.S. Consumer Financial Protection Bureau [CFPB], the nation’s financial watchdog, says it’s working to ensure that companies follow the law when they’re using AI.

There will be no “AI exemptions” to consumer protection, regulators say, pointing to recent enforcement actions as examples.

CFPB Director Rohit Chopra said the agency has “already started some work to continue to muscle up internally when it comes to bringing on board data scientists, technologists and others to make sure we can confront these challenges,” and that the agency is continuing to identify potentially illegal activity.

In the past year, the CFPB said it has fined banks over mismanaged automated systems that resulted in wrongful home foreclosures, car repossessions and lost benefit payments, after the institutions relied on new technology and faulty algorithms.

Representatives from the Federal Trade Commission, the Equal Employment Opportunity Commission [EEOC], and the Department of Justice, as well as the CFPB, all say they’re directing resources and staff to take aim at new tech and identify negative ways it could affect consumers’ lives.

“One of the things we’re trying to make crystal clear is that if companies don’t even understand how their AI is making decisions, they can’t really use it,” Chopra said. “In other cases, we’re looking at how our fair lending laws are being adhered to when it comes to the use of all of this data.”

EEOC Chair Charlotte Burrows said there will be enforcement against AI hiring technology that screens out job applicants with disabilities, for example, as well as so-called "bossware" that illegally surveils workers.

Burrows also described ways that algorithms might dictate how and when employees can work in ways that would violate existing law.

“If you need a break because you have a disability or perhaps you’re pregnant, you need a break,” she said. “The algorithm doesn’t necessarily take into account that accommodation. Those are things that we are looking closely at.”

OpenAI’s top lawyer, speaking at a conference this month, instead suggested an industry-led approach to regulation.

“I think it first starts with trying to get to some kind of standards,” Jason Kwon, OpenAI’s general counsel, told a tech summit in Washington, DC, hosted by software industry group BSA.

“Those could start with industry standards and some sort of coalescing around that. And decisions about whether or not to make those compulsory, and also then what’s the process for updating them, those things are probably fertile ground for more conversation,” Kwon added.
