DOJ & FTC targeting discrimination & bias in companies’ automated systems
Can automated payment systems be guilty of discriminating against certain groups of people? The federal government says yes, they absolutely can – and the feds plan to fine and possibly prosecute companies that rely on automated systems to do business.
The Federal Trade Commission (FTC) is planning a multi-agency enforcement push against “discrimination and bias in automated systems” in conjunction with the Consumer Financial Protection Bureau, the Department of Justice’s (DOJ) Civil Rights Division and the Equal Employment Opportunity Commission.
Companies that use AI-based systems, or plan to, could face a civil rights case if the feds can show bias based on customers’ race, gender, sexual orientation or religion. The Biden administration’s chief focus here is racial discrimination against African American-owned companies and African American customers, much as the EPA has formed an office devoted entirely to environmental justice.
Historical data could lead to claims of discrimination
According to the FTC’s news release announcing the multi-pronged enforcement action, “many automated systems rely on vast amounts of data to find patterns or correlations, and then apply those patterns to new data to perform tasks or make recommendations and predictions. While these tools can be useful, they also have the potential to produce outcomes that result in unlawful discrimination.”
In the case of a payment system, for example, red-flagging customers who pay late, followed by multiple emails reminding those clients of their debts, could lead to trouble for companies. Also: Businesses seen as “too aggressive” in their credit & collection practices may spark whistleblower complaints. And it goes without saying, lenders should be on alert.
The agencies plan to cast a very wide net. They’ll be taking a close look at data and datasets used in AI systems based on the premise that data can be inherently discriminatory: “Automated system outcomes can be skewed by unrepresentative or imbalanced datasets, datasets that incorporate historical bias, or datasets that contain other types of errors. Automated systems also can correlate data with protected classes, which can lead to discriminatory outcomes.”
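To make the “skewed outcomes” concern concrete, here is a minimal sketch of one screening heuristic sometimes used in fair-lending and hiring analysis: the “four-fifths rule,” which flags any group whose favorable-outcome rate falls below 80% of the best-performing group’s rate. The data, group names, and function names below are entirely hypothetical and illustrative; this is a rough screen, not a legal test of discrimination.

```python
# Illustrative only: screening an automated system's decisions for
# disparate impact with the four-fifths heuristic. All data is made up.

def selection_rates(outcomes):
    """Favorable-outcome rate per group.

    outcomes: dict mapping group name -> list of 0/1 decisions
              (1 = favorable, e.g. credit approved).
    """
    return {group: sum(d) / len(d) for group, d in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Return True per group if its rate is at least `threshold`
    times the highest group's rate; False means the group fails
    the screen and warrants closer review."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: (r / best >= threshold) for g, r in rates.items()}

# Hypothetical decisions from an automated approval system
decisions = {
    "group_a": [1, 1, 1, 0, 1, 1, 1, 1],   # 87.5% approved
    "group_b": [1, 0, 0, 1, 0, 1, 0, 0],   # 37.5% approved
}

print(selection_rates(decisions))
print(four_fifths_check(decisions))   # group_b fails the screen
```

A check like this is only a starting point: as the agencies note, bias can also enter through proxy variables that correlate with protected classes even when those classes are excluded from the data.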
In addition to data and datasets, the agencies will focus on how much users understand their automated systems. The FTC says many AI systems are little more than “‘black boxes’ whose internal workings aren’t clear to most people … this lack of transparency often makes it all the more difficult for developers, businesses, and individuals to know whether an automated system is fair.”