Artificial intelligence bias: the potential for enforcement accountability

On Monday, April 19, 2021, a Federal Trade Commission (FTC) blog post warned companies to ensure that their artificial intelligence (AI) does not reflect racial or gender bias, noting that failing to do so can result in “deception, discrimination – and an FTC law enforcement action.”1 While this is not the first time the FTC has addressed the issue of bias in AI, the agency has now made clear that if companies do not hold themselves accountable for all actions – human and AI – they should “be ready for the FTC to do it for [them].”2

In recent years, the number of companies investing in AI has skyrocketed. Automating some business operations has been shown to increase efficiency, improve customer service, and speed up production. But when AI is trained on a biased data set – one that undervalues or omits protected races, genders, or other classes – the resulting system is inevitably biased, despite its facially neutral programming. Recent movements for social justice, from #MeToo to #BLM and beyond, have drawn attention to the biases that can arise in life and in business. Now regulators are paying attention as well.

Bias in algorithms can cause applications to underperform, resulting in overall business underperformance through missed opportunities and incorrect forecasts. This bias can damage a company’s reputation, hurt its bottom line, and alienate a large portion of its consumers. In recent years, some companies have chosen to tackle existing practices that disadvantage certain groups of people and favor others. Earlier this week, guidance from the FTC made it clear that recognizing and correcting AI bias is not only a moral responsibility but also a legal obligation.


Section 5 of the FTC Act

Section 5(a) of the FTC Act prohibits “unfair or deceptive acts or practices in or affecting commerce.”3 Unfair and deceptive practices can include those likely to mislead a reasonable consumer, as well as those that may do more harm than good.4

As the mandate is broad, so are the possible bases for enforcement actions. In its recent statement, the FTC explicitly categorized the sale or use of racially biased algorithms as an unfair and deceptive practice. The FTC also warns against “digital redlining” – the use of protected characteristics such as race, religion, or gender to determine which consumers a business will target with online advertisements. In 2019, the Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act by targeting its ads too narrowly along protected class lines.5 Now the FTC has threatened to use the FTC Act to protect consumers from this type of prohibited targeting as well.

Fair Credit Reporting Act

The Fair Credit Reporting Act (FCRA) governs how consumer information can be collected and used for credit reporting.6 The FCRA’s congressional statement of purpose emphasizes the need to ensure that consumer reporting agencies act with “fairness, impartiality, and a respect for the consumer’s right to privacy,” as unfair credit reporting methods “undermine the public confidence which is essential to the continued functioning of the banking system.”7 When a business uses an algorithm to determine eligibility for credit, housing, or other benefits, it must rely on unbiased data in order to produce an impartial and fair result.

Equal Credit Opportunity Act

The Equal Credit Opportunity Act (ECOA) prohibits discrimination in credit on the basis of protected classes or on the basis that a person is receiving public assistance.8 When consumers apply for credit – whether to buy a home, start a business, or fulfill their dreams – creditors are not allowed to consider factors such as race and gender in deciding whether the consumer will be approved.

In addition, the ECOA prohibits not only intentional discrimination, but also unintentional discrimination that results in a disparate impact. For example, if an algorithm is programmed to deny credit to consumers based on their zip code (a seemingly neutral factor), and that zip code’s population is made up primarily of a racial minority, the FTC could challenge the practice as a violation of the ECOA.9
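The disparate-impact concern above can be illustrated with a small audit sketch. This is not an FTC-prescribed method; one common rule of thumb in discrimination analysis is the “four-fifths rule” from employment-selection guidelines, applied here to hypothetical, invented credit decisions grouped by zip code:

```python
# Hypothetical sketch: auditing credit decisions for disparate impact.
# The data, group labels, and 0.8 benchmark are illustrative assumptions,
# not FTC doctrine or legal advice.

def adverse_impact_ratio(decisions):
    """decisions: iterable of (group, approved) pairs.
    Returns (ratio of lowest to highest group approval rate, rates by group)."""
    totals, approvals = {}, {}
    for group, approved in decisions:
        totals[group] = totals.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + int(approved)
    rates = {g: approvals[g] / totals[g] for g in totals}
    return min(rates.values()) / max(rates.values()), rates

# Invented data: two zip-code groups with very different approval rates.
decisions = (
    [("zip_A", True)] * 80 + [("zip_A", False)] * 20
    + [("zip_B", True)] * 40 + [("zip_B", False)] * 60
)

ratio, rates = adverse_impact_ratio(decisions)
print(rates)  # {'zip_A': 0.8, 'zip_B': 0.4}
print(ratio)  # 0.5 -- well below the 0.8 "four-fifths" benchmark
```

A ratio this far below the benchmark would flag the zip-code factor for closer review, even though the factor itself looks neutral.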


Remember you are responsible

Keep using and benefiting from AI, but remember that you are responsible not only for the actions of the people you employ, but also for the technology you use. If you are building your own AI systems, be sure to test and modify them regularly to eliminate bias. If you are using someone else’s AI, it is essential to understand the technology and the data the system uses. If in doubt, consider hiring a lawyer to take a closer look at your existing AI systems and assess your risk of noncompliance with FTC mandates.

Think of an algorithm as a brain that learns and gets smarter over time. Regularly test for bias by modifying or eliminating certain factors (such as protected characteristics) in the decision-making process and examining how this affects the results. In their book AI for Lawyers, Noah Waisberg and Dr. Alexander Hudek, co-founders of Kira Systems, remind us that while biases can be built into AI systems, we remain in control of the data used by those systems and, therefore, of our own destiny.10 Proactively remove traits you don’t want the algorithm to consider, and add data to fill in the gaps. If a program excludes a gender from its data set, add more data to ensure the results will be unbiased.
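The ablation test described above – remove a protected characteristic and see whether outcomes change – can be sketched as follows. The scoring rule, field names, and built-in bias term are all invented for illustration; a real audit would run the company’s actual model against a held-out test set:

```python
# Hypothetical sketch of an ablation test for bias. Everything here is
# an illustrative assumption: a toy scoring rule stands in for a model.

def score(applicant, use_gender=True):
    """Toy credit score; the gender term represents a learned bias."""
    s = applicant["income"] / 1000 + applicant["credit_years"]
    if use_gender and applicant["gender"] == "F":
        s -= 5  # the bias we want the audit to surface
    return s

# Two otherwise-identical applicants who differ only in gender.
applicants = [
    {"gender": "F", "income": 60000, "credit_years": 10},
    {"gender": "M", "income": 60000, "credit_years": 10},
]

with_attr = [score(a, use_gender=True) for a in applicants]
without_attr = [score(a, use_gender=False) for a in applicants]

print(with_attr)     # [65.0, 70.0] -- identical applicants scored differently
print(without_attr)  # [70.0, 70.0] -- removing the attribute equalizes them
```

If removing a factor changes relative outcomes, the system was using it; that is the signal to strip the trait or rebalance the underlying data.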

Always strive to under-promise and over-deliver

It might be a cliché, but “under-promise and over-deliver” is a very useful mantra. The FTC explicitly warns against over-promising fair or unbiased results to consumers when the underlying data is in fact biased – whether intentionally or inadvertently.11 For example, don’t promise that your AI will make “100% unbiased hiring decisions” if the algorithm is based on data from only one race or gender. The FTC warns that “[t]his can result in deception, discrimination – and an FTC law enforcement action.”12 While the end goal is to correct this data, it is important to recognize and accept the imperfect nature of the data from the start.

Transparency is the key

Finally, be transparent. Be genuine and specific. In addition to being transparent about efforts to improve results, be transparent about the data you rely on. Without transparency, AI will remain biased. Additionally, the FTC praises transparency because it allows others to detect and correct biases that a business may not be aware of on its own.13

While a recent U.S. Supreme Court ruling casts doubt on the extent of the FTC’s power to enforce these guidelines in the future, they are still not to be taken lightly.14 At a minimum, the FTC can still seek injunctions, and businesses should be wary of the negative press and the high costs that accompany defending against an enforcement action.

New technologies are never perfect, but we can collectively improve them over time. Rather than shying away from AI because of these imperfections, companies should embrace the FTC’s guidance as an opportunity to reflect on their own technology and find ways to improve it – for the benefit of the company’s efficient operation, its reputation, and, above all, its consumers.

1 Elisa Jillson, Aiming for truth, fairness, and equity in your company’s use of AI, FTC.GOV (April 19, 2021, 9:43 a.m.), https://www.ftc.gov/news-events/blogs/business-blog/2021/04/aiming-truth-fairness-equity-your-companys-use-ai.

2 Id.

3 15 USC § 45(a)(1).

4 Fed. Trade Comm’n, A Brief Overview of the Federal Trade Commission Investigative, Enforcement, and Regulatory Authority, FTC.GOV (October 2019), https://www.ftc.gov/about-ftc/what-we-do/enforcement-authority.

5 Facebook, Inc., HUD ALJ # 01-18-0323-8 (March 28, 2019).

6 15 USC §§ 1681 et seq.

7 Id. § 1681(a) (emphasis added).

8 Id. §§ 1691 et seq.

9 Andrew Smith, Using artificial intelligence and algorithms, FTC.GOV (April 8, 2020, 9:58 a.m.), https://www.ftc.gov/news-events/blogs/business-blog/2020/04/using-artificial-intelligence-algorithms.

10 Noah Waisberg & Alexander Hudek, AI for Lawyers (2021).

11 Jillson, supra note 1.

12 Id.

13 Id.

14 AMG Cap. Mgmt., LLC v. Fed. Trade Comm’n, No. 19-508, slip op. at 1 (April 22, 2021) (finding that Section 13(b) of the FTC Act does not authorize the FTC to seek monetary relief such as restitution or disgorgement).
