Select Issues: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964

Employers now have a wide variety of algorithmic decision-making tools available to assist them in making employment decisions, including recruitment, hiring, retention, promotion, transfer, performance monitoring, demotion, dismissal, and referral. Employers increasingly utilize these tools in an attempt to save time and effort, increase objectivity, optimize employee performance, or decrease bias.

Many employers routinely monitor their more traditional decision-making procedures to determine whether these procedures cause disproportionately large negative effects on the basis of race, color, religion, sex, or national origin under Title VII of the Civil Rights Act of 1964 (“Title VII”).[1] Employers may have questions about whether and how to monitor the newer algorithmic decision-making tools. The Questions and Answers in this document address this and several closely related issues.

Title VII applies to all employment practices of covered employers, including recruitment, monitoring, transfer, and evaluation of employees, among others. However, the scope of this document is limited to the assessment of whether an employer’s “selection procedures”—the procedures it uses to make employment decisions such as hiring, promotion, and firing—have a disproportionately large negative effect on a basis that is prohibited by Title VII. As discussed below, this is often referred to as “disparate impact” or “adverse impact” under Title VII. This document does not address other stages of the Title VII disparate impact analysis, such as whether a tool is a valid measure of important job-related traits or characteristics.  The document also does not address Title VII’s prohibitions against intentional discrimination (called “disparate treatment”) or the protections against discrimination afforded by other federal employment discrimination statutes.

The Equal Employment Opportunity Commission (“EEOC” or “Commission”) enforces and provides leadership and guidance on the federal equal employment opportunity (“EEO”) laws prohibiting discrimination on the basis of race, color, national origin, religion, sex (including pregnancy, sexual orientation, and gender identity), disability, age (40 or older), and genetic information. This publication is part of the EEOC’s ongoing effort to help ensure that the use of new technologies complies with federal EEO law by educating employers, employees, and other stakeholders about the application of these laws to the use of software and automated systems in employment decisions.[2] For related content regarding the Americans with Disabilities Act, see The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.

Background

As a starting point, this section explains the meaning of central terms used in this document—“software,” “algorithm,” and “artificial intelligence” (“AI”)—and how, when used in a workplace, they relate to each other and to basic Title VII principles.

Central Terms Regarding Automated Systems and AI

  • Software: Broadly, “software” refers to information technology programs or procedures that provide instructions to a computer on how to perform a given task or function. “Application software” (also known as an “application” or “app”) is a type of software designed to perform, or to help the user perform, a specific task or tasks. These definitions are drawn from the United States Access Board.

    Many different types of software and applications are used in employment, including automatic resume-screening software, hiring software, chatbot software for hiring and workflow, video interviewing software, analytics software, employee monitoring software, and worker management software.

  • Algorithm: Generally, an “algorithm” is a set of instructions that can be followed by a computer to accomplish some end. Human resources software and applications use algorithms to allow employers to process data to evaluate, rate, and make other decisions about job applicants and employees. Software or applications that include algorithmic decision-making tools are used at various stages of employment, including hiring, performance evaluation, promotion, and termination.
  • Artificial Intelligence (“AI”): Some employers and software vendors use AI when developing algorithms that help employers evaluate, rate, and make other decisions about job applicants and employees. While the public usage of this term is evolving, Congress defined “AI” to mean a “machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments.” National Artificial Intelligence Initiative Act of 2020 at section 5002(3). In the employment context, using AI has typically meant that the developer relies partly on the computer’s own analysis of data to determine which criteria to use when making decisions. AI may include machine learning, computer vision, natural language processing and understanding, intelligent decision support systems, and autonomous systems. For a general discussion of AI, which includes machine learning, see National Institute of Standards and Technology Special Publication 1270, Towards a Standard for Identifying and Managing Bias in Artificial Intelligence.

Employers sometimes rely on different types of software that incorporate algorithmic decision-making at a number of stages of the employment process. Examples include: resume scanners that prioritize applications using certain keywords; employee monitoring software that rates employees on the basis of their keystrokes or other factors; “virtual assistants” or “chatbots” that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or on a more traditional test. Each of these types of software might include AI. In the remainder of this document, we use the term “algorithmic decision-making tool” broadly to refer to all these kinds of systems. 

Title VII

Title VII generally prohibits employment discrimination based on race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), or national origin.

  • Title VII generally prohibits intentional discrimination, or “disparate treatment,” in employment, including employment tests that are “designed, intended or used to discriminate because of race, color, religion, sex or national origin.”[3] Disparate treatment is not the focus of this technical assistance.
  • Title VII also generally prohibits employers from using neutral tests or selection procedures that have the effect of disproportionately excluding persons based on race, color, religion, sex, or national origin, if the tests or selection procedures are not “job related for the position in question and consistent with business necessity.”[4] This is called “disparate impact” or “adverse impact” discrimination. Disparate impact cases typically involve the following questions:[5]
    • Does the employer use a particular employment practice that has a disparate impact on the basis of race, color, religion, sex, or national origin? For example, if an employer requires that all applicants pass a physical agility test, does the test disproportionately screen out women? This issue is the focus of this technical assistance.
    • If the selection procedure has a disparate impact based on race, color, religion, sex, or national origin, can the employer show that the selection procedure is job-related and consistent with business necessity? An employer can meet this standard by showing that the procedure is necessary to the safe and efficient performance of the job. The selection procedure should therefore be associated with the skills needed to perform the job successfully; rather than measuring applicants’ or employees’ skills in general, it must evaluate an individual’s skills as they relate to the particular job in question.
    • If the employer shows that the selection procedure is job-related and consistent with business necessity, is there a less discriminatory alternative available? For example, is another test available that would be comparably as effective in predicting job performance but would not disproportionately exclude people on the basis of their race, color, religion, sex, or national origin?
  • In 1978, the EEOC adopted the Uniform Guidelines on Employee Selection Procedures (“Guidelines”) under Title VII.[6] The Guidelines explain how employers can determine whether their tests and selection procedures are lawful for purposes of Title VII disparate impact analysis.[7]

Questions and Answers

1. Could an employer’s use of an algorithmic decision-making tool be a “selection procedure”?

Under the Guidelines, a “selection procedure” is any “measure, combination of measures, or procedure” if it is used as a basis for an employment decision.[8] As a result, the Guidelines would apply to algorithmic decision-making tools when they are used to make or inform decisions about whether to hire, promote, terminate, or take similar actions toward applicants or current employees.

2. Can employers assess their use of an algorithmic decision-making tool for adverse impact in the same way that they assess more traditional selection procedures for adverse impact?

As the Guidelines explain, employers can assess whether a selection procedure has an adverse impact on a particular protected group by checking whether use of the procedure causes a selection rate for individuals in the group that is “substantially” less than the selection rate for individuals in another group.[9]   

If use of an algorithmic decision-making tool has an adverse impact on individuals of a particular race, color, religion, sex, or national origin, or on individuals with a particular combination of such characteristics (e.g., a combination of race and sex, such as for applicants who are Asian women), then use of the tool will violate Title VII unless the employer can show that such use is “job related and consistent with business necessity” pursuant to Title VII.[10]
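
To illustrate the kind of check described above, including checks on combinations of characteristics, the following is a minimal sketch using hypothetical applicant records. The column names, data, and use of the pandas library are assumptions for illustration only, not a method prescribed by Title VII or the Guidelines.

```python
# A minimal sketch (hypothetical data and column names) of computing selection
# rates for each combination of two protected characteristics, e.g., race and sex.
import pandas as pd

# Hypothetical applicant records: selected = 1 if advanced by the tool, else 0.
applicants = pd.DataFrame({
    "race":     ["Asian", "Asian", "Asian", "White", "White", "White"],
    "sex":      ["F", "M", "F", "F", "M", "M"],
    "selected": [0, 1, 0, 1, 1, 1],
})

# Selection rate per (race, sex) group: number selected / total in the group.
rates = applicants.groupby(["race", "sex"])["selected"].agg(["sum", "count"])
rates["selection_rate"] = rates["sum"] / rates["count"]
print(rates)
```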

3. Is an employer responsible under Title VII for its use of algorithmic decision-making tools even if the tools are designed or administered by another entity, such as a software vendor?

In many cases, yes. For example, if an employer administers a selection procedure, it may be responsible under Title VII if the procedure discriminates on a basis prohibited by Title VII, even if the test was developed by an outside vendor. In addition, employers may be held responsible for the actions of their agents, which may include entities such as software vendors, if the employer has given them authority to act on the employer’s behalf.[11] This may include situations where an employer relies on the results of a selection procedure that an agent administers on its behalf.

Therefore, employers that are deciding whether to rely on a software vendor to develop or administer an algorithmic decision-making tool may want to ask the vendor, at a minimum, whether steps have been taken to evaluate whether use of the tool causes a substantially lower selection rate for individuals with a characteristic protected by Title VII. If the vendor states that the tool should be expected to result in a substantially lower selection rate for individuals of a particular race, color, religion, sex, or national origin, then the employer should consider whether use of the tool is job related and consistent with business necessity and whether there are alternatives that may meet the employer’s needs and have less of a disparate impact. (See Question 7 for more information.) Further, if the vendor is incorrect about its own assessment and the tool does result in either disparate impact discrimination or disparate treatment discrimination, the employer could still be liable.

4. What is a “selection rate”?

“Selection rate” refers to the proportion of applicants or candidates who are hired, promoted, or otherwise selected.[12] The selection rate for a group of applicants or candidates is calculated by dividing the number of persons hired, promoted, or otherwise selected from the group by the total number of candidates in that group.[13] For example, suppose that 80 White individuals and 40 Black individuals take a personality test that is scored using an algorithm as part of a job application, and 48 of the White applicants and 12 of the Black applicants advance to the next round of the selection process. Based on these results, the selection rate for Whites is 48/80 (equivalent to 60%), and the selection rate for Blacks is 12/40 (equivalent to 30%). 
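
For readers who want to reproduce this arithmetic, here is a minimal sketch in Python using the figures from the example above; the function name is illustrative only.

```python
# Selection rate = number selected from a group / total applicants in the group.
def selection_rate(selected: int, total: int) -> float:
    return selected / total

# Figures from the example above.
white_rate = selection_rate(48, 80)
black_rate = selection_rate(12, 40)
print(f"White selection rate: {white_rate:.0%}")  # 60%
print(f"Black selection rate: {black_rate:.0%}")  # 30%
```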

5. What is the “four-fifths rule”?

The four-fifths rule, referenced in the Guidelines, is a general rule of thumb for determining whether the selection rate for one group is “substantially” different than the selection rate of another group. The rule states that one rate is substantially different than another if their ratio is less than four-fifths (or 80%).[14]    

In the example above involving a personality test scored by an algorithm, the selection rate for Black applicants was 30% and the selection rate for White applicants was 60%. The ratio of the two rates is thus 30/60 (or 50%). Because 30/60 (or 50%) is lower than 4/5 (or 80%), the four-fifths rule says that the selection rate for Black applicants is substantially different than the selection rate for White applicants in this example, which could be evidence of discrimination against Black applicants. 
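
Continuing the sketch from Question 4, the four-fifths comparison can be expressed as dividing the lower selection rate by the higher one and checking whether the ratio falls below 4/5; the function name is again illustrative only.

```python
# Four-fifths rule of thumb: compare the ratio of the lower selection rate to
# the higher one against 4/5 (80%).
def impact_ratio(rate_a: float, rate_b: float) -> float:
    return min(rate_a, rate_b) / max(rate_a, rate_b)

ratio = impact_ratio(0.30, 0.60)  # selection rates from the example above
print(f"Ratio of selection rates: {ratio:.0%}")           # 50%
print(f"Below the four-fifths threshold: {ratio < 4/5}")  # True
```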

6. Does compliance with the four-fifths rule guarantee that a particular employment procedure does not have an adverse impact for purposes of Title VII?

The four-fifths rule is merely a rule of thumb.[15] As the Guidelines themselves note, the four-fifths rule may be inappropriate under certain circumstances. For example, smaller differences in selection rates may indicate adverse impact where a procedure is used to make a large number of selections,[16] or where an employer’s actions have disproportionately discouraged individuals of a Title VII-protected group from applying.[17] The four-fifths rule is a “practical and easy-to-administer” test that may be used to draw an initial inference that the selection rates for two groups may be substantially different, and to prompt employers to acquire additional information about the procedure in question.[18]

Courts have agreed that use of the four-fifths rule is not always appropriate, especially where it is not a reasonable substitute for a test of statistical significance.[19] As a result, the EEOC might not consider compliance with the rule sufficient to show that a particular selection procedure is lawful under Title VII when the procedure is challenged in a charge of discrimination.[20] (A “charge of discrimination” is a signed statement asserting that an employer, union, or labor organization engaged in employment discrimination, and requesting that the EEOC take remedial action. For more information about filing charges of discrimination, visit the EEOC’s website at https://www.eeoc.gov.)
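
As an illustration of a statistical significance approach, the sketch below applies Fisher’s exact test, one common test for a two-by-two selection table, to the running example using the SciPy library. Neither Title VII nor the Guidelines prescribes any particular test; this is simply one of several tests used by courts and statisticians.

```python
# A hedged sketch: testing whether the difference in selection rates in the
# running example is statistically significant, using Fisher's exact test.
# This is one of several tests used in practice, not an EEOC-prescribed method.
from scipy.stats import fisher_exact

# 2x2 table: rows are groups, columns are (selected, not selected).
table = [
    [48, 80 - 48],  # White applicants: 48 advanced, 32 did not
    [12, 40 - 12],  # Black applicants: 12 advanced, 28 did not
]
_, p_value = fisher_exact(table, alternative="two-sided")
print(f"p-value: {p_value:.4f}")  # a small p-value suggests the disparity
                                  # is unlikely to be due to chance
```

In practice, analysts often consider both a practical measure such as the four-fifths ratio and a statistical test of this kind, since each can be misleading on its own (for example, with very small or very large samples).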

For these reasons, employers that are deciding whether to rely on a vendor to develop or administer an algorithmic decision-making tool may want to ask the vendor specifically whether it relied on the four-fifths rule of thumb when determining whether use of the tool might have an adverse impact on the basis of a characteristic protected by Title VII, or whether it relied on a standard such as statistical significance that is often used by courts.

7. If an employer discovers that the use of an algorithmic decision-making tool would have an adverse impact, may it adjust the tool, or decide to use a different tool, in order to reduce or eliminate that impact?

Generally, if an employer is in the process of developing a selection tool and discovers that use of the tool would have an adverse impact on individuals of a particular sex, race, or other group protected by Title VII, it can take steps to reduce the impact or select a different tool in order to avoid engaging in a practice that violates Title VII. One advantage of algorithmic decision-making tools is that the process of developing the tool may itself produce a variety of comparably effective alternative algorithms. Failure to adopt a less discriminatory algorithm that was considered during the development process therefore may give rise to liability.[21]

The EEOC encourages employers to conduct self-analyses on an ongoing basis to determine whether their employment practices have a disproportionately large negative effect on a basis prohibited under Title VII or treat protected groups differently. Generally, employers can proactively change the practice going forward.[22]

Individuals who believe that they have been discriminated against at work because of their race, color, religion, sex (including pregnancy, gender identity, and sexual orientation), national origin, age (40 or older), disability, or genetic information may file a Charge of Discrimination with the EEOC.

There are strict time limits for filing a charge; to learn more, see Time Limits For Filing A Charge.

Charges may be filed through EEOC’s Online Public Portal at https://publicportal.eeoc.gov. For additional information on charge filing, visit the EEOC’s website (https://www.eeoc.gov) or a local EEOC office (see https://www.eeoc.gov/field-office for contact information), or contact the EEOC by phone at 1-800-669-4000 (voice), 1-800-669-6820 (TTY), or 1-844-234-5122 (ASL Video Phone).

The information in this document is not new policy; rather, this document applies principles already established in the Title VII statutory provisions as well as previously issued guidance. The contents of this publication do not have the force and effect of law and are not meant to bind the public in any way. This publication is intended only to provide clarity to the public regarding existing requirements under the law. As with any charge of discrimination filed with the EEOC, the Commission will evaluate alleged Title VII violations involving the use of software, algorithms, artificial intelligence, and algorithmic decision-making tools based on all of the facts and circumstances of the particular matter and applicable legal principles.

 


[1] 42 U.S.C. § 2000e-2(k)(1)(A)(i). Title VII is codified at 42 U.S.C. §§ 2000e–2000e-17.

[2] The EEOC website provides additional resources and information on this subject. See generally Artificial Intelligence and Algorithmic Fairness Initiative, Equal Emp’t Opportunity Comm’n, https://www.eeoc.gov/ai (last visited April 13, 2023); see also Meeting of January 31, 2023—Navigating Employment Discrimination In AI and Automated Systems: A New Civil Rights Frontier, Equal Emp’t Opportunity Comm’n, https://www.eeoc.gov/meetings/meeting-january-31-2023-navigating-employment-discrimination-ai-and-automated-systems-new (last visited April 13, 2023). The Commission invited written comments from the public for 15 days after the meeting. The comments were made available to members of the Commission and to Commission staff working on the matters discussed at the meeting, including comments from industry groups, vendors, and civil rights groups, among others.  

[3] 42 U.S.C. § 2000e-2(h) (discussing professionally developed tests); see also § 2000e-2(a) (generally prohibiting discrimination on the basis of race, color, national origin, sex, or religion by covered employers), (c) (same, with respect to labor organizations), (d) (same, with respect to training programs).

[4] Id. at § 2000e-2(a)(2), (k).

[5] See 42 U.S.C. § 2000e-2(k). This method of analysis is consistent with the seminal Supreme Court decision about disparate impact discrimination, Griggs v. Duke Power Co., 401 U.S. 424 (1971).

[6] See 29 C.F.R. part 1607. The Guidelines were adopted simultaneously by other federal agencies under their authorities. See Uniform Guidelines on Employee Selection Procedures, 43 Fed. Reg. 38,290 (Aug. 25, 1978) (adopted by the Office of Federal Contract Compliance Programs at 41 C.F.R. part 60-3, by the Civil Service Commission at 5 C.F.R. § 300.103(c), and by the Department of Justice at 28 C.F.R. § 50.14).   

[7]  The Guidelines use the term “adverse impact”; other sources use “disparate impact.” This document uses the terms “adverse impact” and “disparate impact” interchangeably.

[8] See 29 C.F.R. § 1607.16(Q).

[9] 29 C.F.R. § 1607.16(B).

[10] 42 U.S.C. § 2000e-2(k)(1); 29 C.F.R. § 1607.3(A).

[11] EEOC, Compliance Manual Section 2 Threshold Issues § 2-III.B.2 (May 12, 2000), https://www.eeoc.gov/laws/guidance/section-2-threshold-issues#2-III-B-2.

[12] 29 C.F.R. § 1607.16(R).

[13] See EEOC, Questions and Answers to Clarify and Provide a Common Interpretation of the Uniform Guidelines on Employee Selection Procedures, Q&A 12 (Mar. 1, 1979) [hereinafter Questions and Answers], https://www.eeoc.gov/laws/guidance/questions-and-answers-clarify-and-provide-common-interpretation-uniform-guidelines.

[14] 29 C.F.R. §§ 1607.4(D), 1607.16(B).

[15] See 29 C.F.R. § 1607.4(D); see also Uniform Guidelines on Employee Selection Procedures, 43 Fed. Reg. 38,290, 38,291 (Aug. 25, 1978) (referring to the four-fifths rule as a “rule of thumb”); id. at 38,291 (explaining why the four-fifths rule was adopted as a “rule of thumb”); Questions and Answers, supra note 13, at Q&A 20 (answering the question of why the four-fifths rule is called a “rule of thumb”).

[16] Questions and Answers, supra note 13, at Q&A 22; see also 29 C.F.R. § 1607.4(D).

[17] 29 C.F.R. § 1607.4(D); see also Uniform Guidelines on Employee Selection Procedures, 43 Fed. Reg. at 38,291 (“[A]n employer’s reputation may have discouraged or ‘chilled’ applicants of particular groups from applying because they believed application would be futile. The application of the ‘4/5ths’ rule in that situation would allow an employer to evade scrutiny because of its own discrimination.”).

[18] Questions and Answers, supra note 13, at Q&A 19, 24; see also Uniform Guidelines on Employee Selection Procedures, 43 Fed. Reg. at 38,291 (“[The four-fifths rule] is not a legal definition of discrimination.”).

[19] See, e.g., Isabel v. City of Memphis, 404 F.3d 404, 412 (6th Cir. 2005) (rejecting the argument that “a test’s compliance with the four fifths rule definitively establishes the absence of adverse impact.”); Jones v. City of Boston, 752 F.3d 38, 46–54 (1st Cir. 2014) (rejecting the use of the four-fifths rule to evaluate a test with a large sample size); Howe v. City of Akron, 801 F.3d 718, 743 (6th Cir. 2015) (“[The Sixth Circuit] ha[s] used the four-fifths rule as the starting point to determine whether plaintiffs alleging disparate impact have met their prima facie burden, although we have used other statistical tests as well.”); Questions and Answers, supra note 13, at Q&A 20, 22. 

[20] Although the Guidelines state that federal agencies will consider whether a selection procedure meets the four-fifths rule when determining whether to take an “enforcement action,” the Guidelines specifically exempt findings of reasonable cause, conciliation processes, and the issuance of right to sue letters from the definition of “enforcement action,” where such findings, conciliation processes, and issuances are based on individual charges of discrimination filed under Title VII. 29 C.F.R. § 1607.16(I). The Guidelines thus do not require the Commission to base a determination of discrimination on the four-fifths rule when resolving a charge. 

[21] See 42 U.S.C. § 2000e-2(k)(1)(A)(ii).

[22] See  EEOC, Employment Tests and Selection Procedures (Dec. 1, 2007), https://www.eeoc.gov/laws/guidance/employment-tests-and-selection-procedures; EEOC, Compliance Manual Section 15 Race and Color Discrimination § IX (Apr. 19, 2006), https://www.eeoc.gov/laws/guidance/section-15-race-and-color-discrimination. Employers should also be aware of how the disparate impact and disparate treatment portions of Title VII may interact. See Ricci v. DeStefano, 557 U.S. 577 (2009).