EEOC Casts Spotlight on Use of Artificial Intelligence in Hiring and Other Employment Decisions

Posted on May 24, 2023 in Health Law News

Published by: Hall Render

The Equal Employment Opportunity Commission (“EEOC”) recently issued a technical assistance document addressing questions related to employers’ use of algorithmic or artificial intelligence decision-making tools under Title VII of the Civil Rights Act, 42 U.S.C. § 2000e et seq. (“Title VII”).

The EEOC’s technical assistance documents are not binding and do not have the force and effect of law. The purpose of the EEOC’s latest technical assistance document on artificial intelligence (“AI”) and Title VII is to educate employers and employees about how Title VII applies to the use of software and automated systems in making employment decisions. This new guidance may be read as a warning to employers about the legal implications of using AI in their screening, hiring and other related decisions.

Title VII and AI

While Title VII applies to all employment practices for covered employers, the EEOC’s technical assistance document focuses on the issue of disparate impact, and specifically, whether an employer’s selection procedures, which are used to make employment decisions, like hiring, promotion and firing, disproportionately exclude persons based on race, color, religion, sex or national origin.

The EEOC indicates that the use of AI in employment ordinarily means that “the developer relies partly on the computer’s own analysis of data to determine which criteria to use when making decisions.” The EEOC identified the following AI tools as examples that an employer may incorporate into its employment process:

  • Resume scanners that prioritize applications using keywords;
  • Employee monitoring software that rates employees on the basis of their keystrokes or other factors;
  • “Virtual assistants” or “chatbots” that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements;
  • Video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and
  • Testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills or perceived “cultural fit” based on their performance on a game or on a more traditional test.

Employer’s Liability Under Title VII for Disparate Impact Where AI Tools Were Designed or Administered by an Outside Vendor

The technical assistance document provides that an employer may still be responsible under Title VII for its use of AI decision-making tools, even if the tools are designed or administered by a vendor or third party. The EEOC suggests that an employer may want to ask a potential vendor what steps it has taken to evaluate whether the tool may cause a disparate impact. Yet, the EEOC noted that even if a vendor is incorrect about its own assessment and the AI tool results in discrimination, the employer may still be liable under Title VII.

The Four-Fifths Rule

The EEOC explains that the “four-fifths rule,” which is generally used to determine whether the selection rate for one group is “substantially” different than the selection rate of another group, can be utilized to evaluate whether a selection procedure has a disparate impact in violation of Title VII. The EEOC outlined the following example of the four-fifths rule:

  • Assume 80 White candidates and 40 Black candidates take a personality test that is scored using an algorithm as part of the job application process.
  • 48 of the White candidates (60%) and 12 of the Black candidates (30%) advance to the next round of the selection process.
  • The ratio of the two selection rates is 30%/60%, or 50%.
  • Because 30/60 (or 50%) is lower than 4/5 (80%), the four-fifths rule states that the selection rate for Black candidates is substantially different than the selection rate for White candidates, which could be evidence of discrimination against Black candidates.
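The EEOC’s arithmetic above can be reproduced in a few lines of Python. This is only an illustrative sketch — the function names are hypothetical, not part of any EEOC tool — using the figures from the EEOC’s own example:

```python
def selection_rate(advanced, applied):
    """Selection rate: the share of applicants who advance."""
    return advanced / applied

def four_fifths_check(rate_a, rate_b):
    """Compare two selection rates under the four-fifths rule of thumb.

    Returns the ratio of the lower rate to the higher rate, and whether
    that ratio meets the 4/5 (80%) threshold.
    """
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return ratio, ratio >= 0.8

# EEOC's example: 48 of 80 White candidates and 12 of 40 Black candidates
# advance to the next round of the selection process.
white_rate = selection_rate(48, 80)   # 0.60
black_rate = selection_rate(12, 40)   # 0.30

ratio, meets_rule = four_fifths_check(white_rate, black_rate)
print(f"ratio = {ratio:.0%}, meets four-fifths rule: {meets_rule}")
# ratio = 50%, meets four-fifths rule: False
```

Because 50% falls below the 80% threshold, the check fails — consistent with the EEOC’s conclusion that the selection rates are “substantially” different. As the EEOC notes below, however, passing this check does not immunize a selection procedure from challenge.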

The EEOC emphasizes that the four-fifths rule is a rule of thumb, but it is not always applicable. According to the EEOC, even if an employer’s selection procedure satisfies the four-fifths rule, such a procedure can still be challenged by the EEOC or an employee in a discrimination claim.

EEOC Encourages Employer Self-Audits

The EEOC encourages employers to conduct ongoing self-audits to determine whether their selection tools have an adverse impact on groups protected by law. Such self-audits should be conducted with the assistance of legal counsel.

AI Tools and the Americans with Disabilities Act

This latest technical assistance document on AI and Title VII follows the EEOC’s May 2022 technical assistance document on employers’ use of algorithmic or AI decision-making tools under Title I of the Americans with Disabilities Act (“ADA”). The May 2022 document outlined the same examples of AI tools listed above. In it, the EEOC addressed the following “most common ways” that employers’ use of algorithmic or AI decision-making tools may violate the ADA:

  • The employer does not provide a reasonable accommodation when using AI tools;
  • Candidates or employees with a disability are “screened out” from consideration for a job or promotion even though they are able to perform the job with or without a reasonable accommodation; and
  • An employer’s use of AI decision-making tools asks candidates or employees to identify or provide information about their disabilities or medical conditions.

In its May 2022 technical assistance document, the EEOC also identified practices for employers to consider when using AI decision-making tools to assist with ADA compliance. These practices include, for example:

  • Training staff to recognize and process requests for reasonable accommodation;
  • Informing candidates and employees who are being rated using AI decision‑making tools that reasonable accommodations are available for individuals with disabilities, and providing instructions for requesting such accommodations;
  • Ensuring that the AI decision‑making tools only measure abilities or qualifications that are truly necessary for the job; and
  • Asking an outside vendor to confirm that the AI tool does not ask candidates or employees questions likely to elicit information about a disability unless the inquiries are related to a request for reasonable accommodation.

Practical Takeaways and Next Steps

Employers should consider reviewing the extent that they or their vendors are using AI tools in their selection processes. As the use of software and AI decision-making tools is rapidly developing, employers should also watch for additional guidance from the EEOC and other federal and state agencies.

If you have any questions on issues discussed in or related to this post, please contact:

Hall Render blog posts and articles are intended for informational purposes only. For ethical reasons, Hall Render attorneys cannot—outside of an attorney-client relationship—answer specific questions that would be legal advice.