NYC Issues Final Regulations on Automated Employment Decision Tools Law

May 15, 2023

The New York City Department of Consumer and Worker Protection (DCWP) has adopted final regulations regarding Local Law 144, the city’s Automated Employment Decision Tools (AEDT) law. The DCWP will begin enforcing the AEDT law on July 5, 2023.

As we reported in an October 2022 client alert, the AEDT law requires New York City employers to comply with extensive requirements before using an AEDT, including completing an independent bias audit of the tool and providing notice regarding the tool to candidates and employees. Among other things, the final regulations:

  • Modify the definition of “machine learning, statistical modeling, data analytics, or artificial intelligence” to expand the scope of covered AEDTs.
  • Provide additional standards for the bias audit.
  • Expand the information that must be included in the published results of a bias audit.

We discuss these developments in more detail below.

Broadened scope of covered AEDTs

The AEDT law defines an “automated employment decision tool” as “any computational process” that is “derived from machine learning, statistical modeling, data analytics, or artificial intelligence,” and “issues simplified output,” such as “a score, classification, or recommendation,” that is “used to substantially assist or replace discretionary decision making” for employment decisions that impact people. The final rules eliminate an earlier requirement that the computer-based technique have “inputs and parameters … refined through cross-validation or by using training and testing data,” and therefore expand the scope of tools that can be considered qualifying AEDTs.

Additional standards for bias audits

The rules also impose additional standards for bias audits.

What needs to be audited?

An employer must conduct a bias audit of the AEDT, even if the tool is used merely to “screen at an early point in the application process” and not to make a final hiring decision.

Additional bias audit requirements

  • The bias audit must separately calculate the selection rate and impact ratio of the AEDT for sex categories, race/ethnicity categories and intersectional categories (e.g., the impact ratio for the selection of Hispanic or Latina candidates versus Black or African American candidates). These calculations are illustrated in the sketch after this list.
  • The audit must indicate the number of individuals the AEDT assessed who are not included in the required calculations because they fall within an unknown category.
  • An auditor may exclude from the impact ratio calculations any category that represents less than 2% of the data used for the bias audit.
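
To make these calculations concrete, the following is a minimal Python sketch of the selection rate and impact ratio math on hypothetical data. The records, category labels and handling of the 2% exclusion threshold are illustrative assumptions only; the DCWP rules prescribe the calculations, not any particular implementation, and the impact ratio is computed here as each category’s selection rate divided by the rate of the most-selected category.

    from collections import defaultdict

    # Hypothetical applicant records: (category, was_selected). All names
    # and figures are illustrative assumptions, not DCWP-prescribed data.
    records = [
        ("Hispanic or Latino / Female", True),
        ("Hispanic or Latino / Female", False),
        ("Black or African American / Male", True),
        ("Black or African American / Male", True),
        ("Black or African American / Male", False),
        ("White / Female", True),
        ("White / Female", True),
        ("Unknown", False),
    ]

    def bias_audit_summary(records, exclusion_threshold=0.02):
        total = len(records)
        counts = defaultdict(lambda: [0, 0])  # category -> [assessed, selected]
        for category, selected in records:
            counts[category][0] += 1
            counts[category][1] += int(selected)

        # The audit must report how many assessed individuals fall within
        # an unknown category, even though they are left out of the ratios.
        unknown_assessed = counts.pop("Unknown", [0, 0])[0]

        # An auditor may exclude categories representing less than 2% of
        # the data from the impact ratio calculations.
        included = {c: (n, k) for c, (n, k) in counts.items()
                    if n / total >= exclusion_threshold}

        # Selection rate: share of a category's candidates who were selected.
        selection_rates = {c: k / n for c, (n, k) in included.items()}

        # Impact ratio: a category's selection rate relative to the rate of
        # the most-selected category (the top category's ratio is 1.0).
        top_rate = max(selection_rates.values())
        impact_ratios = {c: r / top_rate for c, r in selection_rates.items()}

        return {
            "assessed_in_unknown_category": unknown_assessed,
            "selection_rates": selection_rates,
            "impact_ratios": impact_ratios,
        }

    print(bias_audit_summary(records))

On this sample data, the sketch reports one individual in the unknown category and impact ratios of 1.0, roughly 0.67 and 0.5 for the three included categories, which mirrors the per-category reporting the rules require; a real audit would of course use the employer’s historical data and the full set of required categories.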

Data used for bias audits

The rules confirm that the audit must use data from the New York City employer or agency’s own historical use of the tool, with limited exceptions.

Auditor’s independence

The rules address questions regarding the level of independence required of the auditor conducting a bias audit by identifying the criteria that would render an auditor not independent.

Publication of bias audit results

As described above, the rules expand the elements that must be included in the published summary of a bias audit. Employers and agencies now must make publicly available:

  1. The date of the most recent bias audit of the AEDT.
  2. A summary of the results, including the source and explanation of the data used, the number of individuals the AEDT assessed who fall within an unknown category, and, for each category, the number of applicants or candidates, the selection or scoring rate, and the impact ratio.
  3. The distribution date of the AEDT.

Next steps

Employers using a qualifying AEDT should get ahead of the July 5 enforcement date by ensuring their bias audit, notice and publication procedures comply with the AEDT law. The final rules clarify that employers using an AEDT must have conducted the bias audit by July 5 and must repeat it annually thereafter.

Employers also should be on the lookout for future laws regulating AI-driven tools used in the employment process, including in California. Various federal agencies, including the Equal Employment Opportunity Commission (EEOC), have “expressed concern about potentially harmful uses of automated systems” in a recent Joint Statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems. The EEOC previously announced its plan to target automated systems that intentionally exclude or adversely affect protected groups.

We will continue to monitor developments in this area. We’ve also compiled a list of 10 actionable steps that New York City private-sector employers should take to promote compliance with ever-evolving legal requirements while avoiding common pitfalls.

This content is provided for general informational purposes only, and your access or use of the content does not create an attorney-client relationship between you or your organization and Cooley LLP, Cooley (UK) LLP, or any other affiliated practice or entity (collectively referred to as “Cooley”). By accessing this content, you agree that the information provided does not constitute legal or other professional advice. This content is not a substitute for obtaining legal advice from a qualified attorney licensed in your jurisdiction and you should not act or refrain from acting based on this content. This content may be changed without notice. It is not guaranteed to be complete, correct or up to date, and it may not reflect the most current legal developments. Prior results do not guarantee a similar outcome. Do not send any confidential information to Cooley, as we do not have any duty to keep any information you provide to us confidential. This content may be considered Attorney Advertising and is subject to our legal notices.