Newly emerging artificial intelligence (AI) technologies could offer a promising way to streamline hiring and other employment practices across a number of industries. To date, however, federal courts and regulatory enforcement agencies have been wary of these tools, scrutinizing them heavily under local, state and federal anti-discrimination laws.
In welcome news for New York-based employers, the New York City Department of Consumer and Worker Protection recently published a set of proposed rules, still awaiting approval, that could drastically reshape the hiring process. For city employers that rely heavily on automated employment decision tools (AEDTs) in hiring, the proposed rules offer initial guidance on the use of artificial intelligence and aim to clarify the ambiguous AI law the city enacted in 2021.
The law, which does not take full effect until January 1, 2023, prohibits employers from using any form of AEDT unless a bias audit has been completed by an independent auditor and notice requirements are fully met. The proposed rules define an independent auditor as any “person or group that is not involved in using or developing an AEDT,” but also clarify that an employer may rely on a bias audit commissioned by a vendor to satisfy the AI law’s requirements. The law also restricts employers from using AEDTs in promotion decisions and requires them to notify applicants and current employees of the tool’s use on all future job listings.
Simone Francis, an attorney at the law firm Ogletree Deakins, has noticed the growing prevalence of AI hiring tools in the employment market. “There’s certainly been a lot of conversation about the ability of AI to potentially eliminate biases, but the law is intended to put certain processes in place to ensure that AI is being used in a way that does not lead to unintended results, including results that would conflict with existing anti-discrimination laws. The New York City law specifically says that you have to have an independent audit, which means you cannot just rely on the vendor and the vendor’s assurances. We’re still trying to develop some understanding of what the city means by that,” Francis said, as reported by HR Executive.
However, the proposed rules do not expressly address other major compliance issues that could affect city employers, including what exactly is required when an applicant or employee opts out of an AEDT. Employers also remain uncertain whether the AI law will require bias audits to be updated annually or whether a single audit conducted before a tool is first used will suffice. Francis continued, “It’s important to understand how AI tools are used. Employers must get around to that because how they’re actually used could either trigger application of this law in New York City or in other jurisdictions in the future.”