Workers with disabilities are looking to federal regulators to crack down on artificial intelligence tools that could be biased against them.
At a recent American Bar Association event, U.S. Equal Employment Opportunity Commission Chair Charlotte Burrows said she is particularly interested in guidance that could protect people with disabilities from bias in AI tools. As many as 83% of employers, and as many as 90% among Fortune 500 companies, are using some form of automated tools to screen or rank candidates for hiring, according to Burrows.
At issue is the potential for AI-powered games or personality tests used in hiring or performance evaluations to be more difficult for people with intellectual disabilities, for example. AI software that tracks a candidate’s speech or body language during an interview also could create a bias against people with speech impediments, people with visible disabilities, or those whose disabilities affect their movements. …
Who is Liable?
The 1978 guidelines also don’t specify liability for vendors of hiring tools. AI vendors often advertise their products as free of bias, but when bias is found, the discrimination claim would fall squarely on the employer unless there is a shared liability clause in their vendor contracts.
“More and more we’re seeing vendors get out ahead of this issue and be prepared to work with employers on this issue, but because the ultimate liability rests with the employer, they really have to take the initiative to understand how this will have an impact,” said Nathaniel M. Glasser, a partner at Epstein Becker Green who works with employers and AI vendors.
The guidelines, which predate the Americans with Disabilities Act, focus primarily on discrimination based on race and gender. Adapting AI tools to avoid bias against disabled people is more complicated because disabilities can take many forms and workers are not legally required to disclose that they have a disability.
Glasser said the conversation around AI bias has increasingly shifted to include perspectives from disabled workers. AI tools are useful to employers who need to sift through troves of resumes or assess relevant skills, and if used correctly, they could be less biased than traditional assessments, he noted. The attorney said he advises clients to conduct their own due diligence when designing and implementing AI tools.
“It’s important for employers to understand how the tool works and what accommodations may be provided in the tool itself, but also have a plan for requests for reasonable accommodation from people who aren’t able to reasonably utilize the tool or be evaluated by that tool due to the specific nature of their disability,” Glasser said.