Case Studies

Helping HR Departments Minimize Risk When Using AI Tools in Hiring

Human resources (HR) departments are increasingly turning to artificial intelligence (AI) tools to assist in recruiting and vetting job applicants. AI vendors claim that their tools can quickly and seamlessly identify the best candidates for open positions. Despite such claims, these tools have the potential to perpetuate stereotypes, disparately impact certain populations, and raise troublesome issues for people with disabilities, leaving employers vulnerable to class or collective actions.

Epstein Becker Green understands both the promise and pitfalls of AI, and our attorneys have experience counseling clients on how to maximize the benefits of AI in recruitment and selection while minimizing potential risks. Recently, two clients—a major financial institution and a restaurant chain—sought Epstein Becker Green’s help in evaluating AI vendors for their employee recruitment, selection, and onboarding functions. Our attorneys assisted our clients in assessing the product offerings, reviewing vendor contracts, identifying the appropriate questions to ask the vendors about their AI products, monitoring and testing those products, and evaluating whether those products would raise red flags from a legal perspective.

Protecting Companies That Use Chatbots for Certain HR Functions

Increasingly, companies are using “chatbots” for lower-level human resources (HR) functions, such as tracking employees’ paid time off, leave, or benefits. (A “chatbot” is a computer program that uses artificial intelligence (AI) to simulate conversation with human users; for example, a chatbot may appear in a pop-up window on a website to ask whether a visitor needs assistance.) Some companies are even evaluating whether chatbots should supplement human staff in receiving internal complaints of discrimination or harassment. Although management and HR personnel may embrace chatbots to increase efficiency and reduce subjectivity, adopting this technology carries legal risks.

Epstein Becker Green provides advice and counsel to clients that use, or are considering using, chatbots and want to mitigate their legal risks. We recently assisted clients in the retail and financial services industries with their chatbots. Our attorneys evaluated the questions asked by these chatbots to ensure that the underlying algorithms can, among other things, distinguish among various types of employee requests, such as requests for sick leave, leave under the Family and Medical Leave Act, a regular day off, or an accommodation under the Americans with Disabilities Act. In addition, we made sure that our clients' chatbots have built-in processes to escalate certain matters for human review and create a positive experience for employee users.
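
For illustration only, the kind of request-routing and escalation logic described above might resemble the following minimal Python sketch. The categories, keywords, and escalation rules here are hypothetical assumptions, not any client's or vendor's actual implementation; a production chatbot would typically rely on a trained language model and far more careful handling rather than keyword matching.

```python
"""Hypothetical sketch of chatbot request routing with human-review escalation.

All categories, keywords, and escalation rules are illustrative assumptions only.
"""

from dataclasses import dataclass
from enum import Enum, auto


class RequestType(Enum):
    SICK_LEAVE = auto()
    FMLA_LEAVE = auto()          # Family and Medical Leave Act
    REGULAR_DAY_OFF = auto()
    ADA_ACCOMMODATION = auto()   # Americans with Disabilities Act
    UNKNOWN = auto()


@dataclass
class RoutingDecision:
    request_type: RequestType
    escalate_to_human: bool
    reason: str


# Illustrative keyword map; order matters because the first match wins.
_KEYWORDS = {
    RequestType.FMLA_LEAVE: ("fmla", "family and medical leave", "care for my"),
    RequestType.ADA_ACCOMMODATION: ("accommodation", "disability", "ada"),
    RequestType.SICK_LEAVE: ("sick", "ill", "doctor", "flu"),
    RequestType.REGULAR_DAY_OFF: ("vacation", "pto", "day off", "personal day"),
}

# Assumed rule: requests that can trigger statutory obligations, and anything
# the chatbot cannot classify, always go to a person.
_ALWAYS_ESCALATE = {
    RequestType.FMLA_LEAVE,
    RequestType.ADA_ACCOMMODATION,
    RequestType.UNKNOWN,
}


def route_request(message: str) -> RoutingDecision:
    """Classify an employee message and decide whether a human must review it."""
    text = message.lower()
    for request_type, keywords in _KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            escalate = request_type in _ALWAYS_ESCALATE
            reason = "statutory leave/accommodation request" if escalate else "routine request"
            return RoutingDecision(request_type, escalate, reason)
    return RoutingDecision(RequestType.UNKNOWN, True, "unrecognized request")


if __name__ == "__main__":
    for msg in (
        "I need time off to care for my newborn",
        "Can I take Friday as a personal day?",
        "I need a screen-reader accommodation for my workstation",
    ):
        print(msg, "->", route_request(msg))
```

In this sketch, the escalation set captures the point made above: requests implicating the FMLA or the ADA, and anything the chatbot cannot confidently classify, are flagged for human review rather than handled automatically.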