Adam S. Forman, Member of the Firm in the Employment, Labor & Workforce Management practice, in the firm’s Detroit and Chicago offices, was quoted in SHRM, in “COVID-19 Pandemic Puts Workplace Technology in the Spotlight,” by Lisa Nagele-Piazza.
Following is an excerpt:
The COVID-19 pandemic has elevated the role of technology in the workplace, and more employers are relying on artificial intelligence, machine learning and virtual reality to save money and limit in-person contact.
These technologies can be effective tools for hiring, training and assessing employee performance, as well as creating meaningful interactions during a time of isolation. However, employers must ensure that their use of technology doesn't run afoul of employment and labor laws. …
Machine Learning
Machine learning is defined as "the use and development of computer systems that are able to learn and adapt without following explicit instructions, by using algorithms and statistical models to analyze and draw inferences from patterns in data," according to dictionary.com.
Machine learning is a process by which a machine learns to become intelligent by itself, explained Adam Forman, a management attorney with Epstein Becker Green in Chicago and Detroit, during the session.
Chatbots, which hold natural-language conversations to answer questions, are one example. For instance, employees can ask a chatbot how much time they have left in their paid-time-off (PTO) bank.
Chatbots can be used to have introductory conversations with job candidates, schedule interviews, send new-hire paperwork and help with the onboarding process.
"The idea is that if you can take some of these lower-level functions and offload them to a machine, you free up your human resources talent to deal with the more sophisticated issues," Forman said.
But there's a caveat: Employers need to think about potential issues that machine learning hasn't figured out. The chatbot may not know if an employee is asking how much leave is left in a PTO bank or asking questions that could trigger a legal obligation under the Family and Medical Leave Act.
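The caveat above can be sketched as a routing problem: answer routine balance questions automatically, but escalate anything that might carry a legal obligation to a human. This is a minimal illustration, not any vendor's product; the employee IDs, balances, and trigger phrases are all hypothetical.

```python
# Hypothetical PTO balances, keyed by employee ID.
PTO_BALANCES = {"E100": 12.5, "E101": 3.0}

# Phrases that may signal a legally significant leave request (e.g., one
# implicating the FMLA) rather than a routine balance inquiry.
FMLA_TRIGGERS = ("surgery", "medical", "sick", "pregnan",
                 "family leave", "care for", "serious health")

def route_question(employee_id: str, question: str) -> str:
    """Answer simple PTO queries; escalate possible FMLA issues to HR."""
    q = question.lower()
    if any(trigger in q for trigger in FMLA_TRIGGERS):
        return "ESCALATE: possible FMLA issue - route to a human HR specialist."
    if "pto" in q or "paid time off" in q or "balance" in q:
        balance = PTO_BALANCES.get(employee_id)
        if balance is None:
            return "ESCALATE: unknown employee - route to HR."
        return f"You have {balance} PTO days remaining."
    return "ESCALATE: question not recognized - route to HR."

print(route_question("E100", "How much PTO do I have left?"))
print(route_question("E101", "I need time off for surgery next month"))
```

The design choice mirrors Forman's point: the machine offloads the low-level lookup, while anything ambiguous defaults to escalation rather than a guess.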
Legal counsel can help organizations that plan to use these products, Forman said. …
Legal Considerations …
Employees may bring a disparate impact claim when a practice that seems neutral has a disproportionate effect on a protected class. For example, an employer may target job candidates who live in ZIP codes close to the office. But that practice could unintentionally lead to race discrimination based on the demographics of the area.
Sometimes employers deliberately discriminate by targeting certain groups for employment, such as younger job candidates rather than older candidates. This may violate the Age Discrimination in Employment Act.
"Accessibility is important," Forman said. Employers need to ensure that hiring tools are accessible for candidates with mental and physical impairments. Additionally, personality tests shouldn't ask questions that would be deemed disability-related inquiries or medical examinations—which are impermissible at the pre-offer stage.
Employers need to ensure that the activities the algorithm is analyzing are related to the essential functions of the job. Work with the product vendor, Forman suggested, to ensure that the algorithm is mapped to those functions.
He said employers can expect more legal challenges as these tools become mainstream.