Bradley Merrill Thompson, Member of the Firm in the Health Care & Life Sciences practice, in the firm’s Washington, DC, office, authored an article in The Journal of Robotics, Artificial Intelligence & Law, titled “Why a Data Scientist Needs a Lawyer to Correct Algorithmic Discrimination.”

Following is an excerpt (see below to download the full version in PDF format):

We all have heard the stories. A recruiting app that prefers men. Facial recognition software that cannot recognize Black women. Clinical decision support tools for evaluating kidney disease that often give doctors the wrong advice when the patient is Black. Triage software that puts White people ahead of Black and Brown people. The list is long and growing.

These are real harms to our friends and neighbors, costing them economic opportunities or causing physical injury in the case of health care. For organizations seeking to use artificial intelligence (AI)–powered tools, they also create potentially expensive legal liabilities and damage in the court of public opinion.

Local, state, and federal agencies are racing to implement regulations to address these issues. Regardless of the domain, a common thread in the current and proposed regulations is bias. Soon, the Department of Health and Human Services may require that users of algorithms in health care evaluate those algorithms for bias. New York City presently requires all AI-powered selection and hiring tools to be audited for bias, and several municipalities are currently considering similar regulations. In the European Union, big tech companies will have to conduct annual audits of their AI systems beginning in 2024, and the upcoming AI Act will require audits of "high-risk" AI systems.

But what exactly do the antidiscrimination laws require? The law is often unclear. Taking age discrimination as just one example, some attorneys would argue that:

  • Age may be considered by an airline as a "bona fide occupational qualification" for safety reasons.
  • Age may not generally be considered by software companies when hiring new programmers.
  • Age, in certain circumstances, may be considered affirmatively as part of a company’s comprehensive recruitment strategy to help older Americans get programming jobs where the software company can establish that its own hiring patterns disadvantaged older workers historically.
  • Age may not be considered in scheduling access to radiological imaging.
  • Age may be considered when deciding who should receive a particular transplanted organ.
  • Age may not be considered when deciding who should go on a physically demanding business trip.
  • Age might unconsciously be considered in promotions so long as there is no statistical evidence of disparate impact overall.
  • Age may not be considered by a health insurance company in deciding whether to cover an expensive procedure.
  • Age may, and really must, be considered when diagnosing macular degeneration.
  • Age may not be considered in targeting advertising for certain credit services in social media.
  • Age may be considered in targeting advertising for certain healthcare products in social media.

If your head hurts now, that is understandable.

Age, like sex, race, and dozens of other demographic factors, is a protected class in America that is supposed to be free from discrimination. But knowing that hardly provides sufficient guidance for the development of a wide range of algorithms that make, or advise on, decisions impacted by age.
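The "statistical evidence of disparate impact" mentioned in the list above is often assessed quantitatively. One widely cited heuristic is the EEOC's four-fifths rule: if a protected group's selection rate is less than 80 percent of the highest group's rate, that is treated as evidence of adverse impact. The sketch below illustrates the arithmetic with hypothetical applicant counts (the numbers are invented for illustration, not drawn from the article or any real audit):

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_ratios(rates: dict[str, float]) -> dict[str, float]:
    """Compare each group's selection rate to the highest group's rate.

    Under the EEOC's four-fifths guideline, a ratio below 0.8 is
    treated as evidence of adverse (disparate) impact.
    """
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

# Hypothetical hiring data, illustrative only.
rates = {
    "under_40": selection_rate(60, 100),    # 0.60
    "40_and_over": selection_rate(30, 75),  # 0.40
}

ratios = four_fifths_ratios(rates)
flagged = {g: r for g, r in ratios.items() if r < 0.8}
# The 40-and-over group's ratio is 0.40 / 0.60 ≈ 0.67, below the
# 0.8 threshold, so this hypothetical process would be flagged.
```

Passing such a screen is, of course, only a starting point; as the list above shows, whether age may lawfully be considered at all turns on context that no single statistic captures.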
