Bradley Merrill Thompson Quoted in “Who Is Liable for Faulty Artificial Intelligence in Health Care?,” Bloomberg BNA Medical Devices Law & Industry Report, December 6, 2017
Bradley Merrill Thompson, a Member of the Firm in the Health Care and Life Sciences practice, in the firm’s Washington, DC, office, was quoted in the Bloomberg BNA Medical Devices Law & Industry Report, in “Who Is Liable for Faulty Artificial Intelligence in Health Care?” by Sara Merken. (Read the full version – subscription required.)
Following is an excerpt:
Using artificial intelligence in health-care treatment decisions raises questions of who is liable and what standards apply if a patient is harmed.
There is no legal standard to address AI liability in the health space, and the health-care and software industries need to proactively come together to limit negligence in designing the tools, attorneys and technology professionals contacted by Bloomberg Law said. Artificial intelligence and predictive analytics software developers have a responsibility to make the data accessible and clinically actionable to protect doctors and developers from liability, they added. …
Developers of clinical decision support software (CDS), or systems that provide clinicians with knowledge and patient-specific information to enhance decision-making, can aim to avoid liability through a Food and Drug Administration exemption, Bradley Merrill Thompson, an attorney with Epstein Becker & Green PC, told Bloomberg Law. Some CDS systems rely on machine learning, a form of AI that can analyze data and learn from past experiences.
Thompson counsels medical device and other companies on FDA issues, and serves as general counsel for the CDS Coalition, a group that focuses on developing a proposal for defining and differentiating regulated and unregulated CDS software. He’s also a Bloomberg Law advisory board member.
Many developers are working to make use of an exemption from FDA regulation. Under the 21st Century Cures Act, some CDS software is exempt from FDA oversight if the tool uses patient-specific information and gives health-care professionals a reasonable opportunity to review the underlying basis for a recommendation made by a machine, Thompson said. That transparency allows doctors to disagree with a suggestion because they can see how the algorithm came to its conclusion.
The FDA has said it plans to publish guidance in December to explain how the law is being implemented, Thompson said. In the interim, the CDS Coalition created voluntary industry guidelines to help developers qualify for the exemption and avoid liability.
The guidelines, published in August, help manufacturers ensure that the person using the technology is a qualified professional, and help companies design products that reveal as much information as possible about how the machine works, instilling a high level of confidence in the data, he said.