Alaap B. Shah, Member of the Firm in the Health Care & Life Sciences practice, in the firm’s Washington, DC, office, was featured in “Artificial Intelligence Brings Potential—and Challenges—to Behavioral Health, Addiction Treatment,” an interview by Tom Valentino in Behavioral Healthcare Executive.
Following is an excerpt:
Artificial intelligence offers the potential to reshape behavioral healthcare and addiction treatment in the coming years. Are providers, payers and regulators prepared to keep pace?
Alaap B. Shah, a Member of the Firm in Epstein Becker Green’s Health Care & Life Sciences practice, recently spoke with BHE about the emerging role of AI in behavioral healthcare and addiction treatment, potential legal and regulatory changes, and how providers can leverage AI to maximize reimbursement.
How would you describe the role artificial intelligence is currently playing in behavioral healthcare and addiction treatment?
There has obviously been a huge surge in interest in leveraging artificial intelligence to solve problems across healthcare, and behavioral health is certainly not an exception. A lot of times, what people are trying to do with artificial intelligence in behavioral health depends on the segment you are trying to impact. Some folks are of the mindset that we need to be empowering physicians to do their jobs better: smarter, more efficiently and more effectively, to reduce behavioral health issues, whether it’s addiction management, addiction prevention, suicide prevention or other issues people may have. Others are taking the tack of leveraging artificial intelligence more from a direct-to-patient perspective. They’re of the mindset that the old paradigm of how we’ve taken care of people with behavioral health or addiction issues is broken: it’s too slow and too reactive. And when I say “broken” or “reactive,” I’m referring to things that are geared toward inpatient settings, where the person has already gone through their traumatic event. They’ve already had their overdose, or they’re having a psychiatric issue that has led them to the hospital door, or it’s something in the recovery phase after they’ve already gone through a traumatic event.
All of this is a little too late, in some people’s view. What some people are trying to do with artificial intelligence is get ahead of that process and disrupt it in some manner, saying we can discover these issues much earlier. Perhaps we shouldn’t wait until someone shows up at the hospital door or has to go through recovery programs after the fact. Perhaps we can detect whether they have suicidal tendencies earlier, or help them track their drug use and alert them that they’re taking too many opioids and are at risk of becoming addicted. There are lots of ways people are trying to inject artificial intelligence across the continuum of care.