I was on NPR’s 1A to talk about mental health tech

It was very early for me. My kids were getting out the door to school. I had a cold. But considering all that, I think I did OK on the interview with Jenn White and Michelle Harven of 1A. During the last question, I felt a sneeze coming on and my mind went blank, thinking of nothing but “Do NOT sneeze into this microphone on live radio.” So I did what politicians do: go to your main talking point. Ugh. Embarrassing. Still, it was fun in that “type II fun” kind of way. 10/10 type II fun and would do it again.

https://the1a.org/segments/how-safe-and-effective-are-mental-health-apps/

Thank you to the host, Jenn; the producer, Michelle; and the engineer, Michael (I hope I got your name right; I should have written it down), for all their help. Also, thank you to Dr. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center, and Haesue Jo, head of clinical operations at BetterHelp, for being there to talk about this topic. I appreciated both of your contributions and insights.

The big theme

I didn’t know going into it that I would keep coming back to a singular theme. It ended up being this: any organization providing healthcare must infuse itself with the ethical codes of clinicians to ensure high quality and safety. The problems I see coming out of MH app companies all seem to stem from business folks and investors driving decisions rather than clinicians, which leads to privacy and quality failures.

The notes

I don’t do a lot of podcast appearances or live radio, so I prepared a bit more than I usually would for a conversation, not knowing what questions would be asked. I’ve pasted my notes below.

Power balance between clinical and business leadership, from the executive level on down.

Differentiating MH apps vs. private-practice therapy

  • Big difference is ownership: traditional therapy practices are therapist-owned, so their owners are subject to the ethical codes of their professions.

  • Tech companies are investor-owned, so the folks managing them are not subject to the ethical codes of the professions, which include privacy standards above and beyond HIPAA.

  • Secondary-gain prohibitions: clinicians can’t use clinical relationships for other personal gain.

  • Most therapists offer online therapy via video, usually conforming to CPT codes for 30-, 45-, or 60-minute sessions. Some offer text or phone contact.

  • Illustrative example: "break the glass." In medical environments, you can’t just open any chart; accessing records outside your role gets flagged and audited.

Quality problem:

  • There aren't enough therapists out there to meet current demand.

  • Those therapists are ethically obligated to provide high-quality care and to take on only as many clients as they can serve while maintaining their professional integrity. Administrative help doesn't expand that client count significantly.

  • Leveraging clinicians’ “free time” to make an extra buck isn’t providing high-quality clinical care. It’s distracted care. It’s unethical care.

  • Tech companies aren't contributing to the supply of highly trained clinicians. 

  • Therapists ETHICALLY have the same obligation to a text-therapy client as to every face-to-face client they see. But how can you maintain high quality when seeing a ton of people?

  • Companies like BetterHelp pay on performance variables like word count, but providing quality care may require fewer or more words than the target range. Perverse incentives like this degrade quality.

Privacy problems: 

  • No one outside these companies understands how they maintain a barrier between clinical data and other commercial data. Their privacy policies allow a TON of uses, even as the companies may deny using data that way.

  • "Need to know" standard - you never look at information you don't need in clinical practice. Do MH tech companies abide by this? 

  • Are you using data in ways that fall outside the ethical codes of the ACA, APA, NASW, etc.? Secondary gain?

Bullying problem:

  • Therapist non-disclosure agreements are tight, preventing many from sharing what it’s really like to work at these companies.

  • Cease-and-desist letters go out to loud critics even when those critics are sharing verifiable facts or opinions based on facts.

Acceptable use/solutions:

  • Minimize confusion between individual therapy and apps. If it’s just an app, there’s no concern about it being “therapy.”

  • Self-directed mental health activities can be a great way to give some folks interactive psychoeducation resources, and I think that's the best use of technology in mental health. As a clinician, I would LOVE to give my clients high-quality homework they can do on their phones.

  • Chatbots that can provide immediate tips and mental health resources (with transparency that AI is driving the process) would expand reach without relying on a limited labor pool. They can refer users to live clinicians when appropriate.

  • If any company is doing it right, I think it’s Lyra Health: best-in-class, and the only company I've personally contracted with and with whom I maintain a relationship.

  • For online tools, the Department of Defense has some great, openly available tools for the public.

  • Privacy is still an issue, even with these.

“Move fast and break things” is NOT a good idea in healthcare.

Cerebral scandal and thoughts:

There's a legitimate case for keeping scheduled-drug prescribing open via telehealth. Treatments like Suboxone for opioid use disorder NEED to be prescribable via telehealth because rural areas have so few providers with the Suboxone endorsement on their DEA licenses. Additionally, these aren't bad drugs; they're just drugs that have to be prescribed very carefully. If clinicians lean into their ethical codes and are incentivized for careful clinical care, they can keep prescribing them, but if the incentives drive them toward looser prescribing, the system is a mess.
