The number of apps targeting mental health has exploded in recent years. But experts have mixed feelings about the efficacy of these apps — aimed at helping a variety of issues from mild anxiety to PTSD — and the privacy policies that come with them.
While mental health apps began popping up before 2020, the market has grown rapidly since the onset of the COVID-19 pandemic, partially due to a widespread decline in the nation’s mental health.
The share of Americans between ages 18 and 44 who received mental health treatment, including therapy or medication, went up from 18.5 percent to more than 23 percent between 2019 and 2021, according to data from the U.S. Centers for Disease Control and Prevention.
Now, there are at least 15,000 mental health apps on the market, the majority of which are unregulated, said Stephen Schueller, an associate professor at the University of California, Irvine and the executive director of One Mind PsyberGuide, a nonprofit that reviews mental health apps.
An increasing number of apps are being designed to connect people with trained mental health professionals to give real-time, real-world access to support. But the bad news is that many of these apps are not evidence-based and the FDA only regulates a small number.
The FDA only regulates apps that function as medical devices, a category that includes treatment apps but not wellness apps. And it can take years for an app, or a specific function of an app, to be approved, according to Schueller.
Schueller noted that many mental health apps are not heavily used. And the number of apps that are “high-quality” or have scientific evidence supporting them is in the extreme minority.
There are only 450 to 600 evidence-based apps on the market, or about 3 percent of all mental health apps, according to Schueller.
Many non-evidence-based apps targeting anxiety, for example, will feature “supportive content,” Muniya Khanna, founder and director of The OCD & Anxiety Institute and chief digital officer at Lumate Health, told The Hill.
Supportive content can include relaxation videos or music aimed at calming or focusing the mind. But Khanna said an app trying to treat anxiety should include some means of allowing users to understand where their anxieties are coming from and how their thoughts are involved in maintaining their cycle of anxiety.
“Most of the apps that are saying that they’re designed for anxiety would only have maybe a little bit of psychoeducation and not really any of the other things,” Khanna said. “The content is just not rich enough to be a standalone support tool.”
When paired with therapy, some evidence-based apps like Mindshift can actually be beneficial, said Anne Marie Albano, professor of medical psychology at Columbia University Medical Center.
“But apps are not meant to or shouldn’t replace therapy,” Albano said.
And if someone uses an app that isn’t evidence-based, or even teletherapy that is not delivered at the intensity or quality they need, their mental health struggles can worsen.
“They can become more hopeless, they can become more stressed, things worsen over time,” Albano said. “And at the same time, they can lose hope that anything is going to make a difference.”
The lack of oversight over mental health apps means that some apps are not abiding by health-related privacy laws.
Mental health apps that connect users with therapists or counselors are usually bound by the same state and federal health privacy laws that govern in-person therapy sessions, though some have run afoul of federal law.
In March, the online therapy and counseling service BetterHelp returned $7.8 million to customers as part of a settlement with the Federal Trade Commission after the company shared health data it had promised to keep private with companies including Facebook and Snapchat.
Other wellness apps that feature guided meditations, chatbots, or questionnaires often share or sell personal information to third-party businesses.
Over 62 percent of the top mental health apps were given a privacy warning label by the Mozilla Foundation’s *Privacy Not Included guide.
For people struggling with their mental health and who want to use an app, the American Psychological Association recommends searching to see if the app creator has published any research to show that their product works.
Something else to look for is whether the company has experts on its board or on a scientific advisory committee, said C. Vaile Wright, senior director of health care innovation in the practice directorate at the American Psychological Association.
Consumers should also check whether the app’s company has privacy and security policies in place and know where their user data is going.