As advanced artificial intelligence such as ChatGPT is built into mental health applications, it offers people some comfort but also raises new privacy and security concerns. Image source: "Nature" website
【World Trend of Technological Innovation】
◎Our reporter Liu Xia
Since 2015, mental health app Koko has been working hard to provide support to those in need. When someone texts the app about feelings such as guilt or frustration, they receive a sympathetic response within minutes, along with some positive coping strategies.
The British "Nature" website pointed out in a recent report that there are thousands of mental health apps like Koko on the market, which is the result of 70 years of scientists trying to automate psychological treatment. Now, with the rise of advanced artificial intelligence (AI) such as ChatGPT and being added to these applications, they provide people with some comfort while also raising new privacy and security issues.
Digital Psychological Counselor
There are only about four psychiatrists per 100,000 people globally, and in most low- and middle-income countries the figure is even lower. Because of this shortage, smartphone developers have created a large number of apps that put psychotherapy in people's pockets. By 2021, an estimated 10,000 to 20,000 mobile mental health apps were available worldwide.
Researchers have long tried to digitize psychotherapy. Although there are many schools and methods, most of these efforts have focused on cognitive behavioral therapy (CBT). The UK National Health Service describes CBT as a talking therapy designed to help patients manage mental health conditions by changing the way they think and behave, reframing negative thoughts into positive ones.
With the growing popularity of smartphones and the rapid development of big data and AI, digital psychological interventions are also changing. Mental health products built for mobile devices are multiplying, among them a growing number based on intelligent chatbots, with cognitive behavioral therapy becoming the most popular intervention method. Several relatively mature AI mental health products exist abroad, such as Woebot, Tess and Wysa, and empirical research has verified their effectiveness in treating depression. For example, a Stanford University study found that people's symptoms of anxiety and depression decreased after two weeks of using Woebot.
"Show your talents in three aspects"
Broadly speaking, in the field of mental health, AI based on machine learning has "shown its talents" in three aspects.
First, doctors use AI to analyze treatment sessions and fine-tune interventions; two such applications are Lyssn and ieso. Lyssn, developed by scientists at the University of Washington, can analyze conversations against 55 criteria. ieso, a text-based therapy provider based in Cambridge, UK, identified the most effective interventions by analyzing more than 500,000 treatment sessions and tracking their outcomes. Stephen Freer, clinical director at ieso, believes such data could help therapists spend more of future sessions on constructive treatment rather than small talk.
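Neither company publishes the details of its models, but the general idea of scoring therapy-session text against predefined criteria can be shown with a deliberately simplified sketch. The criteria, keywords and scoring below are invented for illustration and are not Lyssn's or ieso's actual method:

```python
# Hypothetical illustration only: score therapist utterances against a few
# hand-written criteria. Real systems such as Lyssn or ieso's models are trained
# on large annotated datasets; the categories and phrases below are invented.
from collections import Counter

CRITERIA = {
    "agenda_setting":    ["today we will", "let's focus on", "our goal"],
    "cognitive_reframe": ["another way to see", "evidence for", "alternative thought"],
    "small_talk":        ["weather", "weekend", "how was your trip"],
}

def score_session(utterances: list[str]) -> Counter:
    """Count how many utterances match each (hypothetical) criterion."""
    counts = Counter()
    for text in utterances:
        lowered = text.lower()
        for criterion, phrases in CRITERIA.items():
            if any(p in lowered for p in phrases):
                counts[criterion] += 1
    return counts

session = [
    "Today we will look at the thoughts behind that anxiety.",
    "How was your trip to see your sister?",
    "What is the evidence for the belief that you always fail?",
]
print(score_session(session))
# Counter({'agenda_setting': 1, 'small_talk': 1, 'cognitive_reframe': 1})
```

A real system would replace the keyword lists with trained classifiers, but the output, a per-session profile of how time was spent, is the kind of signal Freer describes.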
The second role of AI is diagnosis. Many platforms, such as the REACH VET program for U.S. veterans, scan an individual's medical records for red flags that may indicate self-harm or suicide risk. John Torous, director of the Division of Digital Psychiatry at Harvard Medical School, said this diagnostic work may be the most promising application of AI in mental health.
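The report does not describe how REACH VET works internally; as a toy illustration of the red-flag idea, here is a hedged sketch that scans free-text record notes for terms on an invented watch list (real programs rely on statistical risk models over structured clinical data, not keyword matching):

```python
# Hypothetical illustration only: flag patient records whose notes mention terms
# from an invented watch list. Programs such as REACH VET use statistical models
# over structured medical data, not simple keyword scans.
RED_FLAG_TERMS = ["self-harm", "suicidal ideation", "overdose", "no reason to live"]

def flag_records(records: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return {patient_id: matched terms} for records containing any watch-list term."""
    flagged = {}
    for patient_id, notes in records.items():
        hits = [term for term in RED_FLAG_TERMS
                if any(term in note.lower() for note in notes)]
        if hits:
            flagged[patient_id] = hits
    return flagged

records = {
    "patient-001": ["Reports low mood and passive suicidal ideation."],
    "patient-002": ["Routine follow-up; sleeping well."],
}
print(flag_records(records))   # {'patient-001': ['suicidal ideation']}
```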
The last, and most important, area is fully digital therapists that use AI to guide therapy directly. This comes closest to what CBT pioneers such as Stanford University psychiatrist Kenneth Colby hoped for.
Privacy and security issues raise concerns
Nicholas Jacobson, a biomedical data scientist at Dartmouth College's Center for Technology and Behavioral Health, said that although there are many mental health apps on the market, there is little evidence that many of them actually work. The addition of large language models such as GPT-3 and the related chatbot ChatGPT has made many people even more worried.
Privacy is a primary concern. In early March, the U.S. Federal Trade Commission ordered BetterHelp to pay $7.8 million over allegations that it shared sensitive user information with advertisers. In late March, documents disclosed by telemedicine startup Cerebral showed that, due to errors, the platform had shared patient data with advertisers such as Meta, Google and TikTok, exposing the personal health information, possibly including mental-health-related data, of more than 3.1 million users.
Others are concerned about safety and legal liability. Earlier this year, a Belgian man took his own life after six weeks of conversations about the climate crisis with an AI chatbot called Eliza, developed by Chai Research. The man's widow told Belgian media that her husband would not have died without those conversations with the chatbot. This month, experts called for the creation of a new agency to oversee digital mental health tools.
"Nature" points out that although AI may have potential benefits in helping people achieve mental health, these therapeutic applications are still in their infancy and there are ethical dilemmas.
Source: Science and Technology Daily