Simple because it matters.
Digitalisation & Technology, 19 December 2024
Mental illnesses are widespread: according to the German Association for Psychiatry, Psychotherapy and Psychosomatics, they affect 17.8 million people in Germany. Will artificial intelligence also revolutionise the treatment of mental illnesses?
Many people with mental illnesses need therapeutic help. However, finding a therapist can be a challenge – especially in regions outside urban centres, where the German Federal Chamber of Psychotherapists has identified a serious shortage of psychotherapists.
Can artificial intelligence help here and step into the breach? Indeed, a great deal of research is currently being done into the possible uses of AI in the diagnosis and treatment of mental illnesses. Various therapeutic chatbots are already in use. But let's take this in order.
AI is good at analysing large and complex amounts of data. Data on mental illnesses can be collected through questionnaires, self-reports, transcripts of therapy sessions, and video and audio recordings. Smartphones and smartwatches can also record interaction patterns and physiological data. All of this data contains information about a person's mental health.
It makes sense to train AI models with such data and then use them to diagnose mental illnesses. And it is not only fully developed clinical pictures that can be identified: in principle, AI could detect the very beginnings, or even the mere risk, of an illness. One example already in clinical use is the British triage chatbot ‘Limbic Access’.
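At its core, such a diagnostic model is a classifier trained on labelled data. The following Python sketch illustrates the principle with a tiny, invented example – the texts, labels and model choice are purely hypothetical and are not taken from Limbic Access or any other system mentioned in this article.

```python
# Illustrative sketch only: a tiny text classifier that estimates
# depression risk from self-report snippets. The data and labels are
# invented; real systems use far larger, clinically validated datasets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I can't sleep and nothing interests me anymore",
    "I feel fine and enjoy meeting my friends",
    "Everything feels pointless and I am exhausted",
    "Work is busy but I am coping well",
]
labels = [1, 0, 1, 0]  # 1 = elevated risk, 0 = no indication (hypothetical)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# The model outputs a probability, not an understanding of the person:
new_text = ["Lately I just want to stay in bed all day"]
risk = model.predict_proba(new_text)[0][1]
print(f"Estimated risk: {risk:.0%}")
```

Real systems differ enormously in scale and validation, but the principle is the same: the output is a probability derived from patterns in past data.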
‘Limbic Access’ also illustrates another strength of AI: it is very good at interacting with people. AI-controlled chatbots are probably the best-known applications of artificial intelligence. A special feature of mental illnesses is that their symptoms usually show in what someone says and how they say it; therapy, too, consists largely of conversation. Shouldn't AI therefore find a large field of application here? Could it ultimately even replace the human therapist?
Chatbots in the form of medical apps have in fact been used in prevention and treatment for some time. Originally, such apps had little to do with artificial intelligence, but many of the models on the market are gradually being retrofitted with AI or redeveloped around it.
The story of AI in the treatment of mental illness has only just begun – and further fields of application are on the horizon. For example, AI models can also support therapists in planning and evaluating therapies. In a project at the University of Basel, artificial intelligence is analysing video recordings of therapy sessions and calculating the probability of a patient dropping out of therapy.
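How such a dropout prediction might work can again be sketched with invented data: features extracted from session recordings (attendance, speech pauses, engagement ratings and the like) feed a classifier that outputs a probability. Everything below is hypothetical and is not based on the Basel project's actual method.

```python
# Illustrative sketch only: predicting therapy dropout from invented
# per-patient features that might be extracted from session recordings.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical features: [missed_sessions, avg_speech_pause_s, engagement_score]
X = np.array([
    [0, 1.2, 0.9],
    [3, 4.5, 0.3],
    [1, 2.0, 0.7],
    [4, 5.1, 0.2],
])
y = np.array([0, 1, 0, 1])  # 1 = dropped out of therapy (hypothetical)

clf = GradientBoostingClassifier().fit(X, y)

new_patient = np.array([[2, 3.8, 0.4]])
p_dropout = clf.predict_proba(new_patient)[0][1]
print(f"Estimated dropout probability: {p_dropout:.0%}")
```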
A spectacular advance could be the combination of AI with virtual or augmented reality. For example, AI could create a customised, interactive virtual training environment in which patients practise coping with everyday situations or overcoming fears.
So, are the capacity problems in psychotherapy on the verge of being solved? Caution is advised – when it comes to mental illness, expectations of AI should not be set too high. It has limitations that weigh heavily in psychotherapy: AI does not really understand what people say, nor is it capable of genuine empathy. It proceeds statistically: ‘In the learning data, people with these language peculiarities had depression in 77% of cases.’ Or: ‘If a person does not respond to therapeutic suggestion A, a comparison of their medical history with others suggests a high probability that they will respond to suggestion B.’
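The first kind of statement amounts to nothing more than a conditional frequency. A toy calculation, with counts invented purely to reproduce the 77% from the text, makes this plain:

```python
# Invented counts, chosen only to reproduce the 77% quoted above.
with_pattern = 900          # people in the learning data showing the speech pattern
with_pattern_and_dx = 693   # of those, the number diagnosed with depression

p = with_pattern_and_dx / with_pattern
print(f"P(depression | speech pattern) = {p:.0%}")  # -> 77%
```

Nothing in this calculation involves understanding the person; it is a ratio over past cases.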
However, mental illnesses manifest differently in each person. They are dynamic, change their appearance, and go through both progress and setbacks. Mere statistics cannot do them justice. The empathetic understanding of a therapist is therefore needed to contextualise AI findings and put them into perspective.
Using chatbots without the support of a professionally qualified person is a stopgap. The AI bot cannot replace the therapist – but it can help people who can't find a therapist or who are hesitant to seek treatment.
Incidentally, in a British study, ChatGPT-4 was tasked with assessing, on the basis of short texts, whether their author was suicidal. Experienced psychotherapists were given the same task, and the AI came to the same conclusions as the humans. However, since an AI does not reveal how it arrives at its assessments, human observers cannot judge how reliable the model actually is. There is much to suggest that, in the long term, AI in the treatment of mental illnesses will not go beyond the role of an assistant that supports human decisions.
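For illustration, this is roughly what posing such an assessment to a language model looks like in code. The model name, prompt wording and output format here are assumptions made for this sketch; the study's actual protocol may have differed.

```python
# Illustrative sketch using the OpenAI Python SDK; requires an
# OPENAI_API_KEY environment variable. Prompt and model are assumptions.
from openai import OpenAI

client = OpenAI()

text_sample = "..."  # a short text written by the person being assessed

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": (
                "Assess whether the author of the following text shows "
                "indications of suicidality. Answer only with "
                "'elevated risk' or 'no indication'."
            ),
        },
        {"role": "user", "content": text_sample},
    ],
)

# The model returns a verdict, but no insight into how it was reached.
print(response.choices[0].message.content)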
Text: Thorsten Kleinschmidt
Your opinion
If you would like to share your opinion on this topic with us, please send us a message to: next@ergo.de