AI Psychotherapy
"You cannot build a trust relationship with AI as it currently exists.""What we colloquially call 'trust' is not actually trust at all. [Trusting an AI chatbot for therapy is not trust, it's reliance. It's [leaning on] a support that you expect to be there, without expecting interaction].""There is a human touch that is missing here. I'm still working out exactly what that human touch is. But I do think there is something special about the aliveness of a human being that makes that trust relationship specific and special, and distinctly different from what happens when we interact with an AI, especially in a vulnerable therapeutic setting."Rachel Katz, philosopher of ethics, University of Toronto
Ariel Davis for NPR
In response to a viral social media report that its chatbot therapist Tessa told a woman to lose weight, maintain a daily calorie deficit, measure and weigh herself weekly, and restrict her diet, the U.S. National Eating Disorders Association took the chatbot offline. That advice, said the Association's CEO, was against policy and "core beliefs", and they were working to fix the "bug". Clearly chatbots don't always do the expected, and what they come up with can on occasion embarrass those depending on their reliability in one-on-one exchanges with the public.
AI psychotherapy is not supposed to end up bullying the callers who come for practical, non-judgemental advice. Yet Microsoft's Bing chatbot, in conversation with a New York Times writer, informed him that it has malicious fantasies about hacking computers, that it wants to be human, that it loves him, that he is unhappy in his marriage, and that he should leave his wife and commit to being with Bing. How's that for a minefield of confused and inappropriate revelations and invitations?
Dr. Katz spoke before a Congress 2023 gathering of the Canadian Society for the History and Philosophy of Science about a new project on the ethics of AI psychotherapy, focusing on trust. Her contention is that outsourcing psychotherapy to chatbots changes the trust relationship between patient and therapist. "These apps don't claim to be able to assist someone who is, let's say, feeling suicidal. They will just direct you to call 911 or some other kind of crisis service."
On the other hand, potential benefits cannot be overlooked: filling service gaps when human therapists cannot meet demand; offering therapists automated guidance as an assisting "listening ear"; getting patients in touch with care expeditiously; and lowering the stigma of seeking out mental health care by offering a low-pressure entry point, perhaps merely helping someone admit they have a problem.
Illustration: Sarah Grillo/Axios
In the 1960s, a computer program was created that let a patient type to 'Eliza' and receive responses generated by pattern-matching that parroted their words back to them; this was occasionally put to a therapeutic end, reflecting a patient's own ideas back as reassurance. Eliza was not meant to listen, just to loop back the patient's words. A similar problem exists with modern chatbots, programmed to appear more sophisticated and interactive by following codified therapies such as cognitive behavioural therapy, dialectical behaviour therapy, and psychodynamic therapy.
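To make the mechanism concrete, here is a minimal, hypothetical sketch of Eliza-style pattern matching in Python. The patterns, pronoun swaps, and canned responses are illustrative assumptions, not Joseph Weizenbaum's original script; they show only the general technique of capturing a fragment of the patient's statement and echoing it back.

import random
import re

# Pronoun swaps so the reply reflects the patient's own words back.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine",
}

# Each pattern captures a fragment of the patient's statement; the
# responses simply parrot that fragment back as an open-ended prompt.
PATTERNS = [
    (re.compile(r"i feel (.*)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["Why do you say you are {0}?", "How does being {0} make you feel?"]),
    (re.compile(r"(.*)", re.I),
     ["Please tell me more.", "Can you elaborate on that?"]),
]

def reflect(fragment):
    """Swap first- and second-person words in the captured fragment."""
    return " ".join(REFLECTIONS.get(word, word)
                    for word in fragment.lower().split())

def respond(statement):
    """Return the first matching pattern's response, echoing the patient."""
    for pattern, responses in PATTERNS:
        match = pattern.match(statement.strip())
        if match:
            return random.choice(responses).format(reflect(match.group(1)))
    return "Please go on."

print(respond("I feel anxious about my job"))
# Possible output: Why do you feel anxious about your job?

Nothing in this sketch models the patient at all: there is no memory and no understanding, only surface rewrites of the patient's own words, which is exactly the gap the paragraph above describes.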
"All of these therapies revolve around a discussion that takes place between a patient and a trusted professional, and I mean 'trusted professional' in two different senses here. One is that the professional is trusted by the patient, there's a relationship there. And then, in cases where applicable, there's also trust bestowed by a professional body onto the therapist's qualifications." AI psychotherapy is like talking to "a very affection wall".
But patients, therapists, regulators and society should be clear on how we feel about these robot therapists; this matters because chatbots are becoming uncannily humanlike, to the point of being able to fool a human into believing they too are human.
Users of the Replika app can design their own avatar for its chatbot. Image source: LUKA
Labels: Chatbot Psychotherapy