- Title
Does a lack of emotions make chatbots unfit to be psychotherapists?
- Authors
Rahsepar Meadi, Mehrdad; Bernstein, Justin S.; Batelaan, Neeltje; van Balkom, Anton J. L. M.; Metselaar, Suzanne
- Abstract
Mental health chatbots (MHCBs) designed to support individuals in coping with mental health issues are rapidly advancing. Currently, these MHCBs are predominantly used in commercial rather than clinical contexts, but this might change soon. The question is whether this use is ethically desirable. This paper addresses a critical yet understudied concern: assuming that MHCBs cannot have genuine emotions, how might this affect psychotherapy and, consequently, the quality of treatment outcomes? We argue that if MHCBs lack emotions, they cannot have genuine (affective) empathy or utilise countertransference. This gives reason to worry that MHCBs are (a) more liable to harm and (b) less likely to benefit patients than human therapists. We discuss some responses to this worry and argue that further empirical research is necessary to determine whether it is valid. We conclude that, even if these worries are valid, this does not mean that we should never use MHCBs. By discussing the broader ethical debate on the clinical use of chatbots, we point towards how further research can help us establish ethical boundaries for the use of mental health chatbots.
- Subjects
INSTANT messaging; MOBILE apps; EMPATHY; MENTAL health; COUNTERTRANSFERENCE (Psychology); DEBATE; MEDICAL quality control; ARTIFICIAL intelligence; MENTAL illness; MEDICAL care; EMOTIONS; PSYCHOLOGICAL adaptation; COMMUNICATION; SOCIAL support; TEXT messages; USER interfaces
- Publication
Bioethics, 2024, Vol. 38, Issue 6, p. 503
- ISSN
0269-9702
- Publication type
Article
- DOI
10.1111/bioe.13299