AI chatbots cannot replace human interactions in the pursuit of more inclusive mental healthcare

What will the future of mental healthcare look like for those who currently fall through the gaps? There is hope that AI chatbots will meet a rising demand on healthcare systems to provide care to meet the shadow pandemic in mental health. Chatbots are viewed as improving efficiency, affordability, convenience, and patient-driven access with an implicit assumption that this will improve health equity and social inclusion. There are, however, three critically therapeutic aspects of in-person outpatient mental healthcare that are overlooked in discussions about chatbot alternatives: 1) the way mental illness compromises an individual's motivational and self-advocacy capacities, especially for those who are socially marginalized; 2) the embodied nature of empathic communication during any clinical encounter that involves attending to complex non-verbal cues; and 3) how social connections provided by in-person clinics provide indirect social benefits that are not part of a clinical checklist. These three challenges entail corresponding ethical risks of not meeting the obligation to respect patients as persons, to provide empathic care as part of beneficence, and to provide care inclusively to meet demands for fairness and justice. This short communication makes the case for why humans, not chatbots, should be available as first-line mental healthcare providers.


Bibliographic Details
Main Authors: Julia E.H. Brown (Author), Jodi Halpern (Author)
Format: Book
Published: Elsevier, 2021-12-01.
Subjects:
Online Access: Connect to this object online.

MARC

LEADER 00000 am a22000003u 4500
001 doaj_9b557d4e5a5c4f33aa18b93ee1c5be5f
042 |a dc 
100 1 0 |a Julia E.H. Brown  |e author 
700 1 0 |a Jodi Halpern  |e author 
245 0 0 |a AI chatbots cannot replace human interactions in the pursuit of more inclusive mental healthcare 
260 |b Elsevier,   |c 2021-12-01T00:00:00Z. 
500 |a 2666-5603 
500 |a 10.1016/j.ssmmh.2021.100017 
520 |a What will the future of mental healthcare look like for those who currently fall through the gaps? There is hope that AI chatbots will meet a rising demand on healthcare systems to provide care to meet the shadow pandemic in mental health. Chatbots are viewed as improving efficiency, affordability, convenience, and patient-driven access with an implicit assumption that this will improve health equity and social inclusion. There are, however, three critically therapeutic aspects of in-person outpatient mental healthcare that are overlooked in discussions about chatbot alternatives: 1) the way mental illness compromises an individual's motivational and self-advocacy capacities, especially for those who are socially marginalized; 2) the embodied nature of empathic communication during any clinical encounter that involves attending to complex non-verbal cues; and 3) how social connections provided by in-person clinics provide indirect social benefits that are not part of a clinical checklist. These three challenges entail corresponding ethical risks of not meeting the obligation to respect patients as persons, to provide empathic care as part of beneficence, and to provide care inclusively to meet demands for fairness and justice. This short communication makes the case for why humans, not chatbots, should be available as first-line mental healthcare providers. 
546 |a EN 
690 |a AI mental healthcare 
690 |a Clinical empathy 
690 |a Self-advocacy 
690 |a social inclusion 
690 |a chatbots 
690 |a Mental healing 
690 |a RZ400-408 
690 |a Public aspects of medicine 
690 |a RA1-1270 
655 7 |a article  |2 local 
786 0 |n SSM - Mental Health, Vol 1, Iss , Pp 100017- (2021) 
787 0 |n http://www.sciencedirect.com/science/article/pii/S2666560321000177 
787 0 |n https://doaj.org/toc/2666-5603 
856 4 1 |u https://doaj.org/article/9b557d4e5a5c4f33aa18b93ee1c5be5f  |z Connect to this object online.