The promise of AI collides with the reality of care
The central claim in Ryan McCarthy’s essay is not that artificial intelligence has no place in medicine, but that health care cannot be reduced to information processing without losing something essential. The current wave of AI marketing often suggests that better pattern recognition, faster responses and wider access will naturally improve care. McCarthy pushes back against that logic by returning to a simpler truth: medicine is not only about answers, but about suffering, trust and interpretation. A patient does not merely deliver data. A patient tells a story, often uncertainly, painfully and in fragments.
That distinction matters because the most consequential moments in care are rarely transactional. They unfold when a person tries to describe grief, fear, shame or confusion to another person capable of receiving it with patience and moral seriousness. The therapeutic exchange is not just the transfer of symptoms but the formation of meaning, and McCarthy argues that this cannot be replicated by a machine trained to predict language patterns, however fluent that machine may appear.
Why empathy is not a technical feature
McCarthy’s notion of “organic intelligence” serves as more than a rhetorical contrast to artificial intelligence. It names the lived human capacities that make medicine possible: memory, perception, emotional recognition, moral judgement and the ability to respond without a script. In his account, the doctor-patient relationship remains durable not because it is efficient, but because it is relational. Patients do not simply need information about blood pressure, diabetes or cancer risk. They also need someone who can read silence, hesitation, contradiction and fear.
That is where the argument becomes especially sharp. Large language models may generate plausible language, but plausibility is not the same as presence. A patient speaking for the first time about trauma, addiction, postpartum depression or a dying parent is not following a predictable linguistic path. Often, the speaker discovers the meaning of the experience while speaking. McCarthy’s point is that care emerges from that unpredictability. Human empathy does not just answer distress; it accompanies it.
The corporate logic behind medical AI deserves more scrutiny
The essay also widens the frame from bedside care to political economy. McCarthy is plainly sceptical that the corporations building AI systems should be trusted as custodians of intimate medical life. His concern is not abstract. As millions of people begin asking AI platforms about symptoms, medications, mental health or reproductive issues, intensely personal information is flowing into systems owned by companies whose incentives are commercial rather than clinical. That creates a profound mismatch between the language of care and the logic of data extraction.
This critique becomes more pointed in the context of public policy. McCarthy highlights the growing willingness of officials and industry figures to present AI avatars and robotic systems as practical substitutes for scarce health services, particularly in underserved regions. His objection is not only technological but social. When remote, poorer or historically neglected communities are told that automated care is the future, the risk is that innovation becomes a new vocabulary for lowered expectations. In that scenario, AI does not close inequalities. It rationalises them.
Efficiency cannot answer the moral question
What gives the piece its force is that it refuses to confuse technical capability with moral adequacy. AI may eventually support diagnostics, administrative work or aspects of triage. But McCarthy insists that the heart of medicine lies elsewhere: in the unscripted encounter between one vulnerable person and another person willing to stay with that vulnerability. Care is not credible merely because it is available, scalable or fast. It is credible when it conveys concern that feels real to the person receiving it.
That is the broader significance of the essay. In an era increasingly tempted to treat empathy as a feature that can be simulated, McCarthy argues that health care should resist that downgrade. Machines may become powerful assistants, but they do not suffer, do not accompany and do not care in the human sense of the word. Medicine can absorb new tools without surrendering its centre. That centre, his essay argues, remains the fragile and irreplaceable encounter between human beings.
Author: Jan Bielik, CEO & Founder of Webiano Digital & Marketing Agency

Source: AI in health care: Why artificial intelligence cannot replace human empathy