This summer, my Facebook account was permanently “disabled.” I had been helping another mother navigate a medical decision for her child, a young adult with a degenerative condition. It wasn’t medical advice. It was compassion, drawn from my own experience as a medical mother, a certified coach, and years of teaching courageous communication in a medical school.
Meta AI flagged the interaction as a violation of Community Standards concerning child abuse. My words were assessed by an algorithm that failed to differentiate between exploitation and support. My account, along with years of advocacy, caregiving, and connection, vanished overnight.
It took six weeks, numerous appeals, and a friend-of-a-friend inside the company before I reached a human being who confirmed it had been an error.
By the time my account was reinstated, something within me had changed.
**Algorithms cannot comprehend context or care.**
For families like mine (parents managing rare diseases, disabilities, or chronic illnesses), online communities have become essential lifelines. This is where we turn when the rest of the world is asleep. I can post in a support group in the middle of the night and connect with another parent across the country, or around the globe, who understands. Someone who doesn’t need the backstory to respond with empathy.

Peer-to-peer communities, often hosted on platforms such as Facebook and Instagram, have quietly become part of our public health infrastructure. They ease isolation, reduce caregiver stress, and improve engagement with care plans. Yet these same spaces are now being scrutinized by algorithms that flag “dangerous content.” When an AI system cannot tell the difference between misinformation and a parent expressing fear or uncertainty, it can silence the very support families rely on. When discussions of fear, prognosis, or end-of-life care are automatically labeled suspicious, stripped of context, and removed from the community, we risk losing the ability to talk about the hardest parts of medicine altogether.
**When we suppress words, we lose people.**
There’s a subtle irony here. Medicine already struggles with language: the words we avoid, the silences that form around suffering, disability, and uncertainty.

Now, those silences are being automated.
If algorithms begin dictating which stories are appropriate to share, we risk losing the spaces where caregivers and families process what cannot be fixed.
These discussions are not trivial. They are vital to healing.
Clinicians should care about where families find support and how those spaces are being shaped. Because if families cannot talk about their fears online, in settings built for comfort and connection, they may stop talking about them altogether. Especially in the clinic.
**Engagement is paramount.**
Doctors are concerned about misinformation online. And rightly so. Social media is filled with false expertise and outright fabrication. But the answer isn’t censorship. It’s engagement. Families seldom turn to Facebook out of distrust of their doctors. They join because they want to be heard. They want someone to accompany them in the uncertainty. When medicine withdraws from the conversation, we allow fear to flourish in silence.
Health care professionals must be aware of and engaged in these digital spaces. To model, not to surveil. To collaborate on what respectful, evidence-based, compassionate dialogue can look like. Doctors, nurses, allied health professionals, and educators all have a role in nurturing healthy peer-to-peer support networks. The same empathy shown at the bedside can be extended to the comment section.
Connection cannot be automated. Listening cannot be outsourced.
**We need to cultivate gentler communities.**
As a mother-scholar, I move between the worlds of clinical education and coaching. In my courses and workshops, I remind health care professionals that engagement is not a distraction from professionalism. It is part of it. I’m rebuilding those gentler spaces through my Substack, The Soft Bulletin, and through my coaching work with clinicians and caregivers. I’m not abandoning connection. I’m restoring it. What I want, and what I believe many clinicians and caregivers want, is a kinder kind of community. One that values curiosity over compliance, listening over labeling, and conversation over control.
As health care professionals, we must ask: Where are our patients finding connection? And what happens when the algorithms governing those spaces decide their words are dangerous?
If we wish to safeguard mental health, trust, and humanity in medicine, we must continue the conversation.
Because engagement is vital. Healing does not happen in isolation. It unfolds in dialogue. Even, and especially, in the hardest conversations.
*Kathleen Muldoon is a certified coach dedicated to empowering authenticity and humanity in health care. She is a professor in the College of Graduate Studies at Midwestern University – Glendale, where she pioneered innovative courses such as humanity in medicine, medical improv, and narrative medicine. An award-winning educator, Dr. Muldoon was named the 2023 National Educator of the Year by the Student Osteopathic Medical Association. Her personal experiences with disability sparked a deep interest in communication science and public health.*