In 2012, a large-scale Italian study found a close relationship between physician empathy and the clinical outcomes of diabetic patients. This natural gesture was also found to help reduce symptom severity in patients suffering from the common cold.

Apart from empathy, the mere ability to recognize emotions is believed to aid doctors in the early detection of disorders like Parkinson's, in preventing suicide, and in supporting people with autism. Unlike sympathy, perceiving pain in others is a two-step process: first, emotionally sharing the pain with the person who is actually feeling it; second, making a cognitive appraisal of that pain.

Because of this mental burden, setting emotions aside may enable doctors to make a sounder judgment of a situation or to perform a procedure. On the other hand, nurses may find hiding their emotions a way to cope with work stress.

Humans like to counter one concern with another. As we continue to explore whether medical professionals need to incorporate emotions into their practice, and how much they should embrace them, we begin to think about our non-human counterparts.

Artificial Empathy is not empathy

As mentioned, empathy and sympathy are different facets. We have mirror neurons which "simulate in ourselves the emotions we observe in others. It's like understanding from within", social neuroscientist Pascal Molenberghs told ABC News. We do not need to draw clear lines between each and every emotion we experience, but AI requires that.

In order for AI to "feel" the way we do, we have to dissect our emotions, turn them into code, and feed them in as algorithms. In this way, what AI does is identify and express that we are "in pain" or "feeling happy", but AI itself is not "feeling" anything.

As Danielle Krettek, founder of the Google Empathy Lab, told Vice, "the real empathic connection, and the idea of being self-aware – I think is a uniquely human thing." So what AI achieves is a high-level imitation. After all, recognition does not mean understanding.

Not if AI remains assistive

Most AI systems receive very targeted training and possess a very specific set of skills; others, like chatbots that aspire to artificial general intelligence (AGI), are not yet capable of accomplishing challenging tasks. Thus far, there is still limited knowledge of whether narrow AI can be turned into AGI, or vice versa.

Will a truly skillful robot, capable of distinguishing and reciprocating human emotions, also possess the kind of medical know-how that a senior specialist has? Or will the AI be just like a human doctor, finding it overwhelming to juggle expressing empathy with providing medical solutions?

"The good physician treats the disease; the great physician treats the patient who has the disease." We know that AI has the potential to achieve the former. For the latter, perhaps it is best for AI to relieve doctors of administrative tasks and to cover duties in areas where medical professionals are scarce, so that doctors can gain more time to show empathy.


Author Bio

Hazel Tang is a science writer with a background in data and an interest in current affairs, culture, and the arts; a non-med from an (almost) all-med family. Follow on Twitter.