AI and Empathy

All this AI stuff is bending my mind.

AI has been around for quite a while, but our experience with ChatGPT has profoundly changed our perceptions, feelings, and expectations about it.

We wonder which human skills and tasks will become obsolete as machines become able to perform some of them so much better, faster, and cheaper.

People have suggested that AI won’t be able to replicate human empathy.

I wonder.

Based on our experiences with ChatGPT, it’s easy to imagine a human-like deepfake avatar that has learned to use the language of active listening. It wouldn’t be very convincing, though, if it sounded like HAL in the 1968 film 2001: A Space Odyssey.

It’s also easy to imagine, however, that future AI systems will be able to mimic non-verbal communication through vocal pitch, expressiveness, volume, fluency, engagement, mirroring, and emphasis, as well as body language involving posture, head movement, hand movement, eye gaze, and facial expression.

So avatars could look and sound like Ava in the fabulous 2015 film Ex Machina (which apparently is available on various streaming services). Take a look at this trailer.

Will humans be able to recognize that the creatures performing this seemingly empathetic behavior are avatars? Even if we do, will we care?

Of course, many humans are not great listeners, so avatars wouldn’t be competing with some standard of perfection. Even someone as sensitive as New York Times columnist David Brooks struggled to listen well to a friend with depression. I described this in a post about “really listening,” which is practically necessary to feel and express empathy.

If a customer service avatar sounds really empathetic when “talking” with you about your problem, how would you feel? Is the avatar really empathetic? Why or why not?


5 thoughts on “AI and Empathy”

  1. Hey all! There is a huge chasm between mimicking empathetic behaviors and actually experiencing empathy. It may be nice for us if technologies exhibit empathetic behaviors–maybe we could even experience personal growth or aha moments as a result of these interactions–but these technologies are not “empathizing” with us. The superficial behaviors of empathy, whether performed by an AI or a person, can hardly approach something like the empathetic and compassionate connection that one might have in deep conversation with friends, therapists, and loved ones.

    1. Hi Jen.

      I agree that AI-simulated empathy is not real empathy.

      I wrote this post reflecting on suggestions that AI would not be able to replace jobs requiring empathy.

      I’m still not sure. I can imagine that in the future, bots could simulate empathetic behavior to a significant degree.

      I wonder if people generally would recognize AI-simulated empathy and care if they deal with such bots, especially if the bots effectively solve their problems.

      I would like to think that people generally would notice and be turned off by AI-simulated empathy.

      But it’s impossible to really know. This requires speculation about how AI will be developed perhaps 5, 10, or 20 years from now – and how people will react as AI becomes increasingly sophisticated and we become acclimated to AI being increasingly integrated into many aspects of our daily lives.

      What do you all think?

      1. I recommend Meghan O’Gieblyn’s book: God, Human, Animal, Machine: Technology, Metaphor, and the Search for Meaning. She writes so beautifully and thoughtfully what it means to be human (and whether that could be replicated in technology) and why questions of humanity, consciousness, empathy, etc. are so difficult for us to grapple with.

  2. If you have any doubts about technology’s eventual ability to convincingly mimic empathy, I’d suggest listening to the Babbage podcast on February 8, 2023. (Babbage is a science and technology podcast by the Economist magazine.) In it, host Alok Jha interviews Ameca, a robot created by Engineered Arts in the U.K. I’d suggest searching Ameca on YouTube as well, to get the full effect. In short, Engineered Arts focuses on the dynamics of human-robot communications and has developed a robot with a human-ish face and hands, making it capable of facial expressions and gestures as well as using pitch and tone in its voice. It’s astonishing but not perfect. Yet.

  3. It seems that, going forward, empathy won’t remain an exclusive domain of humans. In fact, empathy apparently is one of the qualities that appears easiest for machines to adopt. The customer service arena is a good mixing bowl of customer interactions, analysis of the demand-desire-design matrix alongside customer psychology, and the application of technology and innovation. A customer service executive called me yesterday, expressing a genuinely apologetic tone and admitting that no service had yet been rendered to my water filter machine under the currently active annual maintenance contract, even though the service period is about to elapse. Such a gesture could hardly be a human expression.
