
Automated Empathy in the Contact Center of the Future

April 24, 2017

Customer service is one of the biggest competitive differentiators for today’s businesses. At the same time, advances in artificial intelligence (AI) and automation are supplementing and replacing some traditionally human roles in the contact center. Can AI coexist with human contact center agents to create a new level of customer satisfaction and experience?


While customer service is definitely improving, customers have not yet reached a point where they hold call and contact centers in high esteem. Too much time spent on hold, too many transfers between departments and agents, and too many frustrating, negative experiences have all contributed to a less-than-glowing characterization of the contact center. The cloud and AI could change all that, however, leading to better customer interactions, higher overall levels of service and a more positive view of the contact center in general.

Make no mistake – AI is a poor substitute for real human interaction, and chatbots won’t be replacing human contact center agents any time soon. "If you ever used AI bots, [such as] Siri, you know how often they get simple questions wrong — now imagine letting it deal with your customers directly," said Gadi Shamia, COO at Talkdesk, a cloud-based contact center provider, in a recent interview with CIODive. Rather, AI will be used to route customers to the best available human agent, significantly improving the customer experience. AI bots can also quickly reference customer history and use that information to route the customer, bypassing the IVR.
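
To make the routing idea concrete, here is a minimal sketch, in Python, of how a bot might consult a caller's history and hand the call straight to a suitable human agent instead of an IVR menu. The CRM lookup, the skill tags and the fallback rule are hypothetical illustrations, not Talkdesk's actual implementation.

    # Hypothetical sketch of history-aware call routing (illustrative only).
    from dataclasses import dataclass, field

    @dataclass
    class Agent:
        name: str
        skills: set = field(default_factory=set)
        available: bool = True

    def route_call(caller_id, crm, agents):
        """Pick the best available human agent for a caller, skipping the IVR."""
        history = crm.get(caller_id, {})            # past interactions, open tickets, etc.
        topic = history.get("last_issue", "general")
        # Prefer an available agent whose skills cover the caller's likely issue.
        for agent in agents:
            if agent.available and topic in agent.skills:
                return agent
        # Otherwise fall back to any available agent rather than an automated menu.
        return next((a for a in agents if a.available), None)

    agents = [Agent("Ana", skills={"billing"}), Agent("Raj", skills={"returns", "general"})]
    crm = {"+15551234567": {"last_issue": "billing"}}
    print(route_call("+15551234567", crm, agents).name)   # -> Ana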

The contact center of the future will also employ AI to automate empathy. While today’s popular bots like Siri and Alexa are pretty good at understanding speech, they fail when it comes to interpreting tone and emotion. That’s a problem, because a good chunk of customers are already unhappy or even angry when they call a contact center. Research firm Mattersight analyzed more than 118,000 customer service calls and found that customers exhibited emotional signs of anger 54 percent of the time during the first half of a call, and displayed signs of sadness and fear more than half of the time as well.

The next step for AI designers is to use speech recognition to detect and analyze emotions so that bots can exhibit empathy. That’s no easy feat, but Mattersight said NASA-based personality models and speech recognition algorithms may eventually be used to analyze speech tone, tempo, grammar and syntax to better handle emotional customers.
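
A toy example illustrates what analyzing tone and tempo might look like in code. The feature names and thresholds below are invented purely for illustration; they are not Mattersight's NASA-based personality models or any production algorithm.

    # Toy emotion scoring from simple acoustic and linguistic cues (illustrative only).
    def classify_emotion(features):
        """Return a coarse emotion label from pitch, speaking tempo and word choice."""
        pitch_hz = features.get("mean_pitch_hz", 150)       # raised pitch often signals arousal
        words_per_min = features.get("tempo_wpm", 140)      # rapid speech can signal agitation
        negative_words = features.get("negative_word_count", 0)

        arousal = (pitch_hz > 200) + (words_per_min > 180)  # count how many arousal cues fire
        if arousal >= 1 and negative_words >= 3:
            return "angry"
        if negative_words >= 3:
            return "sad_or_fearful"
        return "neutral"

    print(classify_emotion({"mean_pitch_hz": 230, "tempo_wpm": 190, "negative_word_count": 5}))
    # -> angry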

"In the future, AI will be plugged into all knowledge sources, so it will continuously learn based on a customer’s past activity — what they were doing on the website, what have you previously called about, whether a claim just denied or a late charge just put on a bill, etc.," Andy Traba, vice president of data science at Mattersight, told CIODive. "What this will allow is that when you call into an enterprise, an intelligent routing engine will predict the intent behind the call and then determine the appropriate routing treatment."




