Giving bots personality is counter-productive

Conversational AI is the biggest thing to hit CX in 25 years. We need to make sure we introduce it in the right way: one that delivers for our customers and protects our staff.

Customers need answers, and they need them now. They need to renew a policy, upgrade a package, or transfer out. Handling customer queries well has long been the heartbeat of any successful business.

For years companies have been looking to drive self-service without damaging service quality. There have been ATMs, IVRs and apps. These have had some effect, but they tended to add new channels without reducing the need for human beings to handle requests. Now things are changing fast: artificial intelligence systems are finally starting to deliver. I read recently that a “chatbot” deployed at an insurance company displaced 11% of inbound traffic within two days. Just five years ago, the same business result would have taken a nine-month project.

All the larger cloud companies are offering natural language query infrastructure: Microsoft has its own offering, Amazon has technology based on Alexa, and Google provides natural language recognition too. In the wake of these giants, smaller consulting companies and companies with bespoke chat architectures are appearing, promising rapid displacement of the need for human interactions.

Covid-19 has accelerated this need. Companies want resilience, and being able to switch on bots to handle queries reduces the risk that sick staff will leave them unable to service customers.

I get it: this is, overall, a great technology, and customer service will feel very different five years from now, as many routine requests are handled without any human interaction.

But we as an industry have two big problems with this. The first is our service staff. This wave of technology is the first that will have a truly serious impact on contact center jobs. Yes, you have heard it all before: ATMs were supposed to displace bankers, but more back-office jobs appeared, and IVRs and apps just resulted in other request types coming in. But this IS different. Customers indicating what they want and getting an accurate completed action might not be new, but the ability to respond with a follow-up question and make further flexible queries through the same channel is. Customers will be less tempted to jump out to the phone or chat channel than they are when stuck with an app.

We are going to have to think hard about the important characteristics of the staff who will still be dealing with queries in this conversational-AI-enabled world. Reflect for a moment on a time you reached out to a company for help. When things are difficult, we either want a solution with no fuss, or we want help from someone precisely because of their humanity. It matters to us that we know that they know we are in trouble. When a person types “I’m sorry you missed your flight and you are stuck in Heathrow”, it is different from receiving exactly the same words from a computer – so long as we know it is a human and we believe their sentiment is genuine.

So here is the first place where we need to be more careful. Daniel Dennett, probably the leading thinker today on the social impact of machine intelligence, makes a big deal of the importance of not building systems that pretend to interact with you like a human: “When you are interacting with a computer, you should know you are interacting with a computer. Systems that deliberately conceal their shortcuts and gaps of incompetence should be deemed fraudulent, and their creators should go to jail for the crime of using an AI that impersonates a human being.” (From Bacteria to Bach and Back).

When we build cute chatbots, we undermine our service staff’s ability to offer empathy. If our computers are rolling out fake “I’m so sorry” or “I apologise”, then when our staff use these very human, empathy-building phrases they count for less. Our front-line staff’s ability to offer empathy is becoming their most important skill; we should not undermine it by creating difficult contexts.

What about customers, though – don’t they like a more humanlike interaction? There is plenty of material on the internet to say they do. In fact, you can download a chatbot therapist developed at Harvard that claims to have good data showing that people who interact with it have fewer depressive episodes!

My claim is that customers do not like fake politeness, especially in standard service interactions. Who likes an obsequious ATM? As this technology becomes more prevalent and more effective, customers will increasingly mistake anthropomorphised bots for humans, and the resulting feelings of betrayal and even disgust will make the deep distrust Dennett warns about all too real. This risk is not obvious yet, but it is coming fast, and you can deal with it now at little cost as you set up your conversational AI programmes.

I have five suggestions for how to think about the development of conversational AI in your company.

1. Stop calling this technology “bots”. If you anthropomorphise at the beginning, it is hard to get out of the habit. These are conversational AI systems (con-AIs).

2. Involve your front-line staff in the engagement. Deploy your first con-AI as an internal system to support your team leaders and staff. Then involve them in choosing the transactions to displace, and engage them in training customers on the new capabilities as they come online. Staff can be partners or adversaries in this change.

3. This is a new technology, but it does not have to be a new application. Customers can make requests through email or text and interact with your new capabilities equally well. In fact, doing things this way further reduces the temptation to build fake human impersonation.

4. Be clear with customers and staff about your communications standards for con-AIs. If a customer receives a “please” or some sympathy, can they be sure it comes from a human? Spend some time getting these standards clear; they are a future differentiator for your company.

5. As you deploy, you will get new types of information about what customers really want. Find ways to feed this into your executive team and the product design team.
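A communications standard like the one in point 4 can even be enforced mechanically at the moment a con-AI reply is sent. Below is a minimal sketch in Python; the phrase list, function names and policy are hypothetical illustrations for this article, not part of any particular platform.

```python
# Illustrative sketch only: a hypothetical check that enforces a
# communications standard reserving empathy phrases for human agents.
# The phrase list, names and policy here are invented examples.

RESERVED_FOR_HUMANS = [
    "i'm so sorry",
    "i apologise",
    "i understand how you feel",
]

def reserved_phrases_in(reply: str) -> list[str]:
    """Return any human-reserved phrases found in a con-AI draft reply."""
    lowered = reply.lower()
    return [phrase for phrase in RESERVED_FOR_HUMANS if phrase in lowered]

def enforce_standard(reply: str) -> str:
    """Pass a draft reply through, or reject it if it fakes human empathy."""
    found = reserved_phrases_in(reply)
    if found:
        raise ValueError(f"con-AI reply uses human-reserved phrases: {found}")
    return reply
```

Wired into the outbound message pipeline, a check like this makes the standard auditable rather than aspirational: the con-AI physically cannot send “I’m so sorry”, so when a customer reads those words they know a human wrote them.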


Bots are not people; it’s insulting to pretend they are.

Posted on 01/06/2020 by Cormac Murphy
