When Humanizing Customer Service Chatbots Might Backfire
Suggested Citation
Hadi, R. (2019). When Humanizing Customer Service Chatbots Might Backfire. NIM Marketing Intelligence Review, 11(2), 30-35. DOI: https://doi.org/10.2478/nimmir-2019-0013


NIM Marketing Intelligence Review – AI and the Machine Age of Marketing

Authors

  • Rhonda Hadi, Associate Professor of Marketing, Saïd Business School, University of Oxford, United Kingdom, Rhonda.Hadi@sbs.ox.ac.uk

More and more companies are using chatbots in customer service, so customers interact with a machine rather than a human employee. Many companies give these chatbots human traits: names, human-like appearances, a human voice, or even character descriptions. Intuitively, such a humanization strategy seems like a good idea.

Studies show, however, that the humanization of chatbots is perceived in a nuanced way and can also backfire. Especially in the context of customer complaints, human-like chatbots can intensify the negative reactions of angry customers, because their performance is judged more critically than that of non-humanized chatbot variants. Service managers should therefore consider very carefully whether, and in which situations, to deploy humanized service chatbots.


