
Heartificial empathy – Review of new book by Minter Dial

Pull MD Chris Bullick met with author and thought leader Minter Dial to discuss his new book.
“Would you like a drop?” Standing in the kitchen of Minter Dial’s London apartment, I had mixed emotions as he waved a bottle of Absolut vodka under my nose.

At 10:30 on a February morning, I was torn between being a bit horrified and rather impressed – even more so when Minter proceeded to take a long swig straight from the bottle.

Minter likes a dramatic moment – it was, of course, water. He is a natural showman, and everyone enjoyed his presentation at our AI: Empathy, Ethics & Your Brand event at Microsoft HQ in August 2018.

So, after enjoying Minter’s last book, Futureproof (co-authored with Caleb Storkey), I quickly acquired his latest, Heartificial Empathy – Putting Heart into Business & Artificial Intelligence.

At Pull we have a voracious appetite for all things AI. For me this was inspired by the seminal work Homo Deus by Yuval Harari. Two key thoughts stayed with me after reading that book. Firstly, that the impact of the arrival of AI will dwarf that of the digital revolution and the Internet. Secondly, that humans are driven by algorithms that have evolved over millennia, and that we have learnt in recent times to ‘trust these feelings’. AI, however, will provide new algorithms that are demonstrably superior to our own (largely instinctive) decision-making. Improbable though it might sound, Harari makes a very convincing case that Google will in future be far more competent at picking, for instance, your life partner than you could be.
 

AI, ETHICS & EMPATHY

If this is correct, then there will be very important ethical and moral dimensions to AI. One of these – which Minter has made something of a speciality – is empathy. The questions he explores are: first, what is empathy anyway? Where does it fit into business? Can you learn it, and can you teach it? And secondly, how should we proceed when considering the role of empathy in machines?

These are the themes of Heartificial Empathy. Minter does an admirable job of exploring the anatomy of human empathy, which seems a somewhat under-studied subject. He explores the distinction between empathy and sympathy, and drills down into cognitive empathy and emotional empathy (put simply, the former is an intellectual process, whereas the latter is ‘heartfelt’ – what you actually feel). This is useful when you consider his main thrust: that empathetic businesses will be more successful, and more rewarding to be in, than ones that aren’t.

An intriguing aspect Minter raises in his book is declining empathy. He points to research suggesting that levels of empathy are falling: a 2010 study of college students, which tracked self-declared levels of empathy going back to 1979, found that they had fallen by some 40%. When I met Minter, I asked him about this, as it surprised me. Surely, in the age of social justice and equality for all, this can’t be true? Could it be that today’s expectations of behaviour drive people to rate their empathy lower than people would have in the past? That seems unlikely, however, as the research methodology was designed to avoid exactly this.

We explored this when we met. He felt that there is perhaps an increasing awareness of the need for empathy. The drive for diversity, for instance, is somewhat dependent on empathy: if you don’t have empathy, you won’t understand the need for diversity.

This prompted me to dig a little deeper, and I discovered this interesting article. Funnily enough, it corroborated another point Minter made, about exercising our empathetic skills by reading quality fiction. The view expressed was that falling levels of empathy are being caused by the decline in group socialising, club membership and the like, and by the decline in reading fiction. The former means we are less likely to rub up against people with whom we share perhaps only one key interest. The latter stunts the development of a critical faculty: seeing people as complex and diversely motivated, rather than simply as people we agree or disagree with, or whose worldview we share.

It strikes me that the proliferation of social media, with the like-minded bubbles it provides, and the replacement of group activity with solo time online would damage our ability to develop empathetic skills too.

Given Minter’s theory on the dependent relationship between diversity and empathy, the outlook is not good. People are experiencing less diversity, which leads to less empathy, which in turn reduces the desire for diversity. A dangerous vicious circle.

Whatever the causes, the decline in self-reported empathetic behaviour is concerning.

Minter also comes to the conclusion that empathy is a skill that can be learned but not taught. I challenged this when we met. Surely, as a general rule, something that can be learned can be taught? It’s an interesting question when it comes to empathy. In his analysis, Minter breaks down what empathy consists of, and I couldn’t help feeling that each of these steps – starting with active listening – could be taught. I have certainly been to interpersonal communication skills coaching sessions that teach active listening, for instance. The first rule of negotiation, you will hear, is to properly understand the other side’s point of view. Stephen Covey taught us to ‘seek first to understand’ in The 7 Habits of Highly Effective People. Surely you can at least teach the components of empathy, granted that you can only ever take the horse to water?

I put this to Minter too. I think I was more optimistic than him that empathy can be taught – although his riposte, “Can you teach love?”, was pretty thought-provoking. I had to agree that you can teach the knowledge, but you can’t teach the desire to change: when it comes to acting empathetically, you have to be open to the idea.

Having dealt with human empathy, Minter moves on in Heartificial Empathy to machines. I find this a fascinating, if somewhat paradoxical, subject. If empathy is inextricably linked to emotion and feelings – which it is generally accepted machines are a long way from experiencing – then how can a machine empathise?

I guess an easy answer is that a machine can emulate empathy, without knowing that is what it is doing, if it is programmed to do so.
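
To make that concrete, here is a toy sketch of my own (nothing from the book) of how a customer-service bot might emulate empathy: it matches a few keywords and returns a scripted acknowledgement. The cue words and phrasing are entirely hypothetical – the point is that nothing is felt or understood anywhere in the process.

# Toy illustration only: 'empathy' as keyword matching plus a scripted reply.
NEGATIVE_CUES = {"frustrated", "angry", "upset", "disappointed", "annoyed"}

def empathetic_reply(message: str) -> str:
    words = set(message.lower().split())
    if words & NEGATIVE_CUES:
        # A scripted acknowledgement of the customer's feeling - no feeling involved.
        return "I'm sorry to hear that. That sounds really frustrating - let me help."
    return "Thanks for getting in touch. How can I help?"

print(empathetic_reply("I am so frustrated with this delivery"))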

I am old enough to remember the advent of ATMs, or cash machines, in the UK, although I was a child at the time. I do remember that the banks themselves were highly sceptical: their research told them that customers would prefer to continue dealing with humans rather than holes in the wall. How wrong did that turn out to be? It makes me think that the ATM was probably the first empathetic machine. It did the simple thing you wanted it to do better than a human could, and with a little smart design of the interface (developed, thankfully, since the early days) it could be said to be empathetic in use. (I have to say American ATMs are less empathetic than European ones. European machines make you take your card before they give you your money. This once led to me returning to a US ATM after realising that I had taken the money but, because the machine hadn’t prompted me, not my card. I walked back just as it swallowed the card, which disappeared forever in front of my eyes.)

In exploring the need for empathetic machines, Minter raises some interesting questions and ideas. One is the paradox that the people arguably best equipped for the challenge of developing AI algorithms – literally doing the coding – are less likely to be naturally endowed with high levels of empathy; the two rarely go hand in hand. A nice spin-off, however, ought to be that the push for more empathetic machines will sharpen the focus on empathy in general – which is really the core message of this book: if we are going to create empathetic machines, we first need empathetic people and organisations.

So overall, this is a thought-provoking and recommended read. Just one final thought: to me, the first rule of ethics and empathy for AI ought to be self-declaration. In his book, Minter gives an interesting example of an interaction he had with Amazon, where customer care moved smoothly from a bot to a human and back again. We agreed that the language used by the chatbot (and the human!) was well judged, so that most people would know which they were talking to at any point – but I’m not sure this is always the case.

Like many, I was suitably impressed by Google’s AI assistant demo. But what horrified me a bit, when the assistant called a hair salon to book an appointment, was that it was clearly pretending to be human. That is fine until it all begins to go a bit wrong. Maybe one day we will simply assume that all such calls are made by bots, but first we will go through a slightly disorientating transition phase, which reminds me of the transition to autonomous cars: there will clearly be a period when bots and people have to negotiate passage together on the roads, and that will be interesting. With AI assistants, though, am I the only one thinking that AI emulating humans should be self-declaring? “Hello, I’m Google Assistant, I’m calling to book. . .”
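
If self-declaration were to become the rule, it could simply be built in as the first, non-skippable step of any bot-initiated call. A minimal sketch, again my own invention rather than anything Google has published:

# Hypothetical sketch: the disclosure comes before the task is even mentioned.
def opening_line(assistant_name: str, purpose: str) -> str:
    return f"Hello, I'm {assistant_name}, an automated assistant. I'm calling to {purpose}."

print(opening_line("Google Assistant", "book a haircut appointment"))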
 

Posted 8 March 2019 by Chris Bullick

 
