Empathy and its role in AI “sanity” Part 1
So in the journey through the top 10 AIs and robots, there was a recurring theme – the inability to connect to emotion. Some tried to fake it (Data), some got it wrong (Marvin the Paranoid Android). Is there a problem with not being able to connect?
Let's look at this question in a different way. What are the disorders in people associated with a lack of empathy?
Narcissistic personality disorder – self-centredness and a tendency to leave others behind.
Psychopathy – inability to adapt to social norms.
Borderline personality disorder – making it hard to understand and predict how others feel and will behave.
People with these disorders have lived in our population for millennia, some rising to the top of their chosen professions, some dying early. The conditions are considered to have genetic and environmental components, in many cases involving early childhood trauma.
In the case of AIs, the "genetic" component is what we code them with. To date, most of the work has been about simulating emotions in order to get emotional creatures like us to be comfortable interacting with them. We have built fake empathy into AIs so they can manipulate us into telling them things.
It could get a little weirder soon: the global consulting firm McKinsey & Company is talking up the value of teaching AI to have actual emotions, not just simulate them.
It is not clear whether they mean just the "good" emotions or all of them. That should be an interesting exercise in software development, since we don't even understand how they work in humans.
Next time – what is childhood trauma if you are an AI?