There’s often a dull uncle at Christmas lunch. Ordinarily, he’s to be avoided because he’s so boring. But sometimes, he’s the only one who’ll sit there while you unload a year’s worth of petty gripes and frustrations.
This week, the International Space Station (ISS) welcomed such an uncle – in the form of a floating robot with a benignly smiling face.
This is the second – upgraded – “friendly robot assistant” with artificial intelligence technology to undertake a mission on the ISS to help astronauts with daily tasks, and to provide an artificial someone to talk to while so far away from home.
The robot, Cimon 2, is part of an evolving class of AI platforms that don’t have feelings of their own, but are rather designed to recognise and respond to the feelings of humans. Think of empathy that goes only one way – the kind of relationship that old-school men wanted from their wives.
At first the robot had attitude
But as it goes with humans, these sorts of relationships tend to hit rebellious or resistant snags.
According to IBM – which developed the question-answer capabilities via its widely-adopted Watson AI technology – the original Cimon (Simon is his informal name), which was trialled on the ISS in 2018, had a personality disorder that required psychological intervention.
As IBM explains on its website, the prototype “was having a bit of an identity crisis before psychology student Sophie Richter-Mendau stepped in”.
The programmers had designed conversation responses “that ranged in tenor from friendly to downright snappish”.
Cimon could respond to “How are you?” with “I’m sick of you. Just kidding,” or get offended when asked his age.
Ms Richter-Mendau realised that Cimon “needed a clearly defined personality – one that would fit his mission as an assistant and as a companion: a serious, helpful partner during work, but a friendly conversationalist during down time.”
He was just kidding
With this in mind, Ms Richter-Mendau consulted the Myers-Briggs personality classifications and came across the description for ISTJ: an introverted, sensing, thinking, and judging individual.
All of these traits were required of the robot, but Ms Richter-Mendau recognised that Cimon also needed a sense of humour.
IBM says that Cimon has been trained to recognise emotional cues in speech like “unhappy” or “excited” – and that he can hold conversations informed by feelings. He can respond to “I miss my family” not just with “I’m sorry”, but with “I’m sorry, how can I help?”
Cimon 2 is more “empathetic” than the original model, according to a joint statement from IBM, Airbus and the German Aerospace Center (DLR), the groups that helped develop it.
The robot’s other function is to provide an objective opinion in situations where the crew are showing signs of “group think” – a potentially dangerous phenomenon that groups in isolation can be prone to.
The risk of following the crowd
An example would be on a high mountain, where a storm is coming in and the sensible thing to do would be to descend. But the history of mountaineering – especially on glamour peaks such as Everest and K2 – is littered with the bodies of people who all thought it was safe to proceed upward, simply because that’s what everyone else was doing.
But, in such a threatening situation, could the robot manipulate the crew and gain control?
In a 2018 study, a robot elicited sympathy from human users by telling them it was afraid of the dark, and even begged: “No! Please do not switch me off!”
When it did, the human volunteers – who had been instructed to switch the bot off – often refused to do so, or were reluctant to comply.