The Imitation Game: Computers need to get in touch with their emotions
It’s obvious, really – Alan Turing must have been autistic because he was a mathematical genius who didn’t like girls. Such sloppy, lazily formed assumptions – so often the preserve of Hollywood movies – meant that I almost didn’t go to see The Imitation Game, but I forced myself and was pleasantly surprised that, although it took some liberties with the facts, it did grippingly convey the significance of Bletchley Park to the war effort.
The movie’s major “economy with the truth” lay in excluding GPO engineer Tommy Flowers, who actually built the kit and wrestled with the wiring looms; in the film, Turing was portrayed as doing it alone.
“I read between the lines of The Imitation Game’s script to a deeper meaning the writer may or may not have intended.”
It doesn’t mention Asperger’s – it was unknown in Turing’s lifetime – but Benedict Cumberbatch’s depiction of Turing is clearly based on modern notions about the stunted emotional expression and social interaction that characterise that disorder. The plot relies upon Turing overcoming the dislike his coldness provokes in others, assisted by the token emotionally literate woman, played by Keira Knightley. The tragic ending shows Turing being chemically castrated by injections of female hormone, and that combination of emotions with hormones set me thinking: I read between the lines of The Imitation Game’s script to a deeper meaning the writer may or may not have intended.
The Turing Test
The film is named after a test of machine intelligence that Turing invented, in which the machine must try to imitate human conversation sufficiently well to fool another human being, on the assumption that language is the highest attribute of human reason. However, recent research in affective neuroscience – the study of how emotions interact with the brain – has revealed the extent to which reason and emotion are totally entangled in the human mind. The weakness of the whole AI project, of which Turing was a pioneer, lies in failing to recognise this, in its continuing attachment to 18th-century notions of rationalism.
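Turing’s set-up can be caricatured in a few lines of code. The sketch below is purely illustrative – the canned replies, the room labels and the interrogator’s crude “does it parrot the question back?” heuristic are all my inventions, not anything Turing specified – but it shows the shape of the game: hidden respondents, a transcript-only channel, and a judge forced to guess.

```python
# Toy sketch of Turing's imitation game: an interrogator questions two
# hidden respondents over text alone and must guess which is the machine.
# All replies and heuristics here are invented for illustration.
import random

def human_reply(question):
    # Stand-in for a real person's answer.
    return "I'd rather talk about the weather, honestly."

def machine_reply(question):
    # A crude chatbot: hedge by echoing the question back.
    return f"Why do you ask about '{question}'?"

def imitation_game(questions):
    """Return True if the interrogator misidentifies the machine."""
    # Randomly assign the two respondents to rooms A and B.
    rooms = {"A": human_reply, "B": machine_reply}
    if random.random() < 0.5:
        rooms = {"A": machine_reply, "B": human_reply}

    transcripts = {room: [respond(q) for q in questions]
                   for room, respond in rooms.items()}

    # Naive judge: suspect the room whose answers parrot the question back.
    suspect = max(transcripts,
                  key=lambda room: sum(q.lower() in a.lower()
                                       for q, a in zip(questions,
                                                       transcripts[room])))
    # True means the machine fooled the judge.
    return rooms[suspect] is not machine_reply

print("Machine passed:", imitation_game(["your favourite poem",
                                         "chess openings"]))
```

Against this particular judge the echo-bot always loses, which is rather the point: imitating human conversation convincingly is the hard part, and it’s the part the column argues is entangled with emotion.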
Image: Enigma machine plugboard
Those parts of our brain that manipulate language and symbols aren’t in ultimate control, being more like our mind’s display than its CPU. I am, therefore I think, some of the time. US neuroscientist Jaak Panksepp has uncovered a collection of separate emotional operating systems in the brain’s limbic system, each employing a set of neurotransmitters and hormones. These monitor and modulate all our sensory inputs and behaviour, the most familiar examples being sexual arousal (testosterone and others), fight/flight (adrenaline) and maternal bonding (oxytocin), but there are at least four more and counting.
What’s more, it’s now clear that motivation itself is under the control of the dopamine reward system: we can’t do anything without it, and its failure leads to parkinsonism and worse. Now add to this the findings of Antonio Damasio, who claims all our memories are tagged with the emotional state that prevailed at the time they were recorded, and that our reasoning abilities employ these tags as weightings when making all decisions.
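Damasio’s claim – that memories carry emotional tags which then act as weightings in decision-making – lends itself to a minimal sketch. Everything concrete below (the memory store, the cues, the valence numbers) is invented for illustration; only the mechanism, picking the option whose recalled memories feel best overall, reflects the idea described above.

```python
# Minimal sketch of Damasio-style decision-making: each memory carries an
# emotional tag (valence), and choosing between options means summing the
# tags of the memories each option evokes. All data here is illustrative.
from dataclasses import dataclass

@dataclass
class Memory:
    cue: str        # what the memory is about
    valence: float  # emotional tag, from -1.0 (bad) to +1.0 (good)

memories = [
    Memory("dark alley", -0.8),   # a frightening experience
    Memory("dark alley", -0.4),   # another unpleasant one
    Memory("lit street", +0.3),   # a mildly pleasant walk
]

def emotional_weight(option, store):
    """Sum the emotional tags of memories the option evokes."""
    return sum(m.valence for m in store if m.cue == option)

def decide(options, store):
    """Pick the option whose recalled memories feel best overall."""
    return max(options, key=lambda o: emotional_weight(o, store))

print(decide(["dark alley", "lit street"], memories))  # -> lit street
```

The point of the toy is that no symbolic rule (“avoid alleys”) is ever stated: the bias emerges entirely from the emotional tags, which is roughly what the column means by reason and emotion being entangled.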
“Rationalist artificial intelligence is doomed to fail because the meaning of human discourse is permeated with emotion.”
These lines of study suggest that, first, all rationalist AI is doomed to fail because the meaning of human discourse is permeated with emotion (if you think about it, that’s why we had to invent computer languages); and second, AI-based robots will never become wholly convincing until they mimic not only our symbolic reasoning system but also our hormone-based emotional ones.
Sci-fi authors have known this forever, hence their invention of biological androids such as those in Blade Runner, with real bodies that mean they have something at stake: avoiding death, finding dinner and choosing a mate. Stephen Hawking’s recent warnings about AI dooming our species should be tempered by these considerations: however “smart” machines get at moving, calculating and manipulating, their actual goals still have to be set by humans, and it’s those humans we need to worry about.
So the two big lessons I took away from The Imitation Game were these: machines will never be truly intelligent until they can feel as well as think, which relies as much on advances in biology as on solid-state physics and software engineering; and it would be nice if they were to start planning “Imitation Game 2: The Tommy Flowers Story”.
Image: Tommy Harold Flowers