MY SISTER-IN-LAW was given Alexa for Christmas. Alexa is a small machine that sits in your house and listens in on your conversations. If you summon it by calling its name, Alexa will play music of your choice, provide information, give you the news and sports scores, tell you the weather, and control the devices in your home, such as the heating, lighting or burglar alarm. And, thanks to the Church of England’s impressive digital team, it will say a prayer, or tell you where your nearest church is (News, 25 May 2018).
We played with it all Boxing Day afternoon. “Alexa, choose some Christmas music for me.” “Alexa, what’s the weather like in Manchester?” “Alexa, tell us a joke.” And she — sorry, it — responded appropriately and accurately every time. Then, of course, I started to get clever. “Alexa, what’s the meaning of life?” There was a pause: “42”. Then I tried “Alexa, are you happy?” “I’m sorry, I don’t understand your question.” “Alexa, would you like to have a body?” “I’m sorry. I don’t understand…”
Alexa is just a domestic example of Artificial Intelligence (AI). The central tenet of AI is that we can view the human being as analogous to a machine — the body — operated by a computer — the brain. Now, suppose we could build a computer that had as much processing power as a human brain, and that was a match for us in all sorts of areas of intellectual endeavour. It could store all of human knowledge in one place. It would be good at making decisions. It would never make mistakes or get tired or emotional. That could be extremely useful.
And suppose we could equip it with algorithms that would find patterns and generate insights from raw data. Once these algorithms had compiled a big enough library of causes and effects, the machine could use this information to make decisions and predictions whenever it found itself in a similar situation. It would have an artificial “instinct” that would be more predictable and accurate than ours. That machine would, in some senses, be equal — or even superior — to us.
MANY discussions in this area focus on advances in digital technology — and, of course, they are dizzying. The possibilities for Christian mission and communication are huge.
But, while we are rightly enthusiastic about the things that new technology can do for us, we should be very thoughtful about the much deeper changes that are happening in our age. It’s not what is happening in technology that is fundamentally important, but what is happening in our culture.
Jaron Lanier, a brilliant technologist who was responsible, among other things, for pioneering work in virtual reality, gives a stark warning about the ways in which technology acts on us. Machines are appearing to be more and more “intelligent”, Mr Lanier says, because humans are choosing to abase themselves in front of them. Did Alexa really know the answers to our questions? Or were we playing along — playing dumb to make Alexa seem clever?
Christian theology speaks of human beings as particular within creation, both in relation to other creatures and also in relation to non-materiality. We are aware of our presence in space and time. We are named and gendered and embodied.
Yet how quickly we have bought into the idea that machines such as Alexa, Siri, or Pepper can have a name and a gender. How easily we have gone on to treat those pseudo-humans as household slaves. We talk about computers learning, or remembering, or deciding, or praying. Of course, they can do none of those things: all of those attributes are just analogies for what humans do. Ask Alexa “Do you love me?” and she — it — will have nothing to say. Of course it doesn’t love, because it doesn’t have the capacity to be loved, and all that goes with it: to be vulnerable, to feel pain, to long for a better future, to call out to God.
We should be careful not to give computers names or genders. A machine is always an “it”. We should not use verbs which suggest that machines can think or speak or choose. We have a theological category for what it means to make ourselves subject to objects we have made: idolatry.
WHILE the Church rightly addresses the missional opportunities of digital communications, we are allowing some fundamental challenges to the status of the human being to go almost unobserved. We run the risk of sleepwalking into accepting that personhood is just an attribute of humanity — something that we have, not something that we are — and that we can make a machine that has it, too.
If we do, we will soon find ourselves limiting our own words and choices to the ones that the computer can deal with. To paraphrase George Orwell, “All the creatures outside looked from machine to man, and from man to machine, and from machine to man again; but already it was impossible to say which was which.”
Andrew Graystone is a journalist and broadcaster, who is working on a Ph.D. on digital culture.
His book, Too Much Information? Ten essential questions for digital Christians, is published by Canterbury Press at £12.99 (Church Times Bookshop special price £10.39).
Listen to an interview with him on The Church Times Podcast.