One of my online friends wrote this week that A.I. could have been great if Spielberg had the balls to let go of his famous people-looking-up scenes and let an ending be more than Hollywood schmaltz.
But it HAD to be. Because the whole story is about the idea of love being something you can program into a machine. I think the fallacy is so clear that Spielberg recognised it too and built the story on that principle. How would you go about programming love other than by inserting fairy tales as subroutines? That’s the simple mechanics of what kind of process would have to be installed.
Any programming of emotions would inevitably be limited by the ability of the programmer to express his own feelings. William Hurt’s character was seeking to replace his own lost son, so clearly obsession would have been part of the emotional programming he gave the robots. Some languages have separate words for familial, parental, friendly, religious, erotic and romantic love. English has one word for all of those concepts. This is not insignificant.
AI as a story was bound to be a hopeless quest because it is built on the hopeless paradigm of modern biology: that human beings are not just equivalent to and understandable as machines, but that ‘Human beings ARE machines’, as the thinkers of our age shriek. Hell, even dictionaries like Merriam-Webster say it is so. Unfortunately, it is not so. We are not machines, and emotional or familial or any other kind of love is not a series of programs that we run. No emotion is. I think it is extremely powerful and timely that AI shows this so well, and it is a film that will be judged more prophetic in later years rather than lauded for being right in the present. That is an incredible strength.
This story follows from Spielberg clearly recognising (in this script at any rate) that human beings are fundamentally different from machines and exploring that. AI shows the dangers of losing our morality and emotionality and spirituality if we blindly follow science and logic as religions rather than seeing them as the tools they really are. David (what amazing acting!) will always be a machine, a tool. He could never be a person. Spielberg gets it and gives that to us in the narrative form he works with. Something like the Blue Fairy was inevitable because David wasn’t programmed to love in the human sense. He was programmed to fixate on one specific individual and to obsess over them on the basis of a series of subroutines built on fairy tales and fiction.
All the fairy tale references looked deliberate. They were too blatant to be otherwise. There was The Wizard of Oz, with Gigolo Joe and Teddy as amalgams of the Tin Man, Scarecrow and Lion, an Emerald City, and Dr Know as the Wizard, manipulated by William Hurt as the man behind the curtain. There was Hansel and Gretel being abandoned in the woods (and doesn’t that make you think about the cultural references we give our children?). It was all straight out of children’s books and film imagery, even Spielberg’s own from ET with the giant moon. However, the references were there to show that stories do not equal real life, despite the efforts of writers or programmers. David’s biography is more about man’s inadequate attempts to pigeonhole our feelings than anything else.
Okay, you may say David was programmed with the capacity to learn. That is part of artificial intelligence. So maybe love could grow from there as opposed to fairy tale regurgitations. I think not, though. He would simply be trapped in his own programming because of the Oedipal fixation thing. What would he have learned from? His real life models were the family he was adopted by. He saw his frozen organic brother brought back to life (resurrected) and reunited with a loving mommy. The final act of the film became a reiteration of that experience. Spielberg shows us what computers do best–mindless repetition.
On the subject of the ending, I saw it as a bit like a dream sequence. Maybe David’s mechanical mind had started to break down and random electronic pathways fired as his power ran down slowly under the ice. Perhaps this was his dream before it finally ended. Perhaps that was why it echoed so well those parts of his experience he equated with love. His own robotic saviours even turned up in a box! As it seemed so allegorical, I wasn’t worried about whether this part was really possible. It was a film and therefore all fiction, never more so than at this point.
David’s story exemplified that any endeavour which attempts to take poorly understood concepts from science and philosophy and then apply them in another area is doomed to a certain kind of failure, although not necessarily without producing novelty and fascination. Meanwhile, I love the fact that this is a totally new exploration of these concepts, breaking away from Blade Runner, which was really dealing with replicants and cloning as opposed to mechanical intelligence. Both are built on that same human=machine paradigm, but both develop from it in different ways.
For me, one of the biggest points of the film is also to ask that question ‘can a human being love a machine?’ and it’s the audience who ultimately must answer. On the basis of this story, I think the answer would be a resounding no. We can care, we can feel affection and fondness. We can enjoy their fairytale existence and story. But it isn’t love. And I think that’s a whole subtext that Spielberg wants us to get. I didn’t always like David as a character. He was somehow dead inside despite the cuteness, despite the simulation of life. I may have wanted it to be otherwise but it wasn’t. Yet that very spiritual deadness is somehow revealing and gave him great freedom to explore human boundaries.
Many of the questions that arise from AI are not answerable or even poseable given the level of current understanding in biological and behavioural sciences, computer technology and ultimately the paradoxes and weaknesses inherent in philosophy today. Unlike Ebert, I think Spielberg was wise not only to avoid providing simplistic answers to questions he couldn’t answer but to keep us thinking. What is love? What is intelligence? What is important and what is not? Can we care about and love that which we do not know?
Future history will judge this film partly as a result of the discussions and thoughts it will have provoked. It is flawlessly made. It is full of ambiguity and metaphorical characters. And it is one of the most ambitious explorations of philosophical ideas I think I’ve seen attempted in any film. I think it’s great that Spielberg has made a truly deep multi-levelled sci-fi movie. It is a tribute to Kubrick in the sense that it is meant to make the audience think beyond their experiences, and I don’t think everyone will like that. Cinema audiences don’t usually enjoy being provoked in overly complex ways. I did.
Another author wrote: Just because you can’t figure out how the mechanical process works doesn’t mean you have to romanticize it. We are not machines – we are animals run by nerve impulses. We like to make more of these electrical impulses than is really necessary which is fine and well just as long as you realize that’s all they are.
But we are more. We are awash with hormones, many of which are stimulated and some even produced (e.g. pheromones) by the environment around us. We are constantly hit by particles and energy from space. We sense deep mysteries and psychic phenomena through means not yet explained. But I accept your disagreement. I would suggest that most people are so deeply ingrained in the machine=human paradigm that they can’t step outside it any more than the church in Copernicus’ time could step outside the Earth-centric view of the universe. This is why I see this as a film which will be judged by history. AI captures a zeitgeist as surely as Andy Warhol’s soup cans, Michelangelo’s David or prehistoric cave paintings.
Science fiction has explored other ways of seeing the universe, such as the nodes of Frank Herbert’s Whipping Star and the intelligence created from the intersection of waves of existence and communication relays in Orson Scott Card’s Ender’s Game series. These things say to me there are other ways of thinking about life even if we reject a completely religious viewpoint. History tells me science has never had a monopoly on the truth and is no more likely to with a Cartesian mechanistic view. AI reinforces this with its hopeless tale and lack of answers (which I still think is a good thing).
Above, I said, “AI shows the dangers of losing our morality and emotionality and spirituality if we blindly follow science and logic as religions rather than seeing them as the tools they really are.” Let me elaborate on that point.
Sure. If everyone is reduced to the level of an automaton, where their feelings are simply a string of electrical impulses along the lines of a computer program, then several things are likely to result. One is that people generally feel alienated from each other. This hurts because it is not part of our nature, and we wonder at it and feel frustrated but impotent because we become overly reliant on experts for a worldview. The dominant experts of the day tell us to reject spirituality in favour of science because science can test its hypotheses. In years gone by the experts would have been religious and would have told us to reject science. Thinking people always question for themselves, which is why AI works at a provocative level.
In the science-dominated world of AI, Gigolo Joe (and if ever Spielberg created a cult legend it is surely Gigolo Joe) satisfies a physical desire. But Joe could never fulfil a deep emotional relationship, except for those who equate need with love. Gigolo Joe is fascinating because he has a huge capacity to learn and he constantly innovates. He embarks on seemingly random and out-of-character actions. In the Dr Know booth he displays a real imaginative leap. Yet mostly his actions stem from his programming to show a capacity for caring and understanding. He simulates emotional attachments but cannot develop the kind of moral framework that would have resulted in the police arresting the real killer of his slain client. Nevertheless his simulated empathy is so good that we can empathise with him.
Empathy, however, suffers with a mechanistic view of life. People start treating each other as objects to be manipulated. Once everyone is an object, morality is a tool to be used or discarded rather than a defining human characteristic. You can eventually launch other objects through the spinning blades, dissolve objects in acid or tear them apart with hydraulics for entertainment with impunity. And those objects can resemble people so closely that to all intents and purposes they *are* people, at least as far as the audience’s emotional attachment is concerned. As long as they don’t speak or look or sound too attractive, as David did.
David’s begging for his existence swayed members of the Flesh Fair crowd, but seriously, would you keep your PC simply because it begged you not to throw it out? Programmed empathy is not the same as real feeling. I suggest that this only worked because the audience were already working from the philosophical premise that orga and mecha were equivalent on some levels. It is part of the man=machine zeitgeist. And if those are equivalent at some levels, then at what point do they become equivalent at all levels? The unquestioned philosophical values mean sometimes we ask why instead of why not.
Mechanistic explanations of behaviour are derived from worldviews where control of other people was paramount in the thinkers’ minds. It takes a while for philosophy to catch up with life and most of us always inherit a kind of socially-acquired philosophy that is centuries out of date. Even if it were only decades out of date, we aren’t *that* far from generations who thought of classes of people as little more than cannon-fodder for their wars and factory-fodder for their industries. Now we are shown David, a product designed for a specific application, a human need. Emotional-fodder. Yet on one level, AI shows that only other human beings really fulfil that human emotional need.
Another result of equating machines and people too closely is that we start forming emotional attachments to non-people–objects, tools. And we start romanticising them. One scene in which David shows he is a non-person is when he grips his human brother and pulls him down into the swimming pool, then simply sits on the bottom. Of course only one of them will die, but he doesn’t understand this, or else he sees his obsession with Monica as above the value of other life. Again it comes down to empathy, the ability to put yourself in another’s position, which David lacks because he sees no one as different from an object.
David clearly lacks empathy although he simulates various aspects of it, such as assisting others and learning through imitation. This is part of his programming and again is a limitation of the designer operating without full knowledge. It is also a direct result of a designer/programmer creating emotion based on the assumption that he knows all about human beings because they can be reduced to a bucket of chemicals. A roboticist’s view is that orga and mecha are not only morally equal but can be analyzed in the same terms. Therefore mecha can be built to the same specifications as orga, notwithstanding the fact that orga comes without specifications. Again it derives from a mechanistic view of the universe.
Einstein said, “Imagination is more important than knowledge.” Yet still there is this feeling that we have come so far so fast that we must really understand everything in the world around us because we have so much knowledge. I don’t think that is any more true of emotional knowledge in particular than it is of AI generally, and that acknowledgement of our lack of complete understanding is part of why this film will fascinate for years to come.
Another friend of mine commented: The importance of the film was not really the issue. It may well be important in the context of film-making, and maybe there are such things in this film that will influence the making of future movies in a way that will benefit me, the humble movie-goer. I didn’t think it was ‘important’ in a social or historical context any more than Terminator, and I would certainly never argue that was an important film.
So, importance to the film industry was not what I was looking for when I watched AI. I was looking for ‘watchable, interesting, engaging, moving, humorous, sad, entertaining etc.’ Any one of the above would have done, but I got none. If I got ‘important’ I was unaware of it, and could not have given a toss even if I had been aware of it.
I’m not looking for importance to the film industry either. Films for film makers are generally pointless (mine being the exception, of course. Ahem). I do think this is an amazing story–part drama, part action–following a non-human’s doomed quest not to become human as so many other films have done but to discover love in his/its own terms. It is sad in a very tragic way and although it contains many beautiful images I see it as primarily a film to provoke thought. Definitely a departure for Spielberg but very much in keeping with Kubrick.
A.I. sets out to explore an old theme–can we make an artificial life-form that’s indistinguishable from ourselves? It’s the theme of Pinocchio and of Frankenstein as well as any robot story ever told. In fact, it’s also the theme of any science fiction encounter with aliens. Usually aliens are shown as people who not only speak our language and resemble us physically but who share our values.
David actually does speak our language and resembles us outwardly. However, A.I. is importantly different from other tales because it doesn’t slip into anthropomorphism. David is a non-person. His world is limited by a series of subroutines as he exhibits a programmed version of love which lacks real empathy. And that all means we can never empathise with David (or any of the robots) and, although this is the reason it alienates most film goers, it gives us a whole new perspective on intelligence, emotion and what it means to be human.
I suspect that my friend hated it precisely because it is about non-people. Films are usually most powerful when they’re about other people in situations similar to our own. A.I. is not just about someone in a situation outside our experience, it’s about a character who can never be a ‘someone’ in our terms.
Intelligence without real emotion, without empathy, is truly alien and with the exception of a few sporadic attempts in Star Trek we’ve never really seen it done like this before. Even in Trek, the rock creature that burned its message on the floor was just trying to protect its children, the Borg resorted to having a queen to give viewers context and the Q is a playful imp. They are all reflections of ourselves. A.I. is a paradigm shift for storytellers and its echoes will eventually reverberate through our culture just as Mary Shelley’s seminal tome became a standard before it.
To start with, both psychologists and computer programmers will have to face the consequences of a world where human beings cannot be fully understood through human-machine/brain-computer metaphors. A.I. shows that the illusion has limits. That’s not all, because a large chunk of our philosophy–and therefore our culture–is based on the man-machine metaphor. It goes back to Descartes, and we’ve reached a stage in our scientific and technical development where we’re beginning to realise we’ll have to think again. A.I. is the first visible step in that rethink.