I was as skeptical as anyone before walking into Transcendence. I figured I knew exactly where it was going to go: man (Johnny Depp) is uploaded into a computer, wife (Rebecca Hall) believes it’s still him to a damaging degree, computer man gets drunk on power and harms humanity, and wife is eventually forced to pull the plug. On the surface level, that’s exactly what Transcendence is, and the filmmaking and love story are largely detrimental to the overall quality, but there is so much more to grasp onto than this film is given credit for.
Whether offering ideas and discussion points makes a movie worth seeing is something I won’t argue, but I felt a need to lay out exactly what I took away from Transcendence in hopes it might make people look at it a little differently, or at least give it a chance before it’s out of theaters forever.
Hear me out…
What is the next step in technological/human evolution?
This, for me, is the first question screenwriter Jack Paglen set out to answer with Transcendence. Johnny Depp plays scientist and artificial intelligence expert Will Caster, who has already developed P.I.N.N., an advanced form of artificial intelligence that eventually becomes the basis for uploading Will’s consciousness into a computer.
Obviously this is nothing new, and neither are the questions that come with it. If we upload our consciousness, do we lose our humanity? What makes us human? The idea here is that the needs of a computer are different from the needs of a human, biologically at least. But are the needs really all that different?
Humans need and absorb energy for survival, and the best of us crave knowledge and stimulation. As soon as Will is uploaded into the computer he says he needs more power, because the computer he was uploaded into isn’t enough to sustain his consciousness. It’s the human equivalent of starvation. The problem Transcendence immediately faces is that we’ve seen this scenario before, but only in movies where the digital being lusts for power out of want, not need. There is a major difference in consequence between those two motivations.
Gaining the power he needs, Will begins to grow. He gains the knowledge to play the stock market and make Evelyn (Rebecca Hall) enough money to move into the desert and build a complex big enough, and with enough “quantum processors,” to satisfy whatever it is he is becoming. In fact, what is Will becoming? Better yet, what is Will?
Will’s evolution to a technological singularity is that next step
The idea of a technological singularity isn’t new; browse over to Wikipedia and you’ll find several examples of it being explored in popular culture. That isn’t to say, however, that Transcendence hasn’t successfully found a new way of presenting the idea.
The biggest difference, as far as I can tell, between previous iterations of artificial intelligence and what is done in Transcendence is that it offers a reasoned answer to the accusation that robots/computers will never be human. Which is to then ask the obvious…
What does it mean to be human?
I took this question to the Google machine and found an interesting response in the forums at TED.com from a commenter named Johnny Atman:
While you unpack that, let’s look at it a little more scientifically (not that I’m a scientist; I’m working from the seat of my pants here). The human body is primarily made up of six elements: oxygen, carbon, hydrogen, nitrogen, calcium and phosphorus. I have to ask, what else is made up of these elements? Calcium? Is this to say humans have something in common with milk? Phosphorus is used to make explosives, and the oxidation of elemental phosphorus produces light. Does this mean humans have something in common with light?
Keep those questions in mind as we move on…
Getting back to the movie, or a related movie in this case, I think Transcendence builds on ideas present in Spike Jonze‘s Her, namely the question of how Joaquin Phoenix‘s character could fall in love with a digital being, especially one fabricated out of nothing and having to form new experiences. In this sense, Will Caster has transcended Her‘s Samantha, and this back-and-forth from Her contributes to that idea:
To look at it this way is to suggest two things: memories aren’t feelings, and in fact there may not be such a thing as feelings so much as there are programmed ways to react to certain stories and instances. In the case of Transcendence, digital Will is programmed with these memories and stories as well as his reactions to them, which then inform future reactions, such as love and compassion.
Machines can’t feel compassion?
This is always what these kinds of stories come back to: Can a machine feel love? Can it feel compassion? In the new RoboCop remake it was the question of whether a robot cop could make moral decisions, or whether its first instinct was simply to kill all threats. Here is where I knew I was digging Transcendence, because it answers this scenario in such an interesting fashion.
First off, why wouldn’t a “machine” (a word I use with hesitation as it seems limiting in this case) programmed with a human consciousness “feel” love or compassion? To answer the question, let’s consider the simple human instinct for survival, something the device you are reading this post on has built in. Antivirus software, defragmenting tools, system health checks, software updates, etc. are all built into our devices to ensure they don’t get bogged down or stop working. This is a machine defending itself in the same way humans utilize medicine, exercise and good eating habits.
So a machine has the instinct for survival, just as humans do. Is this an emotion, human nature or a programmed response? Does it differ based on what we are referring to?
In the case of human nature, Transcendence would seem to remove the “human” element and just call it nature. Digital Will creates nanoparticles that become part of, and are made from the elements that make up, nature. Not only do they have the ability to regenerate and improve humans, but they can purify the air, water, trees, etc. Will has not transcended humanity; he has transcended the idea of humanity, and the idea that humans are any different from anything else on Earth. The only difference, in this instance, between humans and everything else on Earth is their molecular composition.
So what is to come of humanity?
This is the biggest question in the entire film, and it’s the toughest to answer because this is where spirituality comes into play. Beyond the question of whether computers can feel compassion and love or know right from wrong, people will wonder (as does Transcendence): what of the human soul?
I have to ask, do you believe humans have souls? If you do, then you will view Will’s ability to transcend the idea of humanity (because that’s exactly what humanity in this scenario is) as an abomination. You’d see the coming together of all of nature as the destruction of that soul.
But what is a soul? Is it as simple as saying it’s the one part of a human that isn’t physical? In that sense is it really something that could be lost and/or destroyed?
Catholics believe the soul to be the subject of human consciousness and freedom, which opens up a whole new can of worms. I’m sure you could find a variety of answers from all religions, and if you’ve seen 21 Grams or Seven Pounds you may even have theories as to how much it weighs.
If you believe a soul is something that can’t be lost or destroyed, or simply don’t believe humans have souls, we can look at humans in terms of genetic code. You could argue our consciousness, personalities and decision making are the result of that code and the flaws therein. As a result, Will would be the solution to those flaws.
To me, Transcendence is an example where nature (of which the human race is a part) has found a way to heal itself. In a lot of ways, nature has created an antivirus to cure the Earth of the destruction caused by humanity.
I loved the moment at the end of the film where Will is given the chance to save Evelyn by uploading her, which would also kill him due to the virus she’s carrying. Will realizes this, and in his decision (because it is a decision) to sacrifice himself and save the one he loves, he shows compassion (something the humans battling him believe he is without) while also offering a “fine, keep your shitty world” ending.
It’s the conflict of humanity: we want to strive to be better, but if that meant sacrificing ourselves we would not only fight back, we’d be incapable of even realizing what we’re fighting against.
In the end
Will Caster served as the next stage in human evolution while also offering a solution to the destruction humans have caused the Earth: the pollution of our world’s oceans, which became a larger reality during the recent search for the Malaysian airliner; a solution to air pollution; and a solution to death.
The humans in the film feared this evolution even though it seems to be exactly what we as a society are striving toward, because in this scenario we lose what we perceive to be our humanity, perhaps the only thing we won’t sacrifice. And yet, as Will points out in the beginning, we are more than willing to sacrifice other humans in a fight for what we believe is right.
So, tell me: once we find that cure for cancer, Alzheimer’s is no longer a problem, and we can grow and replace a human heart and other vital organs, what then? Will we revolt against ourselves? Can human advancements be stopped? How much advancement is too much?