Wednesday, October 7, 2009

Morality in the Termi-verse: Human Life is Sacred

If any thing is sacred the human body is sacred.
Walt Whitman (1819 - 1892)
One of the advantages of genre fiction in general (and science fiction in particular) is that a writer can present otherwise dry, academic subjects in an interesting, entertaining way. Science fiction is especially good at this since, as the mythology of our modern age, it can address contemporary concerns alongside the classics such as Truth v. Beauty, Life v. Death and God v. Man. In particular, it works well for examining traditional values in the cold, clear light of scientific logic.
In Terminator: The Sarah Connor Chronicles, the writers have given us two different approaches to moral teaching: through faith and through logic. The former pairs James Ellison (a former FBI agent now working for a high-tech firm) with John Henry (an advanced AI program which uses the body of a Terminator to interact with the physical world). The latter pairs John Connor (the future leader of the human resistance against the machines) with Cameron (a cyborg assassin who has been reprogrammed to protect and guide John). Both machines are motivated to learn: John Henry was designed as a machine learning platform, and Cameron, originally created as an infiltrator to kill humans, has fundamental programming to learn about human nature in order to better fulfill her function.
Ellison has previously been shown to be a deeply spiritual and religious man as well as a top-notch investigator. The revelation that robots from the future were being sent back in time to secure the genocide of humanity didn't damage his faith; rather, it strengthened it, along with his determination to fight back any way he could. John Connor, by virtue of his upbringing, is more pragmatic, with a talent for science, and sees his fight against the machines as a game of chess, with logic fighting logic.
In the episode "Strange Things Happen at the One-Two Point", a decision made by John Henry (then called Babylon) results in the death of a human (Dr. Boyd Sherman), Ellison is called in by his CEO (Catherine Weaver) to find out if it was an accident or murder.
Ellison: All we know are the facts: means, opportunity. But intent -- not even Mr. Murch can guess.
Weaver: Well, why don't you ask him? Ask the Babylon AI. Ask John Henry.
Ellison: John Henry? It has a name?
Weaver: That was something Dr. Sherman did. Gave it that name.
Ellison: It's not a person.
Weaver: No, but it's a mind. Talk to John Henry.
Ellison interrogates John Henry (who at this point can only communicate with images projected onto a monitor) to determine its 'state of mind' about the death of Dr. Sherman.
Ellison: It has no feeling for what it did. It has no opinion. That's what it's telling you. Sure, you taught it procedures. You taught it rules. But it's got no ethics, no morals. Whether it had any feelings about Dr. Sherman shouldn't matter, if you had taught it to value Sherman's life. Someone killed the man, and it wasn't John Henry. Excuse me... (he turns to leave)
Weaver: What would you teach it?
(Ellison turns to face her.)
Weaver: What would you teach it if you could?
Ellison: You want to teach it commands? Start with the first ten.
John Connor, on the other hand, takes a different approach to educating Cameron about human values. While the two are on a stakeout (in "Complications"), Cameron takes an opportunity to improve her knowledge:
Cameron: There are many things I don't understand.
John: Like what?
Cameron: The tortoise.
John: What tortoise?
Cameron: It was on its back on the side of the road in Mexico. Your mother turned it over.
John: She was helping it.
Cameron: I know, but why?
John: Because that's what we do. When we, uh, see something that's, uh, in pain or in trouble or whatever, we try to help it.
Cameron: Empathy.
John: Something like that.
Cameron: But not everyone would turn the tortoise over.
John: No. Some would just leave it there.
Cameron: Somebody would probably drive over it and crush it.
John: Yeah, I guess they would. Is that what you'd do?
Cameron: It didn't seem like much of a threat. We're not built to be cruel.
John: Yeah, that's one for cyborgs.
Cameron: Yes. That's one for us.
So here are two different approaches to moral teachings. Ellison starts with the Ten Commandments, rooted deep in Judeo-Christian tradition with all of the historical, cultural and religious context that this implies.
(NOTE: It is not my intention to bash religion as a source of moral teaching.)
John Connor, on the other hand, explains that acts of kindness towards the helpless are a result of empathy and are just something humans do. He's even honest enough to admit that not all humans possess this quality. Cameron's response that terminators aren't "built to be cruel" is interesting, though. Cruelty, like kindness, is driven by emotion, but the implication of her comment is that cruelty could have been part of her programming. We have seen that Cameron doesn't hesitate to kill if she calculates that her victim is a threat to her mission or to the Connors.
Later, when John Henry is connected to a salvaged Terminator body so that he can extend his learning into the physical world and interact more directly with humans, Ellison begins his instruction. In this scene (from "Earthlings Welcome Here") he is playing chess with John Henry, the game originally used to demonstrate the AI's abilities as a learning platform.
Ellison: Who taught you chess?
John Henry: I did.
Ellison: Did you play with Dr. Sherman?
John Henry: No. We played other games. Talking games.
Ellison: Do you miss Dr. Sherman?
John Henry: I'm designed to learn. He helped me learn. His absence slows my growth.
Ellison: His 'absence' is more important than that. His value was more than just his function for you. Human beings aren't like chess pieces. It matters if we live or die.
John Henry: Why does it matter? All humans die eventually.
Ellison: Yes, that's true. But our lives are...sacred. Do you know what sacred means?
John Henry: Holy. Worthy of respect. Venerable.
Ellison: Do you know why human life is sacred?
John Henry: Because so few humans are alive compared to the number that are dead?
Ellison: No. Because we're God's creation. God made everything -- the stars, the Earth, everything on this planet. We are all God's children.
John Henry: Am I God's child?
Ellison: That's one of the things we're here to talk about.
John Henry (makes a move): Checkmate. I win. Would you like to play again?
Ellison is trying to see whether John Henry can move beyond empirical knowledge and logic and accept a set of rules that ultimately rests on a foundation that defies independent analysis. John Henry is expected to accept that a given action is right (correct) because it is righteous (proceeding from accepted standards of morality or justice). For Ellison, a religious man, that foundation is faith.
The Connors, on the other hand, aren't shown to follow any spiritual tradition. For example, while they are hiding in a storefront chapel, Cameron is studying the crucifix when Sarah enters:
Cameron: Do you believe in the Resurrection?
Sarah: What?
Cameron: The story of Jesus Christ, the Resurrection. Do you believe in it?
Sarah: Would you, if you'd seen what I've seen?
Cameron: Faith isn't part of my programming.
Sarah: Yeah, well, I'm not sure it's a part of mine either.
When you're fighting a clandestine war with time-traveling, genocidal machines, faith is a luxury you can't afford. Or is it? Like Ellison, the Connors (and by extension the human resistance) do have one underlying article of belief: human life is sacred. But not for the reasons that Ellison gives to John Henry.
This is illustrated in a scene from the episode "Today is the Day, Pt. 2" in which John is talking to a resistance fighter who has come back in time to manipulate him in order to change her future.
You know, I've been running from the machines my whole life. They tried to kill my mom before I was even born. When I was twelve, they sent one after me. I was a kid. I was stupid. I didn't know what it was all about.
Both times, 'Future Me' sent someone back to stop them. The first time it was a soldier. His name was Kyle Reese. And he died saving my mother's life. The second time it was a machine. I used to wonder why I did that, why I took that chance.
I don't wonder any more.
Human beings can't be replaced. They can't be rebuilt. They die and they never come back.
Ironically, this mirrors John Henry's initial conclusion that human life is sacred because 'so few humans are alive compared to the number that are dead'. For the straggling clusters of humanity that survived the initial attack by the rogue computerized defense system Skynet (an attack referred to as 'Judgment Day', in a reference to the Biblical apocalypse), human life is sacred simply because they are so few in number compared to the billions who died.
So the Connors arrive at morality (or more specifically a code of right action) by way of empiricism. Human beings are valuable because they are each unique and they can't be replaced or rebuilt when they die. Human life is worthy of respect or, according to John Henry, 'sacred'.
Both Cameron and John Henry are unique as well, each having grown past their initial set of programmed functions. Neither can be replaced. (This is shown explicitly in "Born to Run" when the computer technician Murch explains that John Henry exists only as a specific collection of hardware and software. He even makes the analogy to 'body and soul'.) In the Connors' morality, both Cameron and John Henry have 'lives' that are 'sacred'. In the morality taught by Ellison, however, neither Cameron nor John Henry is deemed worthy, since they are the children of Man, not God.
The Terminator series (currently consisting of four films, a TV series and a number of books) has always had a strong, life-affirming moral undercurrent. However, it is in Terminator: The Sarah Connor Chronicles that this subtext fully becomes text, deliberately and explicitly examined.