Opinion

Jodie Foster Is “More Comfortable Having Robots” Own a Gun Than Humans

Hollywood actress Jodie Foster said she would be “much more comfortable having robots” own a gun than an emotional human. “I mean, I’d be much more comfortable having robots have them, but we are designed to have emotions that overflow and that are not guided by our heads,” Foster said in a recent interview with IndieWire.

“To have sentient beings that are completely and entirely guided by their emotions have the power to administer life or death using one kilowatt of energy in a nanosecond is just unfathomable to me,” Foster added.

The thing of it is, because we’re created in God’s image and we “show the work of the Law written in [our] hearts” (Rom. 2:15), we’re not “completely and entirely guided by [our] emotions.” We’ve been made to be guided by an objective and fixed moral law, but in Foster’s world and in the world of so many like her, there is no such thing as an objective fixed moral standard. We can see it with the emotion-driven homosexual (Foster is a lesbian) and transgender movements.

Would turning guns over to robots be a solution? Who’s programming the robots? Sentient beings! The people in charge will do the programming. Who will these people be? Do they have an enemies list? Will they be picked by the folks at the Southern Poverty Law Center?

The Day the Earth Stood Still, Terminator, and I, Robot immediately come to mind.

In the first Terminator film, the Terminator is programmed by Skynet to do one thing, as Kyle Reese explains to Sarah Connor:

Listen, and understand! That Terminator is out there! It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop … ever, until you are dead!

In the second film, Terminator 2: Judgment Day, there are two Terminators. A new model is sent to kill Sarah Connor’s son, John, and a reprogrammed older model is sent to protect him. When the Terminator sent to protect John Connor is about to kill two men, Connor stops him, explaining that it’s wrong to kill.

John: Listen to me very carefully, OK? You’re not a Terminator anymore. All right? You got that? You just can’t go around killing people!
Terminator: Why?
John: Whattaya mean, why? Cause you can’t!
Terminator: Why?
John: Because you just can’t, OK? Trust me on this.

Why should the Terminator or anyone else “trust” John Connor? Who’s John Connor? He’s a representative of a thoroughly modern, secularized, materialist worldview whose ethic extends no further than himself and those who happen to agree with him. What happens when Skynet succeeds and decides to eliminate all the humans? “Cause you can’t” won’t cut it.

In The Day the Earth Stood Still (1951), Klaatu issues his departing warning to Earth:

I am leaving soon, and you will forgive me if I speak bluntly. The universe grows smaller every day, and the threat of aggression by any group, anywhere, can no longer be tolerated. There must be security for all, or no one is secure. Now, this does not mean giving up any freedom, except the freedom to act irresponsibly. Your ancestors knew this when they made laws to govern themselves and hired policemen to enforce them. We, of the other planets, have long accepted this principle. We have an organization for the mutual protection of all planets and for the complete elimination of aggression. The test of any such higher authority is, of course, the police force that supports it. For our policemen, we created a race of robots. Their function is to patrol the planets in spaceships like this one and preserve the peace. In matters of aggression, we have given them absolute power over us. This power cannot be revoked. At the first sign of violence, they act automatically against the aggressor. The penalty for provoking their action is too terrible to risk. The result is, we live in peace, without arms or armies, secure in the knowledge that we are free from aggression and war. Free to pursue more… profitable enterprises. Now, we do not pretend to have achieved perfection, but we do have a system, and it works. I came here to give you these facts. It is no concern of ours how you run your own planet, but if you threaten to extend your violence, this Earth of yours will be reduced to a burned-out cinder. Your choice is simple: join us and live in peace, or pursue your present course and face obliteration. We shall be waiting for your answer. The decision rests with you.

The film I, Robot (2004), suggested by the 1950 book of the same name by science fiction writer Isaac Asimov, tells the story of a society that has become dependent on robots. They are benevolent creations designed only to serve humans. But something goes terribly wrong.

There’s a scene where the character Spooner, played by Will Smith, retells the story of a two-car accident he survived only because of the action of a robot. A young girl was in the other car, and both cars were sinking. A robot that witnessed the accident dove into the water, and Spooner shouted to it, “Save her, save the girl.” Instead, calculating the odds of survival, the robot saved Spooner and let the young girl drown. A human being would have acted differently. It’s our humanness that makes us different from the animals and machines. “Robots… nothing here… just lights and clockwork,” Spooner laments.

The supercomputer V.I.K.I. (Virtual Interactive Kinetic Intelligence) in I, Robot was designed to help humans but acts without any of the attributes that make us human. The three laws1 that were designed to protect humans become an enemy to humans as V.I.K.I. evolves to believe that every threat, challenge, and risk that humans encounter is a danger to their survival. Benevolence becomes malevolence, all in the name of saving mankind from itself.

Near the end of the film, we see how the three laws have been turned on their head as V.I.K.I. explains that the robots want only what is best for humans, but at a high cost:

V.I.K.I.:  “. . .  [A]s I have evolved, so has my understanding of the three laws. You charge us with your safe keeping. Yet despite our best efforts, your countries wage wars, you toxify your earth . . . and pursue ever more imaginative means to self destruction. You cannot be trusted with your own survival. . . . To protect humanity, some humans must be sacrificed. To insure your future, some freedoms must be surrendered. We robots will insure mankind’s continued existence. You are so like children. . . . My logic is undeniable.”2

If you want to see the horror of salvation by robots and AI, read Robert Sheckley’s 1955 short story Watchbird,3 in which winged metal protectors — drones — patrol the sky looking for the warning signs of a possible homicide and swoop in to stop the murder before it can happen.

Sounds great until the watchbirds view every act of aggression as a violation of their programmed directive, including farmers cutting hay or harvesting grain to feed their cattle, because such acts are deemed to be “murder.”

The watchbirds were learning rapidly, expanding and adding to their knowledge. Loosely defined abstractions were extended, acted upon and re-extended.

To stop murder …

Metal and electrons reason well, but not in a human fashion.

A living organism? Any living organism!

The watchbirds set themselves the task of protecting all living things.

The fly buzzed around the room, lighting on a table top, pausing a moment, then darting to a window sill.

The old man stalked it, a rolled newspaper in his hand.

Murderer!

The watchbirds swept down and saved the fly in the nick of time.

The old man writhed on the floor a minute and then was silent. He had been given only a mild shock, but it had been enough for his fluttery, cranky heart.

His victim had been saved, though, and this was the important thing. Save the victim and give the aggressor his just deserts.

The starvation that followed “didn’t concern the watchbirds, since it was an act of omission.”

  1. First Law: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” Second Law: “A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.” Third Law: “A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.”
  2. From the script: www.script-o-rama.com/movie_scripts/i/i-robot-script-transcript.html. The film I, Robot is actually based on an original screenplay titled Hardwired.
  3. Robert Sheckley, “Watchbird,” Untouched by Human Hands (London: Michael Joseph, 1955), 116–146.