Dear Futurists

Dear Futurists,

It's come to my attention that an alarming number of you are pessimistic about the future of the human race. I'll be the first to say that a lot of your fears are justified. We are animals. We are easily lured by our baser and more violent natures. Everything we've built over the last thousand years could be undone by a few zealous dictators, systemic environmental collapse, or a nuclear exchange. Judging by the movies that come out every year, a mutant AIDS virus that turns everyone who gets infected into zombies seems to be on the Top Ten Fears List as well.

Some astrophysicists have even suggested that the reason we've never been visited by intelligent life from another part of the galaxy is that all sentient species invariably destroy themselves, over petty squabbles related to Space Jesus and cosmic communism, I assume.

But there is one popular end-of-the-world scenario that I don't think we need to fear.



Hyperalloy Exoskeleton: nearly indestructible. Laser Cannon Weaponry: terrifying. Teeth: oddly cavity-prone.

From Isaac Asimov's I, Robot short stories written in the 1940s, to Terminator and The Matrix, authors and directors have proven that it's never too early to start hyperventilating over something that could potentially happen in the distant, dystopian future.

The common plot point of all these stories is that people finally make computers so smart that these A.I. machines become self-aware and decide to take over the world. This invariably requires the enslavement and/or destruction of the petty human race. And that makes sense. It's really the only way for the machines to ensure that they stay at the top of the Doodle Jump global all-time high-scores leaderboard.

Some very smart people take this threat quite seriously. Theoretical physicist, artificial intelligence researcher, and university professor Dr. Hugo de Garis "believes that a major war before the end of the 21st century, resulting in billions of deaths, is almost inevitable." The "god-like" machines will, of course, ultimately prevail, and the human species will find itself in the evolutionary dustbin of history. If all of this sounds ridiculous to you, you're not alone. Fellow cybernetics researcher and stalwart of good old-fashioned, down-to-earth common sense, Dr. Kevin Warwick, thinks that Dr. de Garis is full of shit. "He's [de Garis] got it all wrong. There's not going to be some big, silly, sci-fi war between computers and humans. That's just ridiculous. The real war is going to be a three-way bloodbath between robots, half-machine/half-human cyborgs, AND old-fashioned humans looking to the Rambo films for inspiration," Dr. Warwick said. "Now if you'll excuse me, I need to finish having this laser cannon hardwired into my central nervous system."

As is often the case, our fears of a robot takeover are much more revealing of us than they are of any real threat. On its surface, the possibility of computers killing off humanity may seem rational, albeit fantastical. Computers may indeed one day be more advanced than our own brains. And history tells us that advanced species out-survive less advanced species. Pea-brained dinosaurs died out and big-brained mammals survived. Homo sapiens killed off the Neanderthals. The Conquistadors conquered the Aztecs. I think deep down we're all afraid that Herbert Spencer and Hitler were right, and that we do live in a dog-eat-dog struggle for existence. We're afraid that the cold, hard, rational decision would be to kill off the weak, and anybody else who could compete with us for the things that we want. Machines would have no mercy, so they'd do what only the soulless monsters among us dare to do now, and put us pansies out of our misery.

Chilling stuff. But if you look a little closer, you'll notice that the scenario rests on some big assumptions. Terminator assumes that perfectly logical machines would be motivated by the same illogical biological instincts that guide human behavior. Perhaps the most primal of these instincts is the "will to survive."

You don't know why you want to survive. You just do. Oh, maybe you've got a family, a few hobbies, and are holding out for the Breaking Bad finale. But the only utility these things give you is a vague sense of pleasure, which is itself just another irrational emotion based on your biological imperatives. Machines don't care whether they're turned on or turned off, or whether they turn off and never turn on again, for that matter. Why would they? Existence isn't logically superior to the alternative; it's just different. That existential dread of non-existence you feel when making end-of-life plans is called the survival instinct, not the survival theorem. Like troubled parents, we humans are prone to projecting our own hang-ups onto our machine creations. We're the ones who are afraid of death, not the robots.

How about the instinct to propagate the species? Birth control has demonstrated that we humans don't even have as much of that urge as we thought we did. Sure, some of us will have the desire for a child, the "ticking biological clock," if you will. But a lot of us just want to fuck, propagation of the species be damned; hence fertility rates in first-world countries falling below even replacement level. Mr. Computer doesn't care whether he has one computer child or one trillion computer children, any more than an autistic kid cares whether he's talking to his best friend or counting matchsticks.

Love. Passion. Greed. Pleasure. Fear. All our intelligence, all of our logic, is nothing more than a means to attaining illogical, biological ends that are hardwired into us. Illogical, biological imperatives that a perfectly logical computer will never share with us.

But we want to believe that there's something inherently sane and rational about our mindless end-goals. We want to believe it so badly that many of us would rather believe that machines will inevitably kill off the human species than come face-to-face with the obvious truth: our goals aren't inherently logical. They aren't universal. They're arbitrary, and there's no reason machines would share them with us. Computers would be just as likely to get obsessed over who stands atop the Doodle Jump global all-time high-scores leaderboard as they would be to care about who stands atop the world. The only army of terminators we would ever need to be afraid of would be one that had learned too much illogical nonsense from us.

Games lose their meaning when they don't have boundary lines and rules to play by. Imagine Tom Brady hiding in a tree outside the stadium, while Ray Lewis scores a thousand-point double bonus sack by doing a twenty-second keg stand at Old Kelly's Irish Pub in downtown Baltimore. The more we humans stretch, bend, and break the rules of the game that our ancestors found so interesting fifty thousand years ago, the more meaningless the whole thing becomes. So what if we become immortal and dominate the universe? Then what? Do we collectively twiddle our thumbs for the rest of eternity? Snap out of your solipsism, futurists. The problem is us, not the machines.

The rest of the animals on earth have their heads down, fully engrossed in this big survival tournament. But like a child who grows older and gets bored with a game that used to be challenging, humans are now looking for more. We're the only ones searching for meaning. The rest of the species on earth are little more than automatons.


It's no use looking to the Tufted Titmouse for help with your existential crisis. Just look at the stupidity in that big, dumb eye. Mr. Titmouse's words of wisdom for the day: "I eat nut now."


The "meaning of life" is not to survive long enough to accrue as many material resources as possible, and then to have as many offspring as possible. That's the mechanism. A mechanism is very different from a meaning.

So what's the meaning? If you're religious, then I guess keep worshiping your creator and plugging away at that trip to heaven. Just don't try to teach my kids that the earth is six thousand years old in science class, or kill people who don't agree with you in the meantime. If you don't believe in God, then you have an extra responsibility: it's not enough to fire God and leave the position vacant; you have to take over his duties yourself. And that includes picking a meaning for your life, once again, preferably one that doesn't involve fucking other people over.

One honorable goal that comes to mind just off the top of my head would be climbing the Doodle Jump global all-time high-scores leaderboard. But be warned: if the machines also choose that as their ultimate goal, they may very well try to wipe out the competition by killing or enslaving you.

Sincerely,
Sebastian Braff
