
Response to Bill Joy

Is Technological Advancement Worth the Risks?

Copyright © 2000 by Robert J. Sawyer
All Rights Reserved


First published in The Globe and Mail: Canada's National Newspaper, Thursday, March 16, 2000.

A translation of this article into Romanian is also available.


Those who pooh-pooh William Shatner's acting should see his soliloquy from the Star Trek episode "Return to Tomorrow." Aliens offer the crew of the Enterprise fantastic advances in technology in exchange for letting the aliens inhabit the bodies of three crew members for a few days.

Dr. McCoy, the Luddite, points out the downsides, but Captain Kirk wins him over with his eloquence: "Risk is our business," he says after enumerating advances science has already made by throwing caution to the wind. "That's what this starship is all about; that's why we're aboard her."

Shatner is so terrific, actually, that one forgets that the owners of the three borrowed bodies are almost killed, one of the aliens commits murder, two others die by suicide, and no scientific wonders are ever bestowed.

Despite all that, we're left thinking that Kirk was right to push for the advancement of science, the risks be damned. Anything less would be a betrayal of the human spirit.

These days, we don't have to look to aliens to provide technologies indistinguishable from magic. Such powers are now within our own grasp, apples of new knowledge seemingly ripe for the plucking. But was Kirk right? Is taking risks for the mere possibility of advancement worth it?

Bill Joy confesses to having grown up watching Captain Kirk and reading science fiction. And, like many who did, Joy has gone on to a technological career: he is chief scientist at Sun Microsystems, a giant Silicon Valley firm.

This week, in Wired magazine, he published an 11,000-word manifesto entitled "Why the Future Doesn't Need Us" that, distilled to its essence, repeats the mantra of much 1950s science fiction: "There are some things Man was not meant to know."

Joy is worried about three nascent technologies: artificial intelligence (AI), genetic engineering, and nanotechnology. Is he right to be afraid of them? And, even if he is, is there anything we can do to reduce the risks?

His concern about AI is simple: if we make machines that are more intelligent than we are, why on earth would they want to be our slaves?

In this, I believe he is absolutely right: thinking computers pose a real threat to the continued survival of our species. Many AI experts — including Hans Moravec, founder of the world's largest robotics lab, at Carnegie Mellon University — believe that humanity's job is to manufacture its own successors.

Sure, Moravec says, we may shed a tear for some ineffable biological qualities that might be lost, but in the end Homo sapiens will be supplanted by machines. Since that's inevitable, he feels, we might as well press on with the research that will lead to it.

Joy says no: we can, and perhaps should, put on the brakes. I agree.

Intelligence is an emergent property of complex systems; it arises spontaneously if conditions are right. Anatomically modern humans first appeared 100,000 years ago, but they were unencumbered by art, culture, religion, or abstract thought for 60,000 years.

Then, with no apparent physical change in their brains, consciousness emerged. Suddenly, these same people were painting caves, developing religious rituals, and more.

The emergence of computer-based consciousness may happen the same way: arising spontaneously out of something complex we built, perhaps for another purpose (World Wide Web, anyone?).

It's not a new idea; Arthur C. Clarke first put it forward more than three decades ago in his 1965 story "Dial F for Frankenstein."

Other science-fiction authors have sounded this warning bell. William Gibson's 1984 novel Neuromancer features an organization called Turing whose job is to prevent the emergence of AI. And in my own 1998 novel Factoring Humanity, a thinking computer created at the University of Toronto commits suicide rather than risk turning against its human father.

I'm less concerned, though, about Joy's other two bugbears: genetic engineering and nanotechnology. Both, really, are forms of manipulation at the submolecular level: genetic engineering rearranges the atoms in a string of DNA so that a modified lifeform is produced.

And nanotechnology simply takes that a step further, proposing that we soon will be able to tear down and rebuild any molecules we want, turning, for instance, a pile of bricks into a mound of gold, or a giant three-cheese lasagna, or anything else.

Joy's fear is that genetic engineering will be used to create diseases that target specific ethnicities. An Arab and an Israeli don't just differ politically; they differ genetically, too, and Joy fears it will soon be easy enough to produce a virus that will wipe out only one or the other.

Possible? Yes. But, then, so is a plague that affects only those humans with genes for antisocial behavior (first-order sorting: check for a Y chromosome); you can bet some self-styled Good Samaritan will release something like this, as well.

But, despite such scenarios, I find it unconscionable to tell a boy with leukemia or a woman with diabetes that we're not going to do any more genetic research. The cures for diseases — including the one known as aging that gets us all if nothing else does — will come only from manipulating DNA.

Joy also thinks we should have a moratorium on nanotechnology, since a nanotech machine can produce anything — including copies of itself — from whatever raw materials are at hand.

He writes, "An immediate consequence of the Faustian bargain in obtaining the great power of nanotechnology is that we run a grave risk — the risk that we might destroy the biosphere on which all life depends." Indeed, if just one little self-replicating doodad that turns water into wine escapes, we might see it and its spawn destroy our ecosystem, and us along with it.

But nanotechnology will also allow us to provide for all the material needs of the entire human race: as much clean air, water, food, clothing, shelter, medicine, and entertainment as anyone could ever want.

It will be impossible to keep this technology from the masses: only a single microscopic machine that can convert raw materials into other forms needs to be smuggled out of the lab.

Soon, everyone will have a replicator, and the economic reasons for war, oppression, and figurative and literal slavery will disappear. Supply will always equal demand in everything from basic essentials to elaborate equipment, costs will be zero, and poverty will vanish.

Captain Kirk said, "Risk is our business." I don't think so; I think improving the human condition is our business. Other minds — silicon consciousnesses — won't share that mission statement, and we are right to avoid creating them. But genetic engineering and nanotechnology will allow us to so vastly improve humanity's lot that we'd be fools to turn our backs on them — despite the risks.


[2000 bionote] Robert J. Sawyer's latest science-fiction novel is Calculating God. He frequently appears on Discovery Channel Canada talking about the future.

[2015 bionote] Robert J. Sawyer has won the Hugo, Nebula, and John W. Campbell Memorial Awards, all for best science-fiction novel of the year; the ABC TV series FlashForward was based on his Aurora Award-winning novel of the same name.


More Good Reading

Rob's keynote address on AI and science fiction

More thoughts about AI

Rob's op-ed piece on Stephen Hawking's call to colonize space

Rob's op-ed piece on Michael Crichton blending fact and fiction

Rob's op-ed piece on the private sector in space

More Futurism articles

