Mathematician Lewis Carroll used Alice's adventures in Wonderland to introduce what may happen at the singularity of a black hole. Physicists still debate whether the singularity is a window to another universe, like Alice's mirror, or a deadly trap. Although we are not endangered by black holes, we may soon face another type of singularity, one derived from overwhelming scientific progress. Extrapolating from last century's scientific progress, this century will be marked by unprecedented technological breakthroughs. As technologies allow for ever greater increases in scientific output, the result will be an explosion in scientific progress that will reshape human civilization. We may reach a point, a singularity, at which humankind will undergo a deeper change than in the past 100,000 years. In addition to incalculable benefits, technology also yields weapons of increasing destructive power. Soon, knowledge may be the only resource necessary to build weapons of mass destruction. Given the human mind's capacity for both creation and destruction, the dilemma is whether the technological singularity will be a bridge to wonderland or the end of human civilization.
1. Introduction

About 150 years ago, Lewis Carroll told the adventures of young Alice in a magical place called Wonderland [1,2]. Mathematician Charles Dodgson, Carroll's real identity, used his children's stories to introduce what we now call "wormholes". The centre of a black hole, its singularity, might be a "wormhole": a link to other parts of the universe or even to different universes [3]. The mirror Alice crosses is a metaphor for what might happen if one attempted to reach the singularity of a black hole. Even today, the singularity of black holes remains a mystery to physicists. It could be a window to another universe, like Alice's mirror, or a deadly trap that consumes everything within its reach in fire and radiation. Fortunately, we are not endangered by black holes. Yet we may be on the verge of another type of singularity, one derived from an explosion in scientific progress. Herein I argue that we are about to reach a point at which humankind will undergo a deeper change than in the past 100,000 years. The dilemma is whether this singularity will be a bridge to wonderland or the end of human civilization.
2. The window of technology
Unprecedented scientific achievements marked the 20th century: humans walked on the moon; medical advances led to increases in longevity of up to 50%; we benefit from vastly improved transport and communications; and we have only begun to profit from the computer and genetic revolutions [4]. Yet the technologies of the past pale in comparison to what scientists expect to develop within the next few decades [5–8]. For example, if Moore's Law holds and computers continue to evolve at the current pace, we can expect artificial intelligences (AI) superior to the human brain in the first half of this century [9,10]. Nanotechnology, if it delivers on its promise of molecular assemblers, will also revolutionize humankind [11]. It is as if we are watching pictures of the future and, as at the dawn of the 20th century, even if these images are blurred, the envisioned developments will change the human condition forever.
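To make the Moore's Law extrapolation concrete, here is a back-of-envelope calculation. It is my own illustration, not part of the essay's argument, and the 18-month doubling period and 50-year horizon are assumed figures:

```python
# Back-of-envelope sketch (assumed figures, not from the essay):
# treat Moore's Law as a doubling of computing power every ~18 months
# and ask how much more computing power that implies over 50 years.
doubling_months = 18
years = 50

doublings = years * 12 / doubling_months  # number of doublings in 50 years
growth_factor = 2 ** doublings            # total multiplicative growth

print(f"{doublings:.1f} doublings -> ~{growth_factor:.1e}x more computing power")
```

Roughly 33 doublings, i.e. about a ten-billion-fold increase in raw computing power; whether raw computing power translates into intelligence is, of course, the contested part of the extrapolation.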
Most, if not all, technological breakthroughs of the present depend on past breakthroughs. For example, the Human Genome Project would not exist without computers. Similarly, future developments will depend on one another: nanotechnology will be impossible without faster computers, and faster computers will probably need nanotechnology to evolve. In today's information society, each scientific breakthrough feeds a positive feedback loop. If the pace of scientific progress continues to increase, we may reach a rate of technological progress at which it becomes impossible to predict even the immediate future. We will reach a type of singularity: a period of such rapid progress that our values and ideas will be but drops in the singularity's ocean of knowledge [6,12]. To quote Vernor Vinge: "A point where our old models are discarded and a new reality rules." One possibility, though not the only one, for a technological singularity is an AI superior to human intelligence. The presence of a superior intelligence would lead to scientific breakthroughs the human mind is incapable of coping with. Moreover, an advanced AI would produce new breakthroughs on ever shorter time scales, resulting in even more intelligent AI. The result would be an explosion in scientific progress beyond imagination, one that would reshape human civilization.
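The feedback argument can be caricatured mathematically. The sketch below is my own illustration, not part of the original essay, and the constants k and K0 are arbitrary: if knowledge K merely grows in proportion to itself (dK/dt = kK), growth is exponential but finite at all times; if each advance also accelerates the rate of advance (dK/dt = kK²), the solution diverges at a finite time, a literal mathematical singularity.

```python
# Toy model of the positive feedback loop (illustration only; the
# constants k and K0 are arbitrary assumptions, not data).
# dK/dt = k * K**2 has the exact solution K(t) = K0 / (1 - k*K0*t),
# which blows up at the finite time t* = 1 / (k * K0).

def knowledge(t, k=0.1, K0=1.0):
    """Exact solution of dK/dt = k*K^2 with K(0) = K0."""
    return K0 / (1.0 - k * K0 * t)

t_star = 1.0 / (0.1 * 1.0)  # divergence time for k=0.1, K0=1.0
for t in [0.0, 5.0, 9.0, 9.9, 9.99]:
    print(f"t = {t:5.2f}  K = {knowledge(t):10.1f}")
```

As t approaches t* = 10, K grows without bound: under this caricature, prediction beyond the divergence point is meaningless, which is the sense of "singularity" used above.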
Although the numbers of scientists [13,14], scientific papers [15], and patent applications [16] have been increasing worldwide for decades (see [17] for an overview), and although the trend is for scientific breakthroughs to occur at an ever-increasing pace, it is impossible to be certain of any future event. Yet all it takes is the simple assumption that the technological progress of the past decades will continue through the 21st century to make the singularity inevitable, probably within the next few decades.
It is as if we are watching from a window the possibilities of the future, the promises and ambitions of Science; we are approaching a singular point in human history driven by a massive increase in knowledge. Yet just as technology can save lives and enrich our dreams, it can destroy lives and generate nightmares. The evolution of technology over the next century will decide whether the human mind is prepared to survive in the universe. Are we ready to evolve as a species and spread our civilization, or will we face extinction?
3. The mirror of the future
In 1945, when scientists detonated the first atomic bomb, they theorized that there was a slight possibility that the nuclear explosion would destroy planet Earth [18]. It was the first time a technological breakthrough placed humanity in danger. As technology evolves, destructive and military applications follow close behind. Weapons capable of destroying humankind have existed for decades. The problem is that as technology progresses and knowledge spreads, these weapons are becoming easier to build. Soon, information may be the only resource necessary to build weapons of mass destruction. In addition, individual weapons of mass destruction are becoming more powerful. For instance, a cobalt bomb is a hydrogen bomb with a cobalt tamper: a single cobalt bomb can, if powerful enough, destroy civilization [19]. Others have speculated on the dangers of future technologies, such as AI and nanotechnology [20,21]. Whether their fears are unfounded remains to be seen; yet the trend is for weapons of mass destruction to become accessible to everyone. Military technology is ultimately democratic: as history teaches us, from gunpowder to chemical warfare, all weapons eventually become available to the determined general. For example, the US banned papers on nuclear technology in 1940, but today many nations possess nuclear arsenals. The window of technology is actually a mirror of the future, because when we imagine future technologies we must picture ourselves using them, including advanced weapons of mass destruction. Unfortunately, for centuries the ingenious human mind has also been a source of death, destruction, genocide, and much foolishness. Despite the increase in knowledge, we remain human. To quote Albert Einstein: "The release of atom power has changed everything except our way of thinking." The human mind is developing ever more powerful technologies, but it is the same human mind that caused the Holocaust and countless other atrocities.
4. Alice’s dilemma
The dilemma we are to face within a reasonable timescale (according to Vernor Vinge, within the next 30 years [12]) is whether the technological singularity will open a glorious future for humankind or destroy us. Can scientific progress improve the human condition and civilization beyond our dreams? Or will it merely release more powerful weapons for the next generation of terrorists [22] and serial killers to employ in the destruction of civilization? Considering the large differences between nations, it is unlikely that we can control or prohibit technology, a position defended by some [20,21]. If a singularity is to happen, we cannot stop it. Yet we cannot allow weapons of mass destruction to be widely disseminated, or we will face extinction. How can we guarantee that Science leads us to wonderland and not to hell? Unfortunately, this author has no solution to the dilemma. There have been proposals, from a loss-of-privacy, transparent-society scenario, to a despotic system, or even the colonization of extraterrestrial planets; sadly, the solutions are "unsatisfactory" [23,24]. Hopefully, scientists and policy makers will find a solution before, for example, a terrorist organization builds a cobalt bomb. A discussion of technological progress is necessary: not to ban or hinder Science, but to make sure humans are not just a blink in the eyes of the universe.
Acknowledgements

J.P. de Magalhães is funded by the FCT, Portugal. Thanks to everyone at the Extropian Institute for debating these issues. Further thanks to John Barry of University College, Cork, for his opinions on the manuscript.
References

[1] L. Carroll, Alice's Adventures in Wonderland, Macmillan, London, 1865.
[2] L. Carroll, Through the Looking-Glass and What Alice Found There, Macmillan, London, 1872.
[3] J.S. Al-Khalili, Black Holes, Wormholes and Time Machines, The Institute of Physics, London, 1999.
[4] G. Piel, The Age of Science: What Scientists Learned in the Twentieth Century, Basic Books, New York, 2001.
[5] J. Brockman (Ed.), The Next Fifty Years: Science in the First Half of the Twenty-First Century, Vintage Books, New York, 2002.
[6] D. Broderick, The Spike, Forge, New York, 2001.
[7] M. Kaku, Visions, Oxford Paperbacks, New York, 1999.
[8] D. Mulhall, Our Molecular Future: How Nanotechnology, Robotics, Genetics, and Artificial Intelligence Will Transform Our World, Prometheus Books, New York, 2002.
[9] R. Kurzweil, The Age of Spiritual Machines, Texere Publishing, New York, 2001.
[10] H. Moravec, Mind Children, Harvard Univ. Press, Cambridge, 1990.
[11] E.K. Drexler, Engines of Creation, Anchor Press, New York, 1986.
[12] V. Vinge, The Coming Technological Singularity: How to Survive in the Post-Human Era. Paper presented at the NASA VISION-21 Symposium, Westlake, OH, 30–31 March, 1993. URL: http://www-cse.ucsd.edu/users/goguen/misc/singularity.html.
[13] UNESCO, World Science Report 1998, UNESCO, Paris, 1998.
[14] UNESCO, The State of Science and Technology in the World: 1996–1997, The UNESCO Institute for Statistics, Quebec, 2001. URL: http://www.uis.unesco.org/en/pub/doc/WS_report_2001.pdf.
[15] ISI, Web of Knowledge: National Science Indicators, 1981–2001.
[16] WIPO, WIPO Gazette of International Marks: Statistical Supplement of 2001, 2002. URL: http://www.wipo.org/madrid/en/stat/pdf/stat2001.pdf.
[17] National Science Board, Science and Engineering Indicators—2002, National Science Foundation NSB-02-1, Arlington, VA, 2002. URL: http://www.nsf.gov/sbe/srs/seind02/start.htm.
[18] L. Groves, Now It Can Be Told: The Story of the Manhattan Project, Da Capo Press, New York, 1962.
[19] H.A. Bethe, H.S. Brown, F. Seitz, L. Szilard, The Facts About the Hydrogen Bomb (University of Chicago Round Table), Univ. of Chicago Press, Chicago, 1950.
[20] B. Joy, Why the Future Doesn't Need Us, Wired Magazine, April 2000. URL: http://www.wired.com/wired/archive/8.04/joy_pr.html.
[21] T. Kaczynski, The Unabomber's Manifesto: Industrial Society and Its Future, 1995. URL: http://www.unabombertrial.com/manifesto/.
[22] J. Campbell, Weapons of Mass Destruction & Terrorism, Inter-Pact Press, Seminole, 1997.
[23] R. Heinlein, Solution unsatisfactory, in: Expanded Universe, Ace Books, New York, 1980, pp. 96–144.
[24] J.P. de Magalhães, The one-man rule, The Futurist 36 (6) (2002) 41–45.