The Singularity Is Near_ When Humans Transcend Biology - Part 5

While human neurons are wondrous creations, we wouldn't (and don't) design computing circuits using the same slow methods. Despite the ingenuity of the designs evolved through natural selection, they are many orders of magnitude less capable than what we will be able to engineer. As we reverse engineer our bodies and brains, we will be in a position to create comparable systems that are far more durable and that operate thousands to millions of times faster than our naturally evolved systems. Our electronic circuits are already more than one million times faster than a neuron's electrochemical processes, and this speed is continuing to accelerate.

Most of the complexity of a human neuron is devoted to maintaining its life-support functions, not its information-processing capabilities. Ultimately, we will be able to port our mental processes to a more suitable computational substrate. Then our minds won't have to stay so small.

The Limits of Computation

If a most efficient supercomputer works all day to compute a weather simulation problem, what is the minimum amount of energy that must be dissipated according to the laws of physics? The answer is actually very simple to calculate, since it is unrelated to the amount of computation. The answer is always equal to zero.

-EDWARD FREDKIN, PHYSICIST45

We've already had five paradigms (electromechanical calculators, relay-based computing, vacuum tubes, discrete transistors, and integrated circuits) that have provided exponential growth to the price-performance and capabilities of computation. Each time a paradigm reached its limits, another paradigm took its place. We can already see the outlines of the sixth paradigm, which will bring computing into the molecular third dimension. Because computation underlies the foundations of everything we care about, from the economy to human intellect and creativity, we might well wonder: are there ultimate limits to the capacity of matter and energy to perform computation? If so, what are these limits, and how long will it take to reach them?

Our human intelligence is based on computational processes that we are learning to understand. We will ultimately multiply our intellectual powers by applying and extending the methods of human intelligence using the vastly greater capacity of nonbiological computation. So to consider the ultimate limits of computation is really to ask: what is the destiny of our civilization?

A common challenge to the ideas presented in this book is that these exponential trends must reach a limit, as exponential trends commonly do. When a species happens upon a new habitat, as in the famous example of rabbits in Australia, its numbers grow exponentially for a while. But it eventually reaches the limits of that environment's ability to support it. Surely the processing of information must have similar constraints. It turns out that, yes, there are limits to computation based on the laws of physics. But these still allow for a continuation of exponential growth until nonbiological intelligence is trillions of trillions of times more powerful than all of human civilization today, contemporary computers included.

A major factor in considering computational limits is the energy requirement. The energy required per MIPS for computing devices has been falling exponentially, as shown in the following figure.46 However, we also know that the number of MIPS in computing devices has been growing exponentially. The extent to which improvements in power usage have kept pace with processor speed depends on the extent to which we use parallel processing. A larger number of less-powerful computers can inherently run cooler because the computation is spread out over a larger area. Processor speed is related to voltage, and the power required is proportional to the square of the voltage. So running a processor at a slower speed significantly reduces power consumption. If we invest in more parallel processing rather than faster single processors, it is feasible for energy consumption and heat dissipation to keep pace with the growing MIPS per dollar, as the figure "Reduction in Watts per MIPS" shows.
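The power argument above can be made concrete with a back-of-the-envelope sketch. This is not a calculation from the book: it uses the standard first-order CMOS switching-power relation P = C * V^2 * f, together with the common simplifying assumption that supply voltage must scale with clock frequency, so that power grows roughly as the cube of frequency.

```python
# Back-of-the-envelope sketch (not from the book): first-order CMOS
# dynamic power is P = C * V^2 * f. Assuming voltage scales with
# frequency, splitting work across slower parallel cores saves power.

def dynamic_power(capacitance, voltage, frequency):
    """First-order CMOS switching power: P = C * V^2 * f."""
    return capacitance * voltage**2 * frequency

# One fast core at frequency F and voltage V (normalized units):
C, V, F = 1.0, 1.0, 1.0
fast_core = dynamic_power(C, V, F)

# Two cores at half the frequency and (by assumption) half the voltage
# deliver the same aggregate throughput:
two_slow_cores = 2 * dynamic_power(C, V / 2, F / 2)

ratio = two_slow_cores / fast_core   # 2 * (1/2)^2 * (1/2) = 0.25
```

Under these assumptions the parallel design does the same work for a quarter of the power, which is the trade-off the text describes.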

This is essentially the same solution that biological evolution developed in the design of animal brains. Human brains use about one hundred trillion computers (the interneuronal connections, where most of the processing takes place). But these processors are very low in computational power and therefore run relatively cool.

Until just recently Intel emphasized the development of faster and faster single-chip processors, which have been running at increasingly high temperatures. Intel is gradually changing its strategy toward parallelization by putting multiple processors on a single chip. We will see chip technology move in this direction as a way of keeping power requirements and heat dissipation in check.47

Reversible Computing. Ultimately, organizing computation with massive parallel processing, as is done in the human brain, will not by itself be sufficient to keep energy levels and resulting thermal dissipation at reasonable levels. The current computer paradigm relies on what is known as irreversible computing, meaning that we are unable in principle to run software programs backward. At each step in the progression of a program, the input data is discarded (erased) and the results of the computation pass to the next step. Programs generally do not retain all intermediate results, as that would use up large amounts of memory unnecessarily. This selective erasure of input information is particularly true for pattern-recognition systems. Vision systems, for example, whether human or machine, receive very high rates of input (from the eyes or visual sensors) yet produce relatively compact outputs (such as identification of recognized patterns). This act of erasing data generates heat and therefore requires energy. When a bit of information is erased, that information has to go somewhere. According to the laws of thermodynamics, the erased bit is essentially released into the surrounding environment, thereby increasing its entropy, which can be viewed as a measure of information (including apparently disordered information) in an environment. This results in a higher temperature for the environment (because temperature is a measure of entropy).
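The heat cost of erasure described above has a well-known quantitative floor, Landauer's limit of kT ln 2 joules per erased bit. The sketch below computes it from standard physical constants (the constants and the example erasure rate are mine, not figures from the book).

```python
import math

# Landauer's principle: erasing one bit dissipates at least k*T*ln(2)
# joules into the environment. Constants are standard physical values.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # room temperature, kelvin

energy_per_erased_bit = k_B * T * math.log(2)   # ~2.9e-21 joules

# Illustrative: heat from irreversibly erasing 10^18 bits per second
# (a hypothetical rate, chosen only for scale):
bits_erased_per_second = 1e18
watts = energy_per_erased_bit * bits_erased_per_second   # a few milliwatts
```

The per-bit figure is tiny, which is why erasure heat only becomes a binding constraint at the extreme computational densities this chapter considers.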

If, on the other hand, we don't erase each bit of information contained in the input to each step of an algorithm but instead just move it to another location, that bit stays in the computer, is not released into the environment, and therefore generates no heat and requires no energy from outside the computer.

Rolf Landauer showed in 1961 that reversible logical operations such as NOT (turning a bit into its opposite) could be performed without putting energy in or taking heat out, but that irreversible logical operations such as AND (generating bit C, which is a 1 if and only if both inputs A and B are 1) do require energy.48 In 1973 Charles Bennett showed that any computation could be performed using only reversible logical operations.49 A decade later, Ed Fredkin and Tommaso Toffoli presented a comprehensive review of the idea of reversible computing.50 The fundamental concept is that if you keep all the intermediate results and then run the algorithm backward when you've finished your calculation, you end up where you started, have used no energy, and generated no heat. Along the way, however, you've calculated the result of the algorithm.
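Fredkin's best-known reversible gate, the controlled swap, makes these ideas tangible. The sketch below is a minimal simulation (my own illustration, not code from the book): applying the gate twice restores its inputs, and fixing one input to 0 embeds an AND gate inside it, with the "garbage" outputs retained rather than erased.

```python
def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: if control bit c is 1, swap a and b.
    Reversible: the gate is its own inverse."""
    if c == 1:
        a, b = b, a
    return c, a, b

# Reversibility: applying the gate twice recovers the original bits.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits

# Universality sketch: with the middle input fixed to 0, the middle
# output equals AND(x, y), so ordinary irreversible logic can be
# embedded in reversible gates.
def and_gate(x, y):
    _, out, _ = fredkin(x, 0, y)
    return out
```

The extra outputs that and_gate discards are exactly the intermediate results a reversible computer would keep so the whole computation can be undone.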

How Smart Is a Rock? To appreciate the feasibility of computing with no energy and no heat, consider the computation that takes place in an ordinary rock. Although it may appear that nothing much is going on inside a rock, the approximately 10^25 (ten trillion trillion) atoms in a kilogram of matter are actually extremely active. Despite the apparent solidity of the object, the atoms are all in motion, sharing electrons back and forth, changing particle spins, and generating rapidly moving electromagnetic fields. All of this activity represents computation, even if not very meaningfully organized.

We've already shown that atoms can store information at a density of greater than one bit per atom, such as in computing systems built from nuclear magnetic-resonance devices. University of Oklahoma researchers stored 1,024 bits in the magnetic interactions of the protons of a single molecule containing nineteen hydrogen atoms.51 Thus, the state of the rock at any one moment represents at least 10^27 bits of memory.

In terms of computation, and just considering the electromagnetic interactions, there are at least 10^15 changes in state per bit per second going on inside a 2.2-pound rock, which effectively represents about 10^42 (a million trillion trillion trillion) calculations per second. Yet the rock requires no energy input and generates no appreciable heat.
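The rock's figures follow directly from multiplying the quantities in the text, as this sketch of the order-of-magnitude arithmetic shows (the 100 bits-per-atom value is my rounding of the roughly fifty-plus bits per atom cited earlier).

```python
# Order-of-magnitude check of the rock figures in the text.
atoms_per_kg = 1e25        # ~ten trillion trillion atoms in a kilogram
bits_per_atom = 100        # rounded from the >50 bits/atom demonstrated
memory_bits = atoms_per_kg * bits_per_atom            # ~10^27 bits

state_changes_per_bit_per_s = 1e15
ops_per_second = memory_bits * state_changes_per_bit_per_s   # ~10^42 cps
```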

Of course, despite all this activity at the atomic level, the rock is not performing any useful work aside from perhaps acting as a paperweight or a decoration. The reason for this is that the structure of the atoms in the rock is for the most part effectively random. If, on the other hand, we organize the particles in a more purposeful manner, we could have a cool, zero-energy-consuming computer with a memory of about a thousand trillion trillion bits and a processing capacity of 10^42 operations per second, which is about ten trillion times more powerful than all human brains on Earth, even if we use the most conservative (highest) estimate of 10^19 cps.52

Ed Fredkin demonstrated that we don't even have to bother running algorithms in reverse after obtaining a result.53 Fredkin presented several designs for reversible logic gates that perform the reversals as they compute and that are universal, meaning that general-purpose computation can be built from them.54 Fredkin goes on to show that the efficiency of a computer built from reversible logic gates can be designed to be very close (at least 99 percent) to the efficiency of ones built from irreversible gates. He writes:

it is possible to ... implement ... conventional computer models that have the distinction that the basic components are microscopically reversible. This means that the macroscopic operation of the computer is also reversible. This fact allows us to address the ... question ... "what is required for a computer to be maximally efficient?" The answer is that if the computer is built out of microscopically reversible components then it can be perfectly efficient. How much energy does a perfectly efficient computer have to dissipate in order to compute something? The answer is that the computer does not need to dissipate any energy.55

Reversible logic has already been demonstrated and shows the expected reductions in energy input and heat dissipation.56 Fredkin's reversible logic gates answer a key challenge to the idea of reversible computing: that it would require a different style of programming. He argues that we can, in fact, construct normal logic and memory entirely from reversible logic gates, which will allow the use of existing conventional software-development methods.

It is hard to overstate the significance of this insight. A key observation regarding the Singularity is that information processes-computation-will ultimately drive everything that is important. This primary foundation for future technology thus appears to require no energy.

The practical reality is slightly more complicated. If we actually want to find out the results of a computation-that is, to receive output from a computer-the process of copying the answer and transmitting it outside of the computer is an irreversible process, one that generates heat for each bit transmitted. However, for most applications of interest, the amount of computation that goes into executing an algorithm vastly exceeds the computation required to communicate the final answers, so the latter does not appreciably change the energy equation.

However, because of essentially random thermal and quantum effects, logic operations have an inherent error rate. We can overcome errors using error-detection and error-correction codes, but each time we correct a bit, the operation is not reversible, which means it requires energy and generates heat. Generally, error rates are low. But even if errors occur at the rate of, say, one per 10^10 operations, we have only succeeded in reducing energy requirements by a factor of 10^10, not in eliminating energy dissipation altogether.
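The factor-of-10^10 claim can be sketched quantitatively: if every corrected bit costs roughly one Landauer erasure, the average dissipation per operation is the error rate times kT ln 2. The constants below are standard values; the framing is my illustration of the text's point, not a calculation from the book.

```python
import math

# Energy floor set by error correction: each corrected bit is an
# irreversible erasure costing ~k*T*ln(2) joules.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # kelvin
landauer = k_B * T * math.log(2)       # J per erased bit

error_rate = 1e-10                     # one error per 10^10 operations
energy_per_op = error_rate * landauer  # average dissipation per operation

# Relative to paying the Landauer cost on every operation, reversible
# computing with this error rate cuts energy by a factor of 10^10,
# but not to zero:
savings_factor = landauer / energy_per_op
```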

As we consider the limits of computation, the issue of error rate becomes a significant design issue. Certain methods of increasing computational rate, such as increasing the frequency of the oscillation of particles, also increase error rates, so this puts natural limits on the ability to perform computation using matter and energy.

Another important trend with relevance here will be the move away from conventional batteries toward tiny fuel cells (devices storing energy in chemicals, such as forms of hydrogen, which is combined with available oxygen). Fuel cells are already being constructed using MEMS (microelectromechanical systems) technology.57 As we move toward three-dimensional, molecular computing with nanoscale features, energy resources in the form of nano-fuel cells will be widely distributed throughout the computing medium among the massively parallel processors. We will discuss future nanotechnology-based energy technologies in chapter 5.

The Limits of Nanocomputing. Even with the restrictions we have discussed, the ultimate limits of computers are profoundly high. Building on work by University of California at Berkeley professor Hans Bremermann and nanotechnology theorist Robert Freitas, MIT professor Seth Lloyd has estimated the maximum computational capacity, according to the known laws of physics, of a computer weighing one kilogram and occupying one liter of volume (about the size and weight of a small laptop computer), what he calls the "ultimate laptop."58

The potential amount of computation rises with the available energy. We can understand the link between energy and computational capacity as follows. The energy in a quantity of matter is the energy associated with each atom (and subatomic particle). So the more atoms, the more energy. As discussed above, each atom can potentially be used for computation. So the more atoms, the more computation. The energy of each atom or particle grows with the frequency of its movement: the more movement, the more energy. The same relationship exists for potential computation: the higher the frequency of movement, the more computation each component (which can be an atom) can perform. (We see this in contemporary chips: the higher the frequency of the chip, the greater its computational speed.)

So there is a direct proportional relationship between the energy of an object and its potential to perform computation. The potential energy in a kilogram of matter is very large, as we know from Einstein's equation E = mc^2. The speed of light squared is a very large number: approximately 10^17 meter^2/second^2. The potential of matter to compute is also governed by a very small number, Planck's constant: 6.6 x 10^-34 joule-seconds (a joule is a measure of energy). This is the smallest scale at which we can apply energy for computation. We obtain the theoretical limit of an object to perform computation by dividing the total energy (the average energy of each atom or particle times the number of such particles) by Planck's constant.

Lloyd shows how the potential computing capacity of a kilogram of matter equals pi times energy divided by Planck's constant. Since the energy is such a large number and Planck's constant is so small, this equation generates an extremely large number: about 5 x 10^50 operations per second.59

If we relate that figure to the most conservative estimate of human brain capacity (10^19 cps and 10^10 humans), it represents the equivalent of about five billion trillion human civilizations.60 If we use the figure of 10^16 cps that I believe will be sufficient for functional emulation of human intelligence, the ultimate laptop would function at the equivalent brain power of five trillion trillion human civilizations.61 Such a laptop could perform the equivalent of all human thought over the last ten thousand years (that is, ten billion human brains operating for ten thousand years) in one ten-thousandth of a nanosecond.62

Again, a few caveats are in order. Converting all of the mass of our 2.2-pound laptop into energy is essentially what happens in a thermonuclear explosion. Of course, we don't want the laptop to explode but to stay within its one-liter dimension. So this will require some careful packaging, to say the least. By analyzing the maximum entropy (degrees of freedom represented by the state of all the particles) in such a device, Lloyd shows that such a computer would have a theoretical memory capacity of 10^31 bits.

It's difficult to imagine technologies that would go all the way in achieving these limits. But we can readily envision technologies that come reasonably close to doing so. As the University of Oklahoma project shows, we have already demonstrated the ability to store at least fifty bits of information per atom (although only on a small number of atoms, so far). Storing 10^27 bits of memory in the 10^25 atoms in a kilogram of matter should therefore be eventually achievable.
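Lloyd's headline number can be reproduced from the formula stated above, pi times energy divided by Planck's constant, with E = mc^2 for one kilogram. The constants below are standard physical values, not figures from the book.

```python
import math

# Lloyd's "ultimate laptop" bound as stated in the text:
# maximum operations per second ~ pi * E / h, with E = m * c^2.
m = 1.0                  # mass, kilograms
c = 2.998e8              # speed of light, m/s
h = 6.626e-34            # Planck's constant, joule-seconds

energy = m * c**2                       # ~9 x 10^16 joules
ops_per_second = math.pi * energy / h   # ~4 x 10^50, i.e. ~5e50 as cited

# Compare with the most conservative human-civilization estimate
# (10^19 cps per brain times 10^10 brains):
human_civilization_cps = 1e19 * 1e10
ratio = ops_per_second / human_civilization_cps   # ~5 billion trillion
```

The ratio of roughly 4 x 10^21 matches the text's "about five billion trillion human civilizations" to within rounding.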

But because many properties of each atom could be exploited to store information (such as the precise position, spin, and quantum state of all of its particles) we can probably do somewhat better than 10^27 bits. Neuroscientist Anders Sandberg estimates the potential storage capacity of a hydrogen atom at about four million bits. These densities have not yet been demonstrated, however, so we'll use the more conservative estimate.63 As discussed above, 10^42 calculations per second could be achieved without producing significant heat. By fully deploying reversible computing techniques, using designs that generate low levels of errors, and allowing for reasonable amounts of energy dissipation, we should end up somewhere between 10^42 and 10^50 calculations per second.

The design terrain between these two limits is complex. Examining the technical issues that arise as we advance from 10^42 to 10^50 is beyond the scope of this chapter. We should keep in mind, however, that the way this will play out is not by starting with the ultimate limit of 10^50 and working backward based on various practical considerations. Rather, technology will continue to ramp up, always using its latest prowess to progress to the next level. So once we get to a civilization with 10^42 cps (for every 2.2 pounds), the scientists and engineers of that day will use their essentially vast nonbiological intelligence to figure out how to get 10^43, then 10^44, and so on. My expectation is that we will get very close to the ultimate limits.

Even at 10^42 cps, a 2.2-pound "ultimate portable computer" would be able to perform the equivalent of all human thought over the last ten thousand years (assumed at ten billion human brains for ten thousand years) in ten microseconds.64 If we examine the "Exponential Growth of Computing" chart (p. 70), we see that this amount of computing is estimated to be available for one thousand dollars by 2080.
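The "ten microseconds" figure follows from the assumptions already given: ten billion brains, ten thousand years, and the 10^16 cps per-brain estimate used earlier. A quick sketch of that arithmetic:

```python
# Checking the "ten microseconds" figure for the 10^42 cps device,
# using the per-brain estimate of 10^16 cps from earlier in the text.
seconds_per_year = 3.15e7
total_human_thought_ops = 1e10 * 1e16 * 1e4 * seconds_per_year
#                        brains  cps   years        -> ~3 x 10^37 ops

time_seconds = total_human_thought_ops / 1e42   # ~3 x 10^-5 s
```

The result, about thirty microseconds, agrees with the text's "ten microseconds" at the order-of-magnitude level these estimates work in.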

A more conservative but compelling design for a massively parallel, reversible computer is Eric Drexler's patented nanocomputer design, which is entirely mechanical.65 Computations are performed by manipulating nanoscale rods, which are effectively spring-loaded. After each calculation, the rods containing intermediate values return to their original positions, thereby implementing the reverse computation. The device has a trillion (10^12) processors and provides an overall rate of 10^21 cps, enough to simulate one hundred thousand human brains in a cubic centimeter.

Setting a Date for the Singularity. A more modest but still profound threshold will be achieved much earlier. In the early 2030s one thousand dollars' worth of computation will buy about 10^17 cps (probably around 10^20 cps using ASICs and harvesting distributed computation via the Internet). Today we spend more than $10^11 ($100 billion) on computation in a year, which will conservatively rise to $10^12 ($1 trillion) by 2030. So we will be producing about 10^26 to 10^29 cps of nonbiological computation per year in the early 2030s. This is roughly equal to our estimate for the capacity of all living biological human intelligence.
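The early-2030s estimate is a straightforward product of the figures just stated, as this sketch shows:

```python
# Reproducing the early-2030s arithmetic in the text.
cps_per_1000_dollars = 1e17          # raw figure for the early 2030s
annual_spend_dollars = 1e12          # ~$1 trillion per year by 2030
units_of_1000 = annual_spend_dollars / 1000

nonbiological_cps_per_year = units_of_1000 * cps_per_1000_dollars  # 10^26

# With ASICs and harvested distributed computation (~10^20 cps/$1,000):
boosted_cps_per_year = units_of_1000 * 1e20                        # 10^29

# The text's estimate for all biological human intelligence:
biological_human_cps = 1e16 * 1e10   # 10^16 cps/brain * 10^10 brains
```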

Even if just equal in capacity to our own brains, this nonbiological portion of our intelligence will be more powerful because it will combine the pattern-recognition powers of human intelligence with the memory- and skill-sharing ability and memory accuracy of machines. The nonbiological portion will always operate at peak capacity, which is far from the case for biological humanity today; the 10^26 cps represented by biological human civilization today is poorly utilized.

This state of computation in the early 2030s will not represent the Singularity, however, because it does not yet correspond to a profound expansion of our intelligence. By the mid-2040s, that one thousand dollars' worth of computation will be equal to 10^26 cps, so the intelligence created per year (at a total cost of about $10^12) will be about one billion times more powerful than all human intelligence today.66
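The same arithmetic yields the "one billion times" claim for the mid-2040s:

```python
# The mid-2040s arithmetic behind the "one billion times" claim.
cps_per_1000_dollars = 1e26          # mid-2040s figure from the text
annual_spend_dollars = 1e12
intelligence_created_per_year = (annual_spend_dollars / 1000) \
    * cps_per_1000_dollars           # 10^35 cps per year

all_human_intelligence_cps = 1e26    # the text's estimate for today
ratio = intelligence_created_per_year / all_human_intelligence_cps  # 10^9
```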

That will indeed represent a profound change, and it is for that reason that I set the date for the Singularity (representing a profound and disruptive transformation in human capability) as 2045.

Despite the clear predominance of nonbiological intelligence by the mid-2040s, ours will still be a human civilization. We will transcend biology, but not our humanity. I'll return to this issue in chapter 7.

Returning to the limits of computation according to physics, the estimates above were expressed in terms of laptop-size computers because that is a familiar form factor today. By the second decade of this century, however, most computing will not be organized in such rectangular devices but will be highly distributed throughout the environment. Computing will be everywhere: in the walls, in our furniture, in our clothing, and in our bodies and brains.

And, of course, human civilization will not be limited to computing with just a few pounds of matter. In chapter 6, we'll examine the computational potential of an Earth-size planet and computers on the scale of solar systems, of galaxies, and of the entire known universe. As we will see, the amount of time required for our human civilization to achieve scales of computation and intelligence that go beyond our planet and into the universe may be a lot shorter than you might think.

I set the date for the Singularity-representing a profound and disruptive transformation in human capability-as 2045. The nonbiological intelligence created in that year will be one billion times more powerful than all human intelligence today.

Memory and Computational Efficiency: A Rock Versus a Human Brain. With the limits of matter and energy to perform computation in mind, two useful metrics are the memory efficiency and computational efficiency of an object. These are defined as the fractions of memory and computation taking place in an object that are actually useful. Also, we need to consider the equivalence principle: even if computation is useful, if a simpler method produces equivalent results, then we should evaluate the computation against the simpler algorithm. In other words, if two methods achieve the same result but one uses more computation than the other, the more computationally intensive method will be considered to use only the amount of computation of the less intensive method.67 The purpose of these comparisons is to assess just how far biological evolution has been able to go from systems with essentially no intelligence (that is, an ordinary rock, which performs no useful computation) to the ultimate ability of matter to perform purposeful computation. Biological evolution took us part of the way, and technological evolution (which, as I pointed out earlier, represents a continuation of biological evolution) will take us very close to those limits.

Recall that a 2.2-pound rock has on the order of 10^27 bits of information encoded in the state of its atoms and about 10^42 cps represented by the activity of its particles. Since we are talking about an ordinary stone, assuming that its surface could store about one thousand bits is a perhaps arbitrary but generous estimate.68 This represents 10^-24 of its theoretical capacity, or a memory efficiency of 10^-24.69 We can also use a stone to do computation. For example, by dropping the stone from a particular height, we can compute the amount of time it takes to drop an object from that height. Of course, this represents very little computation: perhaps 1 cps, meaning its computational efficiency is 10^-42.70 In comparison, what can we say about the efficiency of the human brain? Earlier in this chapter we discussed how each of the approximately 10^14 interneuronal connections can store an estimated 10^4 bits in the connection's neurotransmitter concentrations and synaptic and dendritic nonlinearities (specific shapes), for a total of 10^18 bits.

The human brain weighs about the same as our stone (actually closer to three pounds than 2.2, but since we're dealing with orders of magnitude, the measurements are close enough). It runs warmer than a cold stone, but we can still use the same estimate of about 10^27 bits of theoretical memory capacity (estimating that we can store one bit in each atom). This results in a memory efficiency of 10^-9. However, by the equivalence principle, we should not use the brain's inefficient coding methods to rate its memory efficiency. Using our functional memory estimate above of 10^13 bits, we get a memory efficiency of 10^-14. That's about halfway between the stone and the ultimate cold laptop on a logarithmic scale. However, even though technology progresses exponentially, our experiences are in a linear world, and on a linear scale the human brain is far closer to the stone than to the ultimate cold computer.

So what is the brain's computational efficiency? Again, we need to consider the equivalence principle and use the estimate of 10^16 cps required to emulate the brain's functionality, rather than the higher estimate (10^19 cps) required to emulate all of the nonlinearities in every neuron. With the theoretical capacity of the brain's atoms estimated at 10^42 cps, this gives us a computational efficiency of 10^-26. Again, that's closer to a rock than to the laptop, even on a logarithmic scale.
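These efficiency comparisons reduce to a few ratios. The sketch below computes them; all figures are the chapter's order-of-magnitude estimates rather than measurements:

```python
import math

# Memory and computational efficiency: the fraction of an object's
# theoretical capacity that performs useful work (figures from the text).
THEORETICAL_BITS = 1e27  # ~1 bit per atom in a ~1 kg object
THEORETICAL_CPS = 1e42   # activity of all its particles

rock_memory_eff = 1e3 / THEORETICAL_BITS    # ~1,000 surface bits  -> ~1e-24
rock_compute_eff = 1.0 / THEORETICAL_CPS    # ~1 cps when dropped  -> ~1e-42

brain_memory_eff = 1e13 / THEORETICAL_BITS  # functional memory    -> ~1e-14
brain_compute_eff = 1e16 / THEORETICAL_CPS  # functional emulation -> ~1e-26

# On a log scale, the brain's memory efficiency sits roughly midway
# between the rock (~1e-24) and an ideal cold computer (~1e0).
print(math.log10(brain_memory_eff))  # -14.0
```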

Our brains have evolved significantly in their memory and computational efficiency from pre-biology objects such as stones. But we clearly have many orders of magnitude of improvement to take advantage of during the first half of this century.

Going Beyond the Ultimate: Pico- and Femtotechnology and Bending the Speed of Light. The limits of around 10^42 cps for a one-kilogram, one-liter cold computer and around 10^50 for a (very) hot one are based on computing with atoms. But limits are not always what they seem. New scientific understanding has a way of pushing apparent limits aside. As one of many such examples, early in the history of aviation, a consensus analysis of the limits of jet propulsion apparently demonstrated that jet aircraft were infeasible.71 The limits I discussed above represent the limits of nanotechnology based on our current understanding. But what about picotechnology, measured in trillionths (10^-12) of a meter, and femtotechnology, scales of 10^-15 of a meter? At these scales, we would require computing with subatomic particles. With such smaller size comes the potential for even greater speed and density.

We do have at least several very early-adopter picoscale technologies. German scientists have created an atomic-force microscope (AFM) that can resolve features of an atom that are only seventy-seven picometers across.72 An even higher-resolution technology has been created by scientists at the University of California at Santa Barbara, who have developed an extremely sensitive measurement detector with a physical beam made of gallium-arsenide crystal and a sensing system that can measure a flexing of the beam of as little as one picometer. The device is intended to provide a test of Heisenberg's uncertainty principle.73 In the time dimension, Cornell University scientists have demonstrated an imaging technology based on X-ray scattering that can record movies of the movement of a single electron. Each frame represents only four attoseconds (an attosecond is 10^-18 seconds, a billionth of a billionth of a second).74 The device can achieve spatial resolution of one angstrom (10^-10 meter, which is 100 picometers).

However, our understanding of matter at these scales, particularly in the femtometer range, is not sufficiently well developed to propose computing paradigms. An Engines of Creation (Eric Drexler's seminal 1986 book that provided the foundations for nanotechnology) for pico- or femtotechnology has not yet been written. However, each of the competing theories for the behavior of matter and energy at these scales is based on mathematical models that are based on computable transformations. Many of the transformations in physics do provide the basis for universal computation (that is, transformations from which we can build general-purpose computers), and it may be that behavior in the pico- and femtometer range will do so as well.

Of course, even if the basic mechanisms of matter in these ranges provide for universal computation in theory, we would still have to devise the requisite engineering to create massive numbers of computing elements and learn how to control them. These are similar to the challenges on which we are now rapidly making progress in the field of nanotechnology. At this time, we have to regard the feasibility of pico- and femtocomputing as speculative. But nanocomputing will provide massive levels of intelligence, so if it's at all possible to do, our future intelligence will be likely to figure out the necessary processes. The mental experiment we should be making is not whether humans as we know them today will be capable of engineering pico- and femtocomputing technologies, but whether the vast intelligence of future nanotechnology-based intelligence (which will be trillions of trillions of times more capable than contemporary biological human intelligence) will be capable of rendering these designs. Although I believe it is likely that our future nanotechnology-based intelligence will be able to engineer computation at scales finer than nanotechnology, the projections in this book concerning the Singularity do not rely on this speculation.

In addition to making computing smaller, we can make it bigger-that is, we can replicate these very small devices on a massive scale. With full-scale nanotechnology, computing resources can be made self-replicating and thus can rapidly convert mass and energy into an intelligent form. However, we run up against the speed of light, because the matter in the universe is spread out over vast distances.

As we will discuss later, there are at least suggestions that the speed of light may not be immutable. Physicists Steve Lamoreaux and Justin Torgerson of the Los Alamos National Laboratory have analyzed data from an old natural nuclear reactor that two billion years ago produced a fission reaction lasting several hundred thousand years in what is now West Africa.75 Examining radioactive isotopes left over from the reactor and comparing them to isotopes from similar nuclear reactions today, they determined that the physics constant alpha (also called the fine-structure constant), which determines the strength of the electromagnetic force, apparently has changed over two billion years. This is of great significance to the world of physics, because the speed of light is inversely proportional to alpha, and both have been considered unchangeable constants. Alpha appears to have decreased by 4.5 parts out of 10^8. If confirmed, this would imply that the speed of light has increased.

Of course, these exploratory results will need to be carefully verified. If true, they may hold great importance for the future of our civilization. If the speed of light has increased, it has presumably done so not just as a result of the passage of time but because certain conditions have changed. If the speed of light has changed due to changing circumstances, that cracks open the door just enough for the vast powers of our future intelligence and technology to swing the door widely open. This is the type of scientific insight that technologists can exploit. Human engineering often takes a natural, frequently subtle, effect and controls it with a view toward greatly leveraging and magnifying it.

Even if we find it difficult to significantly increase the speed of light over the long distances of space, doing so within the small confines of a computing device would also have important consequences for extending the potential for computation. The speed of light is one of the limits that constrain computing devices even today, so the ability to boost it would extend further the limits of computation. We will explore several other intriguing approaches to possibly increasing, or circumventing, the speed of light in chapter 6. Expanding the speed of light is, of course, speculative today, and none of the analyses underlying our expectation of the Singularity rely on this possibility.

Going Back in Time. Another intriguing-and highly speculative-possibility is to send a computational process back in time through a "wormhole" in space-time. Theoretical physicist Todd Brun of the Institute for Advanced Study at Princeton has analyzed the possibility of computing using what he calls a "closed timelike curve" (CTC). According to Brun, CTCs could "send information (such as the result of calculations) into their own past light cones."76 Brun does not provide a design for such a device but establishes that such a system is consistent with the laws of physics. His time-traveling computer also does not create the "grandfather paradox," often cited in discussions of time travel. This well-known paradox points out that if person A goes back in time, he could kill his grandfather, causing A not to exist, resulting in his grandfather not being killed by him, so A would exist and thus could go back and kill his grandfather, and so on, ad infinitum.

Brun's time-stretching computational process does not appear to introduce this problem because it does not affect the past. It produces a determinate and unambiguous answer in the present to a posed question. The question must have a clear answer, and the answer is not presented until after the question is asked, although the process to determine the answer can take place before the question is asked using the CTC. Conversely, the process could take place after the question is asked and then use a CTC to bring the answer back into the present (but not before the question was asked, because that would introduce the grandfather paradox). There may very well be fundamental barriers (or limitations) to such a process that we don't yet understand, but those barriers have yet to be identified. If feasible, it would greatly expand the potential of local computation. Again, all of my estimates of computational capacities and of the capabilities of the Singularity do not rely on Brun's tentative conjecture.

ERIC DREXLER: I don't know, Ray. I'm pessimistic on the prospects for picotechnology. With the stable particles we know of, I don't see how there can be picoscale structure without the enormous pressures found in a collapsed star-a white dwarf or a neutron star-and then you would get a solid chunk of stuff like a metal, but a million times denser. This doesn't seem very useful, even if it were possible to make it in our solar system. If physics included a stable particle like an electron but a hundred times more massive, it would be a different story, but we don't know of one.

RAY: We manipulate subatomic particles today with accelerators that fall significantly short of the conditions in a neutron star. Moreover, we manipulate subatomic particles such as electrons today with tabletop devices. Scientists recently captured and stopped a photon dead in its tracks.

ERIC: Yes, but what kind of manipulation? If we count manipulating small particles, then all technology is already picotechnology, because all matter is made of subatomic particles. Smashing particles together in accelerators produces debris, not machines or circuits.

RAY: I didn't say we've solved the conceptual problems of picotechnology. I've got you penciled in to do that in 2072.

ERIC: Oh, good, then I see you have me living a long time.

RAY: Yes, well, if you stay on the sharp leading edge of health and medical insights and technology, as I'm trying to do, I see you being in rather good shape around then.

MOLLY 2104: Yes, quite a few of you baby boomers did make it through. But most were unmindful of the opportunities in 2004 to extend human longevity long enough to take advantage of the biotechnology revolution, which hit its stride a decade later, followed by nanotechnology a decade after that.

MOLLY 2004: So, Molly 2104, you must be quite something, considering that one thousand dollars of computation in 2080 can perform the equivalent of ten billion human brains thinking for ten thousand years in a matter of ten microseconds. That presumably will have progressed even further by 2104, and I assume you have access to more than one thousand dollars' worth of computation.

MOLLY 2104: Actually, millions of dollars on average-billions when I need it.

MOLLY 2004: That's pretty hard to imagine.

MOLLY 2104: Yeah, well, I guess I'm kind of smart when I need to be.

MOLLY 2004: You don't sound that bright, actually.

MOLLY 2104: I'm trying to relate on your level.

MOLLY 2004: Now, wait a second, Miss Molly of the future....

GEORGE 2048: Ladies, please, you're both very engaging.

MOLLY 2004: Yes, well, tell that to my counterpart here-she feels she's a jillion times more capable than I am.

GEORGE 2048: She is your future, you know. Anyway, I've always felt there was something special about a biological woman.

MOLLY 2104: Yeah, what would you know about biological women anyway?

GEORGE 2048: I've read a great deal about it and engaged in some very precise simulations.

MOLLY 2004: It occurs to me that maybe you're both missing something that you're not aware of.

GEORGE 2048: I don't see how that's possible.

MOLLY 2104: Definitely not.

MOLLY 2004: I didn't think you would. But there is one thing I understand you can do that I do find cool.

MOLLY 2104: Just one?

MOLLY 2004: One that I'm thinking of, anyway. You can merge your thinking with someone else and still keep your separate identity at the same time.

MOLLY 2104: If the situation-and the person-is right, then, yes, it's a very sublime thing to do.

MOLLY 2004: Like falling in love?

MOLLY 2104: Like being in love. It's the ultimate way to share.

GEORGE 2048: I think you'll go for it, Molly 2004.

MOLLY 2104: You ought to know, George, since you were the first person I did it with.