Thermodynamics of computation

Thermodynamics of computation was a hot topic in the 60s and the 70s through the work of Rolf Landauer and Charlie Bennett at IBM, along with other people at Los Alamos. They were interested in the fundamental limitations on the thermodynamic characteristics of information processing. The work that was done resulted in a lot of controversy, partly because it was all necessarily semi-formal. The reason it was semi-formal was that at the time they only had the tools of equilibrium statistical physics and near-equilibrium fluctuation-dissipation theory. If nothing else, computational systems are far as hell from equilibrium.

We now have a complete theory of non-equilibrium statistical physics.

  • ~5% of energy use in developed countries goes to heating digital circuits
  • Cells compute ~10^5 times more efficiently than CMOS computers
  • In fact, the biosphere as a whole is more efficient than supercomputers
  • Deep connections among information theory, physics and (neuro)biology

Cells compute. In each cell in you right now, most of the energy burn is going to translation: that is, to the ribosomes, where your DNA is basically turned into protein. The ribosomes operate within two orders of magnitude of the thermodynamic optimum, which makes them about 10^5 times better than all of these HPC systems we’re talking about here today.
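
As a rough sanity check on those numbers, here is a back-of-envelope sketch in Python. The Landauer bound and the ATP cost of adding one amino acid are standard textbook figures; the energy per device-level bit operation in digital hardware is an assumed round number rather than a figure from the talk, and the exact multiple depends heavily on how you count “an operation”.

```python
# Back-of-envelope: energy per "elementary operation" for a ribosome vs. the
# Landauer bound vs. digital hardware.  Illustrative numbers only.

import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # roughly room / body temperature, K
N_A = 6.022e23            # Avogadro's number, 1/mol

# Landauer bound: minimum heat to erase one bit.
landauer_J = k_B * T * math.log(2)          # ~2.9e-21 J

# Ribosome: adding one amino acid costs roughly 4 ATP/GTP equivalents
# (tRNA charging + elongation), at roughly 50 kJ/mol free energy each in vivo.
atp_per_residue = 4
atp_J = 50e3 / N_A                          # ~8e-20 J per ATP (approximate)
ribosome_J = atp_per_residue * atp_J        # ~3e-19 J per amino acid added

# Digital hardware: an ASSUMED round figure of ~1e-14 J per device-level bit
# operation (gate switching + interconnect + overheads); the real value
# depends on the technology and on how "an operation" is counted.
cmos_J = 1e-14

print(f"Landauer bound    : {landauer_J:.2e} J/bit")
print(f"Ribosome          : {ribosome_J:.2e} J/residue "
      f"(~{ribosome_J / landauer_J:.0f}x Landauer)")
print(f"Digital (assumed) : {cmos_J:.2e} J/op "
      f"(~{cmos_J / ribosome_J:.0e}x the ribosome's cost)")
```

With these inputs the ribosome lands roughly two orders of magnitude above the Landauer bound, and the digital hardware sits another four to five orders of magnitude above the ribosome, which is the shape of the claim being made here.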

Whether this actually means that Nature was under huge pressure to compute efficiently, and whether that should be incorporated into our models of population biology, is a completely open question.

You can even look at the biosphere as a whole. The energy coming into the biosphere comes from the Sun; that’s it. We can calculate the free energy flux, do a really conservative, horribly rough estimate of the amount of computation being done in the entire biosphere, and the entire biosphere beats our computers.

There’s a lot to be understood from systems that are very, very different from the ones we might normally be thinking about.
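
To make the biosphere comparison concrete, here is a minimal sketch of the kind of horribly rough estimate being described, counting one amino-acid polymerization as one operation. The global net primary production and total solar input are standard figures; the fraction of fixed carbon ending up in protein and the supercomputer efficiency are assumed round numbers, not values from the talk.

```python
# Divide a (very rough) estimate of the biosphere's computation rate by the
# free-energy flux it receives from the Sun, and compare with a
# supercomputer's operations per joule.

SOLAR_INPUT_W = 1.7e17          # total sunlight intercepted by Earth, W
NPP_GC_PER_YR = 1.0e17          # global net primary production, ~100 Gt C/yr, in grams
SECONDS_PER_YR = 3.15e7
N_A = 6.022e23

frac_to_protein = 0.3           # ASSUMED: fraction of fixed carbon ending up in protein
carbons_per_residue = 5         # an average amino acid has roughly 5 carbon atoms
gram_C_per_residue = carbons_per_residue * 12.0 / N_A

# One amino-acid polymerization = one "operation" (translation only, which
# understates the biosphere's total computation).
biosphere_ops_per_s = (NPP_GC_PER_YR / SECONDS_PER_YR) * frac_to_protein / gram_C_per_residue

biosphere_ops_per_J = biosphere_ops_per_s / SOLAR_INPUT_W
hpc_ops_per_J = 5e10            # ASSUMED: ~50 GFLOPS/W for a good modern supercomputer

print(f"Biosphere: ~{biosphere_ops_per_J:.1e} ops per joule of sunlight")
print(f"HPC:       ~{hpc_ops_per_J:.1e} FLOPs per joule")
print(f"Ratio:     ~{biosphere_ops_per_J / hpc_ops_per_J:.0f}x in the biosphere's favor")
```

Even with translation as the only computation counted, and with all incident sunlight in the denominator rather than just the photosynthetically captured fraction, the biosphere comes out a few orders of magnitude ahead under these assumptions.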


Question: You talked about how nature has optimized this for the cell, and I’m assuming that through natural selection they’ve had a few million years to optimize it to get to such a thing. Does that limit the way that a cell could evolve?

That is a huge question. It’s actually billions of years, not millions of years. It could be that any chemical reaction network you put together that performs the function of translation is massively thermodynamically efficient; we don’t actually know. If I randomly sample over the space of chemical reaction networks that implement what a ribosome does, does that give me a prior which is so biased towards high efficiency that it swamps the likelihood? In that case it would almost not matter what nature did; it would always have come up with something efficient. Or is it the case that natural selection really was working like a banshee to get that thermodynamic efficiency as good as possible? We very strongly suspect the latter, and I’m working with some people who do a lot of work with real, live chemical reaction networks to try to get a handle on that “prior”, but we don’t formally know.

But here’s the follow-up to your question. Consider just the thermodynamic cost to the cell of the heat it’s producing: if the cell is not efficient, it has to go off and eat that many more ice-cream sundaes to do its business, because that heat is all coming from the ice-cream sundaes the cell is eating. What that suggests is that this should have a huge effect on fitness, essentially as you’re saying, and that therefore all of population biology, the neo-Darwinian synthesis, should have an extra term put in, depending on what computation the cell is doing.

To give another example that has to do with biology, look at the brain: that’s 20% of my calorie burn, and 20% of yours. That’s a massive fitness cost. Is that the reason why human-level intelligence is so rare in the evolutionary record? Some people are trying to investigate neurobiological circuitry by thinking about it in terms of the need to be thermodynamically efficient, because the brain is obviously vastly more efficient than any of our artificial computers right now. It’s got implications all over biology.


In this video from the HPC User Forum, David Wolpert from the Santa Fe Institute presents: Thermodynamics of Computation: Far More Than Counting Bit Erasure.

“The thermodynamic restrictions on all systems that perform computation provide major challenges to modern design of computers. For example, at present ~5% of US energy consumption is used to run computers. Similarly, ~50% of the lifetime budget of a modern high-performance computing center is to pay the energy bill. As another example, Google has placed some of their data servers next to rivers in order to use the water in the river to remove the heat they generate. As a final example, one of the major challenges facing current efforts to build exascale computers is how to keep them from melting.

The thermodynamic costs of performing computation also play a major role in biological computation. For example, ~ 20% of the calories burned by a human are used by their brain — which does nothing but compute. This is a massive reproductive fitness penalty. Indeed, one of the leading contenders for what development in hominin evolution allowed us to develop into Homo sapiens is the discovery of fire, since that allowed us to cook meat and tubers, and thereby for the first time release enough calories from our food sources to power our brains. Despite this huge fitness cost though, no current evolution theory accounts for the thermodynamic consequences of computation. It also seems that minimizing the thermodynamic costs of computation has been a major factor in evolution’s selection of the chemical networks in all biological organisms, not just hominin brains. In particular, simply dividing the rate of computation in the entire biosphere by the rate of free energy incident on the earth in sunlight shows that the biosphere as a whole performs computation with a thermodynamic efficiency orders of magnitude greater than our current supercomputers.

In the 1960s and 1970s, Rolf Landauer, Charlie Bennett and collaborators performed the first, pioneering analysis of the fundamental thermodynamic costs involved in bit erasure, perhaps the simplest example of a computation. Unfortunately, their physics was semi-formal, initially with no equations at all. This is because when they did their work, nonequilibrium statistical physics was in its infancy, and so they simply did not have the tools for a formal analysis of the thermodynamics of computation.

Moreover, only a trivially small portion of computer science theory (CS) is concerned with the number of erasure operations needed to perform a given computation. At its core, much of CS is concerned with unavoidable resource/time tradeoffs in running a computation. That is the basis of all computational complexity theory, many approaches to characterizing the algorithmic power of different kinds of computers, etc.

Fortunately, the last two decades have seen a revolution in nonequilibrium statistical physics. This has resulted in some extremely powerful tools for analyzing the fundamental thermodynamic properties of far-from-equilibrium dynamic systems – like computers. These tools have already clarified that there are unavoidable thermodynamic tradeoffs in computation, in addition to the resource/time tradeoffs of conventional CS theory. These thermodynamic tradeoffs relate the (physical) speed of a computation, the noise level, and whether the computational system is thermodynamically “tailored” for one specific environment or is general purpose, to name just a few. Interestingly, some of these tradeoffs are also addressed in modern computer engineering, for example in the techniques of approximate computing, adaptive slowing, etc. However, this work is being done in an ad hoc manner, driven entirely by phenomenology.

As a result, the time is ripe to pursue a new field of science and engineering: a modern thermodynamics of computation. This would combine the resource/time tradeoffs of concern in conventional CS with the thermodynamic tradeoffs in computation that are now being revealed. In this way we should be able to develop the tools necessary both for analyzing thermodynamic costs in biological systems and for engineering next-generation computers.”
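
To illustrate the speed side of those thermodynamic tradeoffs, here is a small schematic sketch, not taken from the talk: for overdamped models of bit erasure driven by an optimal protocol, the average work exceeds the Landauer bound by a finite-time penalty that grows roughly as 1/τ, so erasing faster necessarily dissipates more. The coefficient A below is an arbitrary illustrative value.

```python
# Schematic speed/dissipation tradeoff for finite-time bit erasure: to leading
# order for overdamped dynamics with an optimal protocol, the average work is
# W(tau) ~ kT*ln(2) + A/tau, where tau is the protocol duration.

import math

k_B = 1.380649e-23
T = 300.0
landauer = k_B * T * math.log(2)     # quasi-static (tau -> infinity) limit

A = 1e-21                            # ASSUMED dissipation coefficient, J*s

for tau in (1e-3, 1e-2, 1e-1, 1.0):  # protocol durations in seconds
    work = landauer + A / tau        # leading-order finite-time cost
    print(f"tau = {tau:7.3f} s  ->  W ~ {work:.2e} J  "
          f"({work / landauer:.1f}x the Landauer bound)")
```

The point of the sketch is simply that the Landauer bound is attainable only in the infinitely slow limit; any finite-speed computation pays an extra, protocol-dependent thermodynamic cost.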


References

1. New wiki: Thermodynamics of Computation. Please visit and start to add material!
2. New book: “The Energetics of Computing in Life and Machines”, D. Wolpert, C. Kempes, P. Stadler, J. Grochow (Eds.), SFI Press, 2019.
3. New invited review: “The stochastic thermodynamics of computation”, D. Wolpert, J. Phys. A (2019).
