How cheaply can you run a computer? How much energy does it take to run a computation? The big fans in last week's Theory Thursday suggest: a lot. But in the '60s, when everybody else was trying to figure out how quickly we could compute, Rolf Landauer was at IBM trying to figure out how slowly you could compute. It had been assumed, up until that point, that computation cost energy: if you wanted to take a bunch of numbers and figure something out, you had to put in some energy to get an answer out, and that energy was lost as entropy (i.e. heat). Those fans in last week's Theory Thursday were carrying all of that heat away. From an information theory standpoint, imagine a basic logic circuit, such as an AND gate:

The gate takes in two bits (A and B) and has a single output (O). That means we lose a bit of information in this transformation, which has to be released as entropy (based on last week's energy-information equivalence). But what if you could perform computations using only gates that looked like this:

These gates put out as many bits as they take in, so no information is discarded. Landauer's successors, Edward Fredkin and Tommaso Toffoli in particular, demonstrated that such gates (their three-bit Fredkin gate, for example) suffice for Turing complete computation, and they envisioned a hypothetical "pinball machine" model of computing (formally, the billiard-ball model): bits are represented by spheres, and gate operations are carried out by arrangements of walls and elastic collisions between spheres:

Since the collisions are assumed to be perfectly elastic, the gates dissipate no energy, and the computation is lossless.
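To make the contrast concrete, here's a quick Python sketch (my own illustration, not from the original post; I'm using the three-bit Toffoli gate as the reversible example, which may differ from the gates pictured above). An AND gate maps three different input pairs to the same output, so it can't be run backwards; the Toffoli gate permutes the eight 3-bit states, so it can, and with its third bit fixed to 0 it computes AND without discarding anything:

```python
from itertools import product

def and_gate(a, b):
    return a & b

# Toffoli (controlled-controlled-NOT): flips c only when a and b are both 1.
def toffoli(a, b, c):
    return (a, b, c ^ (a & b))

# AND is lossy: three distinct input pairs collide on the output 0,
# so seeing a 0 tells you nothing about which inputs produced it.
zero_preimages = [(a, b) for a, b in product([0, 1], repeat=2)
                  if and_gate(a, b) == 0]
print(zero_preimages)  # [(0, 0), (0, 1), (1, 0)]

# Toffoli is lossless: it permutes the eight 3-bit states...
states = list(product([0, 1], repeat=3))
assert sorted(toffoli(*s) for s in states) == states

# ...and it is its own inverse: applying it twice gets you back.
assert all(toffoli(*toffoli(*s)) == s for s in states)

# With the third bit initialized to 0, it computes AND reversibly:
# the inputs survive alongside the answer.
assert all(toffoli(a, b, 0) == (a, b, and_gate(a, b))
           for a, b in product([0, 1], repeat=2))
```

Keeping the inputs around alongside the output is the trick: the function becomes a bijection, so nothing has to be erased.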

Even more interesting: Landauer showed that the unavoidable cost lives entirely in the lossy steps. Erasing a bit, as any two-to-one gate must, dissipates at least kT ln 2 of heat, a result now known as Landauer's principle. His IBM colleague Charles Bennett later showed that a computation built from reversible steps can run on input energies approaching zero, as long as you're willing to wait long enough for the answer. This is what's called "reversible computing," since it depends on reversible processes in the thermodynamic sense. Without getting too deep into the thermodynamics, it's somewhat analogous to running a marathon: the faster you want to get there, the more total energy you have to expend, because humans aren't "reversible". It takes more energy to run from point A to point B than to walk there. Bennett sketched theoretical molecular computers driven by reactions that could be biased away from equilibrium by applying more energy, making them run faster, at higher expense, or, conversely, run at arbitrarily low energy, for arbitrarily longer compute times.
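For a sense of scale, Landauer's bound (at least k·T·ln 2 of heat per erased bit, where k is Boltzmann's constant and T the temperature) is astonishingly small at room temperature. This back-of-the-envelope calculation is mine, not from the original post:

```python
import math

# Landauer's bound: erasing one bit of information dissipates
# at least k_B * T * ln(2) of heat.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

e_per_bit = k_B * T * math.log(2)
print(f"minimum heat per erased bit: {e_per_bit:.2e} J")  # ~2.87e-21 J

# Even a billion gates each erasing a bit every nanosecond would only
# be obligated to dissipate milliwatts; real chips burn vastly more,
# so today's hardware sits far above this floor.
p_floor = e_per_bit * 1e9 * 1e9
print(f"Landauer floor for 1e18 erasures/s: {p_floor:.1e} W")  # ~2.9e-3 W
```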

And that's your Theory Thursday for this week!
