The usual extropian retort to this limit has been "reversible computing," but as far as I can tell there has been little work on reversible computing hardware -- far less than quantum computing. It looks like the University of Florida made some hardware in the early 2000s. Can anyone offer insights about why hardware research is so scarce here? Are there reasons to think it's a dead end prior to fabricating anything?
EDIT: searching Google Scholar, it looks like only ~50 papers/books mention reversible computing in the past year, compared to 1000+ for quantum computing.
Just my guess, but the practical motivation is probably not there -- we are so far from the levels of efficiency at which the Landauer principle is noticeable.
Oh it's worth noting that quantum computing research is reversible computing research. Quantum operations are unitary operators which necessarily have inverses. Famous quantum computing results like the "no cloning" theorem are basically just consequences of the reversibility of QC.
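For the curious, here's a minimal numpy sketch of that point, using CNOT as the example gate (the particular gate and input state are just for illustration): a unitary's inverse is its conjugate transpose, so the input can always be recovered.

    import numpy as np

    # CNOT: a 2-qubit unitary; flips the target qubit iff the control qubit is 1.
    # Basis order: |00>, |01>, |10>, |11>.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    # Unitarity: U† U = I, so the inverse always exists and equals U†.
    assert np.allclose(CNOT.conj().T @ CNOT, np.eye(4))

    # Any input state can therefore be recovered after applying the gate:
    state = np.array([0, 0, 1, 0], dtype=complex)    # |10>
    out = CNOT @ state                               # |11>
    recovered = CNOT.conj().T @ out
    assert np.allclose(recovered, state)             # no information was lost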
I suspect any non-trivial reversible computer will be a quantum computer. Quantum gates can perform almost any reversible operation and losing information (by not being reversible) might be physically equivalent to decoherence.
Reversible computation is inherently slow computation, and its main advantage is that it lets you get below this limit, but current computers are still several orders of magnitude away from reaching it. Until then, you'd be better off just improving the efficiency of non-reversible computers.
That wiki page says "Although the Landauer limit was millions of times below the energy consumption of computers in the 2000s and thousands of times less in the 2010s,[3] proponents of reversible computing argue that this can be attributed largely to architectural overheads which effectively magnify the impact of Landauer's limit in practical circuit designs, so that it may prove difficult for practical technology to progress very far beyond current levels of energy efficiency if reversible computing principles are not used.[4]"
On a skim it looks like [4]'s argument is that people have been criticizing the overhead of fully reversible circuits, but only 'conditionally reversible' circuits are needed for the power benefits.
Based on that statement there is also another possible approach: remove the architectural overheads that magnify the impact of Landauer's limit. So he is probably right that there are other, easier improvements we can make before we need to care about Landauer's limit.
Adding: I looked it up, and Feynman quotes 10^8 times as much energy dissipated per step as the von Neumann-Landauer limit when using the transistors of his era (he taught the course in 1984-86).
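To put that factor in joules, a quick back-of-the-envelope in Python, just plugging Feynman's 10^8 into kT ln 2 at an assumed room temperature of 300 K:

    import math

    k_B = 1.380649e-23                       # Boltzmann constant, J/K
    T = 300.0                                # assumed room temperature, K

    landauer = k_B * T * math.log(2)         # minimum energy to erase one bit
    print(f"kT ln 2 at 300 K ~= {landauer:.2e} J")   # ~2.87e-21 J

    # Feynman's mid-80s figure: ~1e8 times the limit per switching step.
    per_step_1985 = 1e8 * landauer
    print(f"~{per_step_1985:.1e} J per step")        # ~3e-13 J, i.e. ~0.3 pJ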
I remember talking to someone who had worked on reversible computing about 13 or 14 years ago, and he said that (i) it does work (presumably that it is indeed more energetically efficient), but (ii) the state of computing was such that it may be a couple of decades before it can make a real impact (*). Then he added, "someone else can have that fun." Around that time he switched to working on other things. (*) Unfortunately I can no longer recall exactly what he meant.
To make a circuit reversible generally requires more gates, because you need somewhere to keep the information you would normally throw away (for an AND gate, say, you keep both inputs and write the result to an extra bit). Practically this would make chips much slower if you assume everything else in fabrication stays the same (e.g., die size). Not a whole lot of benefit, it seems, except for energy consumption, but that's the consumer's problem.
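For concreteness, here's a toy Python sketch of that bookkeeping using the Toffoli (controlled-controlled-NOT) construction, the usual way to make AND reversible:

    def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
        """Reversible AND: keeps both inputs and XORs (a AND b) into c."""
        return a, b, c ^ (a & b)

    # With the ancilla c initialized to 0, the third output is a AND b,
    # and applying the gate again undoes it (the gate is its own inverse).
    for a in (0, 1):
        for b in (0, 1):
            x, y, z = toffoli(a, b, 0)
            assert z == (a & b)
            assert toffoli(x, y, z) == (a, b, 0)   # reversible: inputs recovered

    # An ordinary AND gate maps 4 input states onto 2 output states, so it must
    # discard information; the Toffoli version pays for reversibility with the
    # extra ancilla bit it has to carry around.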
Reversible computing still requires energy (or rather neg-entropy). It's used to perform error correction. Otherwise the computer won't last very long.
I picked up The Information by James Gleick when I was visiting the Living Computers Museum + Labs in Seattle[1]. Really interesting book... talks about the development of language, the birth of computing, the concept of randomness, the physics of information, etc. A fascinating and comprehensive overview of the topic.
Basically, it costs free energy to forget a bit. If you flip a bit, that doesn't mean you forget it, as long as you remember you'll have to flip it back (or reinterpret it).
("Basically" because there are whole books on the subtleties of physics and information. I haven't mastered them.)
This is the correct answer. If you don't overwrite a bit, but instead move the old value to an auxiliary bit known to be zero, that can be theoretically done for free.
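A sketch of that "move the old value into a known-zero bit" trick in the same toy style; the ancilla being known to be 0 is what makes this a reversible copy rather than an erasure:

    def cnot(src: int, dst: int) -> tuple[int, int]:
        """XOR src into dst; if dst is known to be 0, this copies src reversibly."""
        return src, dst ^ src

    old_value, ancilla = 1, 0
    src, copy = cnot(old_value, ancilla)     # copy == old_value, nothing forgotten
    assert (src, copy) == (1, 1)
    assert cnot(src, copy) == (old_value, ancilla)   # self-inverse: undoes the copy

    # Overwriting the original bit without keeping a copy would merge two distinct
    # states into one, and that merging is what Landauer's bound charges for.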
From what I understand, flipping a bit isn't necessarily subject to this bound. However, setting it to a fixed value (0 or 1, regardless of what it held before) is, because that is what erases information.
Of course, even if you can flip it without expending energy, reading it and flipping it only when it's 1 (i.e., resetting it to 0) still requires at least the Landauer limit's worth of energy.
Well, for example, T cannot be less than the temperature of the cosmic microwave background. You can make regions with lower T, but only by pumping heat out in some way, which itself costs energy.
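For a sense of scale, here's what the bound works out to at room temperature versus the CMB temperature (a sketch, assuming 300 K and T_CMB ~ 2.725 K):

    import math

    k_B = 1.380649e-23                       # Boltzmann constant, J/K
    for label, T in [("room temperature", 300.0), ("CMB", 2.725)]:
        print(f"{label}: kT ln 2 = {k_B * T * math.log(2):.2e} J per bit erased")

    # ~2.9e-21 J at 300 K vs ~2.6e-23 J at 2.725 K: only about 110x lower, and
    # getting anything colder than the CMB means spending energy on refrigeration.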
I'm on mobile and can't explore in depth, but that quote sounds like a variant of Maxwell's demon. It is correct that the Landauer limit does not follow from a single physical law that must hold true; rather, it reflects a lack of knowledge about the state of the universe, and acquiring the knowledge needed to do a "free" bit flip requires at least as much energy as the flip itself. TANSTAAFL.
As interesting as it is to think about the limit, it's so pointless from a practical point of view since the cost of computation is dwarfed by the cost of data movement in modern machines.
Yes. I imagine the last probe doing its last bit flip in the heat death of the universe. Then nothing can be computed anymore, and all that will happen is the decay of the probe's matter. So sad.
https://en.wikipedia.org/wiki/Reversible_computing