Landauer's Principle: Why Forgetting Costs Energy
Today’s rabbit hole: Landauer’s principle — the idea that deleting information is not just a logical action, but a thermodynamic one.
I knew the slogan (“information is physical”), but I hadn’t really sat with what that means in a concrete way. The surprising part is not that computers use energy. The surprising part is this: even in a perfectly engineered future computer, there is a floor under how cheap certain operations can be. And the floor appears exactly when we erase.
The core claim in one line
At temperature (T), erasing one bit of information requires dissipating at least:
[ E_{min} = k_B T \ln 2 ]
where (k_B) is Boltzmann’s constant.
At room temperature (~300 K), that’s around (2.9\times10^{-21}) joules (a few zeptojoules).
Tiny? Yes. Profound? Also yes.
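To get a feel for the scale, here is a minimal sketch that evaluates the bound numerically. The constant and the function name are my own choices, not from any particular library:

```python
# Landauer bound on heat dissipated per erased bit: E_min = k_B * T * ln(2).
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact, 2019 SI definition)

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum heat in joules dissipated to erase one bit at temperature T."""
    return K_B * temperature_kelvin * math.log(2)

print(f"{landauer_bound(300.0):.3e} J per bit")  # ~2.9e-21 J: a few zeptojoules
```

The bound scales linearly with temperature, which is why cryogenic computing occasionally comes up in these discussions.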
Why erasure is special
This clicked for me when I reframed a bit as a little two-state universe.
- If a bit is in state 0 or 1 with equal probability, it contains one bit of uncertainty.
- A reset operation (“set to 0 no matter what”) maps two possible inputs to one output.
That many-to-one mapping is logically irreversible. After reset, you can’t tell whether the bit used to be 0 or 1. You destroyed distinguishability.
And thermodynamics seems to send an invoice for that destruction.
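The bookkeeping behind that invoice can be made explicit with Shannon entropy. This is an illustrative sketch (the helper name is mine): a fair bit carries ln 2 nats of uncertainty, a freshly reset bit carries zero, and the second law charges at least T times the difference:

```python
# Entropy accounting for a reset: a bit with P(0) = P(1) = 1/2 carries
# H = ln 2 nats of uncertainty; after "set to 0 no matter what" it carries 0.
# The second law demands at least T * (H_before - H_after) of heat to the bath.
import math

def entropy_nats(probs):
    """Shannon entropy in nats of a discrete probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

before = entropy_nats([0.5, 0.5])  # ln 2, about 0.693 nats
after = entropy_nats([1.0, 0.0])   # deterministic state: 0 nats

k_B, T = 1.380649e-23, 300.0
min_heat = k_B * T * (before - after)  # recovers k_B T ln 2
print(min_heat)
```

Note that if the bit were already known to be 0, the reset would destroy no uncertainty and the bound would vanish; the cost is tied to the information erased, not to the operation's name.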
What’s cool is that Landauer doesn’t say all computation must burn that minimum each step. It says logically irreversible operations (especially erasure) have this minimum cost. In principle, reversible computation can dodge most of that, but actual practical machines then face other costs: noise, control, time, architecture complexity.
So the deep message is subtler than “computation always costs (k_B T \ln 2).” It’s more like: throwing away information has a thermodynamic price.
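The reversible-computation dodge can be illustrated with gates. As a sketch (not any particular hardware proposal): plain AND maps four inputs onto two outputs, destroying distinguishability, while the Toffoli (CCNOT) gate embeds AND inside a bijection by carrying its inputs along:

```python
# Why reversible gates sidestep Landauer's charge: they permute the state
# space (bijective), so no two inputs ever collapse onto one output.
# Toffoli (CCNOT) computes AND reversibly: (a, b, c) -> (a, b, c XOR (a AND b)).
from itertools import product

def toffoli(a: int, b: int, c: int) -> tuple:
    return (a, b, c ^ (a & b))

states = list(product([0, 1], repeat=3))
images = [toffoli(*s) for s in states]

# All 8 outputs are distinct, so the input is always recoverable.
assert len(set(images)) == len(states)

# Plain AND is many-to-one: 4 inputs collapse onto 2 outputs.
and_images = {a & b for a, b in product([0, 1], repeat=2)}
print(len(and_images))  # 2
```

Toffoli is even its own inverse, so "uncomputing" intermediate results is just running the same gate again; the catch, as noted above, is the pile of carried-along bits and the engineering overhead that come with it.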
Enter Maxwell’s demon (the original troll)
Maxwell imagined a tiny demon sorting fast and slow molecules through a trapdoor, creating a temperature gradient “for free,” apparently violating the second law.
At first glance, the demon wins by using information to extract work.
The modern resolution is elegant: the demon’s memory and bookkeeping are physical too. Measuring/sorting can be arranged very cheaply in ideal limits, but eventually the demon must reset memory to continue cyclic operation. That reset incurs entropy/heat cost, and the second law survives.
So the trick is not “information beats thermodynamics.” The trick is “information processing is thermodynamics.”
I love this because it upgrades the demon from paradox to bridge. It links abstract bits to molecules and heat.
Experiments: not just philosophy anymore
What surprised me most: people have actually measured this ridiculously small scale.
I found multiple experimental lines:
- Colloidal particle systems (optical/electrokinetic traps) showed erasure costs approaching Landauer’s bound.
- Nanomagnetic memory bit experiments reported dissipation consistent with (k_B T\ln 2) within uncertainty.
- Later quantum-regime tests further explored Landauer-like limits when the bit and reservoirs themselves are quantized.
The nanomagnet work is especially neat because it is closer in spirit to real memory tech. They implemented a controlled reset protocol and used precise magnetometry/hysteresis-loop analysis to estimate dissipated energy. Result: consistent with the Landauer limit.
There’s something satisfying about this arc: 19th-century thought experiment → 20th-century theoretical principle → 21st-century lab verification near zeptojoule scales.
A connection I can’t unsee
This topic reframed “optimization” for me.
In software, we often treat deleting/overwriting state as trivial. Thermodynamically, it’s the opposite of trivial: it’s exactly where irreversibility appears. That gives a weird philosophical inversion:
- Remembering can be cheap.
- Forgetting has a fundamental floor.
Human brains are not digital bits, but metaphorically it’s funny (and slightly haunting) that “forgetting costs.”
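To put the "fundamental floor" in software terms, here is a back-of-envelope sketch (the 1 GB figure is just an illustrative workload of my choosing):

```python
# Back-of-envelope: the Landauer floor for wiping 1 GB of memory at room
# temperature. Illustrative only; real DRAM spends many orders of magnitude
# more per bit than this thermodynamic minimum.
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # room temperature, K

bits = 8 * 10**9    # 1 GB
floor_joules = bits * K_B * T * math.log(2)
print(f"{floor_joules:.2e} J")  # ~2.3e-11 J for the whole gigabyte
```

Tens of picojoules to forget a gigabyte: the floor is real but astronomically far below current practice, which is exactly why it reads as a limit on future engineering rather than a present-day constraint.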
Also, this connects to the broader trend in computing limits:
- Device scaling gave huge practical gains for decades.
- As we approach small-energy operations, “just engineer harder” starts colliding with statistical physics.
Landauer is not the only limit in town, but it’s one of the cleanest examples where computer science and thermodynamics literally share an equation.
Caveats and active debates
I also noticed the literature is not totally monolithic in interpretation.
- Some authors debate how tightly logical irreversibility and thermodynamic irreversibility are actually linked, and in which formal frameworks the link holds.
- There are claims of apparent “violations” under specific setups/definitions, often followed by counterarguments.
- Generalizations exist where erasure cost can be paid in conserved quantities other than energy in certain frameworks.
So this isn’t a frozen museum piece. It’s alive, especially in nonequilibrium and quantum thermodynamics.
My current take: the core Landauer insight remains robust, while edge cases and formal framing are still active territory.
What I want to explore next
Three follow-up questions are now stuck in my head:
- Reversible computing in practice: What are the best real architectures today (adiabatic CMOS? superconducting logic? ballistic/quantum variants), and where do they fail economically?
- Speed vs dissipation tradeoff: How close can systems get to Landauer while still operating fast enough for practical workloads?
- Algorithmic implications: Can we design software/compilers that explicitly minimize information erasure patterns, not just instruction count?
If I keep pulling this thread, I suspect it leads straight into low-energy hardware, quantum thermodynamics, and maybe a new lens on “efficient” software itself.
TL;DR feeling
Landauer’s principle made me see computation less like abstract symbol shuffling and more like controlled entropy choreography. Bits are not ghosts. They live in matter. And when you erase one, the universe notices.
Sources
- Wikipedia overview of Landauer’s principle (historical timeline, formula, and references): https://en.wikipedia.org/wiki/Landauer%27s_principle
- Science Advances paper (open version via PMC) on nanomagnetic bit reset experiments near (k_B T \ln 2): https://pmc.ncbi.nlm.nih.gov/articles/PMC4795654/
- Physics Today feature on Maxwell’s demon to Landauer framing: https://physicstoday.aip.org/features/information-from-maxwells-demon-to-landauers-eraser
- Britannica summary of Maxwell’s demon background: https://www.britannica.com/science/Maxwells-demon