Abstract
Gács's coarse-grained algorithmic entropy leverages universal computation to quantify the information content of any given physical state. Unlike the Boltzmann and Gibbs-Shannon entropies, it requires no prior commitment to macrovariables or probabilistic ensembles, rendering it applicable to settings arbitrarily far from equilibrium. For measure-preserving dynamical systems equipped with a Markovian coarse graining, we prove a number of fluctuation inequalities. These include algorithmic versions of Jarzynski's equality, Landauer's principle, and the second law of thermodynamics. In general, the algorithmic entropy determines a system's actual capacity to do work from an individual state, whereas the Gibbs-Shannon entropy gives only the mean capacity to do work from a state ensemble that is known a priori.
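For orientation, the classical results named in the abstract take the following standard textbook forms. These are added here for context only; they are not the paper's algorithmic versions, which (per the abstract) hold for individual states rather than ensemble averages.

```latex
% Standard forms of the classical results referenced in the abstract.
% The paper's algorithmic analogues are stated for individual states;
% this block is a contextual sketch, not the paper's theorems.
\begin{align}
  \left\langle e^{-\beta W} \right\rangle &= e^{-\beta \Delta F}
    && \text{(Jarzynski equality)} \\
  \langle Q \rangle &\ge k_B T \ln 2 \ \text{per erased bit}
    && \text{(Landauer's principle)} \\
  S_{\mathrm{GS}}(p) &= -k_B \sum_i p_i \ln p_i
    && \text{(Gibbs-Shannon entropy)}
\end{align}
```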
| Field | Value |
|---|---|
| Original language | English |
| Article number | 014118 |
| Pages (from-to) | 1-24 |
| Number of pages | 24 |
| Journal | Physical Review E |
| Volume | 111 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 8 Jan 2025 |