The behavior of biological computing machinery typically cannot be fully understood within equilibrium thermodynamics; such systems are better described as driven systems far from thermodynamic equilibrium. In recent years, significant progress has been made in this area, most notably Jarzynski’s work relation and Crooks’ fluctuation theorem. I will give some minimal relevant background in this area, and also in information theory and machine learning, before deriving the main results:
1) instantaneous non-predictive information is proportional to the work dissipated due to a change in the driving signal;
2) summed over the length of a driving protocol, non-predictive information provides a lower bound on the total average dissipated work;
3) a refinement of Landauer’s principle can be obtained, in which the lower bound on the heat generated by the erasure of information is augmented by the non-predictive information.
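As a rough sketch of how these statements can be written, using my own shorthand rather than the preprint’s notation: let $s_t$ be the system’s state, $x_t$ the driving signal, $I_{\mathrm{mem}}(t) = I[s_t; x_t]$ the instantaneous memory, and $I_{\mathrm{pred}}(t) = I[s_t; x_{t+1}]$ the predictive information. The three results then take, schematically, the form

```latex
\begin{align}
  \beta \langle W_{\mathrm{diss}}[x_t \to x_{t+1}] \rangle
    &\propto I_{\mathrm{mem}}(t) - I_{\mathrm{pred}}(t)
    \equiv I_{\mathrm{nonpred}}(t), \\
  \beta \langle W_{\mathrm{diss}}^{\mathrm{total}} \rangle
    &\geq \sum_t I_{\mathrm{nonpred}}(t), \\
  \beta \langle Q_{\mathrm{erasure}} \rangle
    &\geq I_{\mathrm{erased}} + I_{\mathrm{nonpred}},
\end{align}
```

where $\beta = 1/k_B T$. These are schematic paraphrases of results 1–3 above; the precise statements and conditions are in the preprint itself.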
These results highlight a profound connection between the effective use of information and efficient thermodynamic operation: any system constructed to retain memory of its environment and to operate at maximal energetic efficiency must be predictive. The widely held belief that any good model must be predictive while remaining of limited complexity may therefore have a very physical underpinning: systems that successfully implement this predictive inference are thermodynamically efficient. The results hold arbitrarily far from thermodynamic equilibrium and apply to a wide range of systems, including bio-molecular machines as well as artificial computing machinery. The paper will appear in PRL and is available as a preprint at http://arxiv.org/abs/1203.3271/
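To make the information-theoretic quantities concrete, here is a minimal toy computation, with made-up numbers that are purely illustrative (not from the paper): a binary driving signal $x_t$ is stored in a noisy system state $s$, and the signal then evolves to $x_{t+1}$. Because $s$ depends on $x_{t+1}$ only through $x_t$, the data-processing inequality guarantees that the non-predictive information $I[s; x_t] - I[s; x_{t+1}]$ is non-negative, which is exactly the quantity the results above tie to dissipated work.

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in nats, computed from a joint distribution matrix pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x), shape (nx, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y), shape (1, ny)
    mask = pxy > 0                        # avoid log(0) terms
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

# Toy model (hypothetical parameters): uniform binary signal, a noisy
# memory channel p(s | x_t), and a persistent signal dynamics p(x_{t+1} | x_t).
p_x = np.array([0.5, 0.5])                 # p(x_t)
p_s_given_x = np.array([[0.8, 0.2],        # rows: x_t, cols: s
                        [0.2, 0.8]])
p_xnext_given_x = np.array([[0.9, 0.1],    # rows: x_t, cols: x_{t+1}
                            [0.1, 0.9]])

# Joint p(x_t, s): the system's instantaneous memory of the signal.
p_xs = p_x[:, None] * p_s_given_x
I_mem = mutual_information(p_xs)

# Joint p(s, x_{t+1}), marginalizing over x_t: the predictive information.
p_sxnext = np.einsum('x,xs,xy->sy', p_x, p_s_given_x, p_xnext_given_x)
I_pred = mutual_information(p_sxnext)

# Non-predictive information: memory that is useless for prediction.
I_nonpred = I_mem - I_pred
print(f"I_mem = {I_mem:.4f} nats, I_pred = {I_pred:.4f} nats, "
      f"I_nonpred = {I_nonpred:.4f} nats")
```

With these numbers the memory is strictly larger than the predictive information, so the non-predictive part is positive: some of what the system remembers about the signal is already obsolete, and by the results above that excess shows up as dissipated work.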
Thursday, August 30, 2012