Universität Stuttgart
Permanent URI for this community: https://elib.uni-stuttgart.de/handle/11682/1
Item (Open Access): Nonequilibrium sensing and its analogy to kinetic proofreading (2015)
Hartich, David; Barato, Andre C.; Seifert, Udo

For a paradigmatic model of chemotaxis, we analyze how a nonzero affinity driving receptors out of equilibrium affects sensitivity. This affinity arises whenever changes in receptor activity involve adenosine triphosphate hydrolysis. The sensitivity integrated over a ligand concentration range is shown to be enhanced by the affinity, providing a measure of how much energy consumption improves sensing. With this integrated sensitivity we can establish an intriguing analogy between sensing with nonequilibrium receptors and kinetic proofreading: the increase in integrated sensitivity is equivalent to the decrease of the error in kinetic proofreading. The influence of the occupancy of the receptor on the phosphorylation and dephosphorylation reaction rates is shown to be crucial for the relation between integrated sensitivity and affinity. This influence can even lead to a regime where a nonzero affinity decreases the integrated sensitivity, which corresponds to anti-proofreading.

Item (Open Access): Stochastic thermodynamics of information processing: bipartite systems with feedback, signal inference and information storage (2017)
Hartich, David; Seifert, Udo (Prof. Dr.)

Stochastic thermodynamics is a theoretical framework that extends the laws of classical thermodynamics to small systems at the molecular and cellular scale. In particular, information processing at these scales is continuously corrupted by thermal fluctuations. Examples include translating information from DNA to proteins, bacteria sensing their environment, and neurons firing action potentials. In all of these examples, energy is consumed to process information or to shield the process against thermal fluctuations. This thesis investigates the relation between information and thermodynamics in physical systems.
We develop a framework for two continuously coupled systems, called the stochastic thermodynamics of bipartite systems. This framework incorporates information and refines the standard second law of thermodynamics. In the first part we consider feedback-driven engines, where one subsystem is controlled by a second subsystem that constitutes the feedback controller. The feedback controller continuously acquires information about the controlled subsystem and uses it to rectify thermal fluctuations, i.e., to "convert information into energy". We compare two information-theoretic quantities that characterize the performance of the feedback controller: the transfer entropy rate and the learning rate. We find that only the latter both (i) bounds the rate of energy extraction from the medium by the controlled subsystem and (ii) is itself bounded by the thermodynamic cost of maintaining the dynamics of the feedback controller. This insight is one of the main results and provides a modern view on the classical thought experiments first proposed by Maxwell. In the second part, we discuss implications for cellular information processing, in which a stochastic, time-dependent signal is measured by a sensory network. In contrast to feedback-driven engines, here a sensor dissipates energy to acquire information about a signal, i.e., it "converts energy into information". We define an efficiency that relates the information a sensor acquires to the energy it dissipates. Models inspired by the chemotaxis sensory system of Escherichia coli are used to illustrate our findings. Moreover, a purely information-theoretic quantity, called the sensory capacity, is introduced. The sensory capacity is bounded by one and given by the ratio of the sensor's learning rate to the transfer entropy rate from the signal to the sensor.
The sensory capacity is maximal if the instantaneous state of the sensor contains as much information about the signal as the sensor's full time history does. We show that the sensory capacity can be increased by adding a dissipative memory, where the increase of the sensory capacity characterizes the performance of the memory. A general tradeoff between the sensory capacity and the efficiency is derived, which demonstrates that a sensor cannot be both a perfect noise filter and energetically efficient. The third part considers binary sensors (e.g., receptors) measuring a stochastic signal (e.g., a ligand concentration). For this setup we study the information loss of inference strategies that are based solely on time averages of the sensor state. We show that simple time-averaging strategies lose up to 0.5 bit of information compared with the full time history of the sensor. This result holds for an arbitrary number of sensors measuring the same signal independently. Furthermore, we show that the same information loss occurs if one approximates a discrete chemical master equation by a continuous Brownian motion. In the last part, we discuss nonequilibrium receptors that are driven out of equilibrium by an ATP hydrolysis reaction. It is shown that the sensitivity of the receptor to concentration changes can be increased by the nonequilibrium reaction, where the increase in sensitivity is related to the chemical energy released in the hydrolysis of one ATP molecule. It turns out that there is an analogy between nonequilibrium receptors and kinetic proofreading, which is a dissipative mechanism to reduce errors in a polymerization process. This part demonstrates that investing chemical energy can improve the capability to process information.
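The thesis abstract above repeatedly relates a sensor's instantaneous state to the signal it tracks. As a minimal, self-contained illustration of that idea (not the actual model used in the thesis; the rates gamma, wc, and ww are arbitrary assumptions chosen for the example), the sketch below couples a two-state telegraph signal x to a two-state sensor y that preferentially aligns with x, computes the stationary distribution of the joint Markov jump process from its rate matrix, and evaluates the stationary mutual information I(x;y) between signal and sensor in bits:

```python
import numpy as np

# Joint states (x, y): x = telegraph signal, y = sensor tracking x.
# Hypothetical rates (illustrative only): gamma = signal flip rate,
# wc = rate for y to align with x, ww = rate for y to misalign.
gamma, wc, ww = 1.0, 10.0, 1.0

states = [(0, 0), (0, 1), (1, 0), (1, 1)]
Q = np.zeros((4, 4))  # generator (rate matrix) of the joint process
for i, (x, y) in enumerate(states):
    for j, (x2, y2) in enumerate(states):
        if i == j:
            continue
        if y2 == y and x2 != x:        # signal flips, independent of y
            Q[i, j] = gamma
        elif x2 == x and y2 != y:      # sensor jumps, biased toward x
            Q[i, j] = wc if y2 == x else ww
    Q[i, i] = -Q[i].sum()              # rows of a generator sum to zero

# Stationary distribution: left null vector of Q.
w, v = np.linalg.eig(Q.T)
p = np.real(v[:, np.argmin(np.abs(w))])
p /= p.sum()

# Mutual information I(x;y) in bits between signal and sensor state.
p = p.reshape(2, 2)                    # p[x, y]
px, py = p.sum(axis=1), p.sum(axis=0)
I = sum(p[x, y] * np.log2(p[x, y] / (px[x] * py[y]))
        for x in range(2) for y in range(2) if p[x, y] > 0)
print(f"I(x;y) = {I:.3f} bits")        # about 0.38 bits for these rates
```

With wc much larger than gamma the sensor follows the signal closely and I(x;y) approaches 1 bit; with wc = ww the sensor carries no information and I(x;y) vanishes. The quantities studied in the thesis (learning rate, transfer entropy rate, sensory capacity) are dynamical refinements of this static picture.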