Publication


Featured research published by Dibyendu Mandal.


Proceedings of the National Academy of Sciences of the United States of America | 2012

Work and information processing in a solvable model of Maxwell’s demon

Dibyendu Mandal; Christopher Jarzynski

We describe a minimal model of an autonomous Maxwell demon, a device that delivers work by rectifying thermal fluctuations while simultaneously writing information to a memory register. We solve exactly for the steady-state behavior of our model, and we construct its phase diagram. We find that our device can also act as a “Landauer eraser”, using externally supplied work to remove information from the memory register. By exposing an explicit, transparent mechanism of operation, our model offers a simple paradigm for investigating the thermodynamics of information processing by small systems.
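The Landauer bound underlying both operating regimes can be made concrete in a few lines. The sketch below is illustrative, not the paper's exact model: it computes the maximum work a demon can deliver per bit when it randomizes an initially blank tape, namely k_B T ln 2 per bit of entropy written to the memory register.

```python
import math

def shannon_entropy(p: float) -> float:
    """Binary Shannon entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # bath temperature, K

# The demon reads a blank tape of 0s (p = 0) and writes a randomized
# tape (p = 1/2), gaining one bit of entropy per symbol.
dH = shannon_entropy(0.5) - shannon_entropy(0.0)  # bits per symbol
w_max = k_B * T * math.log(2) * dH                # maximum work per bit, J

print(f"entropy written per bit: {dH:.1f} bit")
print(f"Landauer bound on extracted work: {w_max:.3e} J per bit")
```

Running in the opposite direction, as a Landauer eraser, the same quantity bounds the minimum work that must be supplied per bit erased.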


New Journal of Physics | 2016

Identifying functional thermodynamics in autonomous Maxwellian ratchets

Alexander B. Boyd; Dibyendu Mandal; James P. Crutchfield

We introduce a family of Maxwellian Demons for which correlations among information bearing degrees of freedom can be calculated exactly and in compact analytical form. This allows one to precisely determine Demon functional thermodynamic operating regimes, when previous methods either misclassify or simply fail due to approximations they invoke. This reveals that these Demons are more functional than previous candidates. They too behave either as engines, lifting a mass against gravity by extracting energy from a single heat reservoir, or as Landauer erasers, consuming external work to remove information from a sequence of binary symbols by decreasing their individual uncertainty. Going beyond these, our Demon exhibits a new functionality that erases bits not by simply decreasing individual-symbol uncertainty, but by increasing inter-bit correlations (that is, by adding temporal order) while increasing single-symbol uncertainty. In all cases, but especially in the new erasure regime, exactly accounting for informational correlations leads to tight bounds on Demon performance, expressed as a refined Second Law of Thermodynamics that relies on the Kolmogorov-Sinai entropy for dynamical processes and not on changes purely in system configurational entropy, as previously employed. We rigorously derive the refined Second Law under minimal assumptions and so it applies quite broadly---for Demons with and without memory and input sequences that are correlated or not. We note that general Maxwellian Demons readily violate previously proposed, alternative such bounds, while the current bound still holds.
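The distinction the refined Second Law draws, entropy rate versus single-symbol (configurational) entropy, can be illustrated numerically. In the minimal sketch below (an illustration, not the paper's calculation), a symmetric two-state Markov source with flip probability q looks maximally random symbol by symbol, 1 bit each, yet has a much lower entropy rate h2(q):

```python
import math

def h2(p: float) -> float:
    """Binary Shannon entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

q = 0.1  # flip probability of a symmetric two-state Markov source

# By symmetry the stationary distribution is uniform, so each symbol
# in isolation carries h2(1/2) = 1 bit of configurational entropy.
single_symbol = h2(0.5)

# The entropy rate (Kolmogorov-Sinai entropy of the source) is h2(q):
# given the previous symbol, the only uncertainty is whether it flips.
entropy_rate = h2(q)

print(f"single-symbol entropy: {single_symbol:.3f} bits")
print(f"entropy rate:          {entropy_rate:.3f} bits per symbol")
```

A bound built from configurational entropy alone would miss the roughly half bit per symbol of temporal order that such a source carries.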


Physics Today | 2014

Engineering Maxwell’s demon

Zhiyue Lu; Dibyendu Mandal; Christopher Jarzynski

A simple model illustrates the operating principles of an information engine, a mechanical device that mimics the behavior of Maxwell’s demon by converting heat into information plus work.


Journal of Statistical Mechanics: Theory and Experiment | 2016

Analysis of slow transitions between nonequilibrium steady states

Dibyendu Mandal; Christopher Jarzynski

Transitions between nonequilibrium steady states obey a generalized Clausius inequality, which becomes an equality in the quasistatic limit. For slow but finite transitions, we show that the behavior of the system is described by a response matrix whose elements are given by a far-from-equilibrium Green-Kubo formula, involving the decay of correlations evaluated in the nonequilibrium steady state. This result leads to a fluctuation-dissipation relation between the mean and variance of the nonadiabatic entropy production, Δs_na. Furthermore, our results extend -- to nonequilibrium steady states -- the thermodynamic metric structure introduced by Sivak and Crooks for analyzing minimal-dissipation protocols for transitions between equilibrium states.
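As a toy illustration of a Green-Kubo-type formula (not the paper's far-from-equilibrium version), the sketch below numerically integrates an exponential steady-state autocorrelation function, the form obeyed by an Ornstein-Uhlenbeck process, and recovers the analytic value sigma2 * tau:

```python
import math

# Toy Green-Kubo-type integral: a response coefficient obtained as the
# time integral of a steady-state autocorrelation function.  Here the
# autocorrelation is the exponential C(t) = sigma2 * exp(-t / tau) of an
# Ornstein-Uhlenbeck process, whose time integral is exactly sigma2 * tau.
sigma2, tau = 2.0, 0.5
dt, t_max = 1e-4, 10.0

n = int(t_max / dt)
corr = [sigma2 * math.exp(-i * dt / tau) for i in range(n + 1)]

# Trapezoidal rule for the zero-frequency integral of C(t).
integral = sum((corr[i] + corr[i + 1]) * dt / 2.0 for i in range(n))

print(f"numerical integral: {integral:.6f}")
print(f"analytic value:     {sigma2 * tau:.6f}")
```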


Physical Review E | 2017

Correlation-powered Information Engines and the Thermodynamics of Self-Correction

Alexander B. Boyd; Dibyendu Mandal; James P. Crutchfield

Information engines can use structured environments as a resource to generate work by randomizing ordered inputs and leveraging the increased Shannon entropy to transfer energy from a thermal reservoir to a work reservoir. We give a broadly applicable expression for the work production of an information engine, generally modeled as a memoryful channel that communicates inputs to outputs as it interacts with an evolving environment. The expression establishes that an information engine must have more than one memory state in order to leverage input environment correlations. To emphasize this functioning, we designed an information engine powered solely by temporal correlations and not by statistical biases, as employed by previous engines. Key to this is the engine's ability to synchronize: the engine automatically returns to a desired dynamical phase when thrown into an unwanted, dissipative phase by corruptions in the input, that is, by unanticipated environmental fluctuations. This self-correcting mechanism is robust up to a critical level of corruption, beyond which the system fails to act as an engine. We give explicit analytical expressions for both work and critical corruption level and summarize engine performance via a thermodynamic-function phase diagram over engine control parameters. The results reveal a thermodynamic mechanism based on nonergodicity that underlies error correction as it operates to support resilient engineered and biological systems.


Physical Review Letters | 2017

Entropy Production and Fluctuation Theorems for Active Matter

Dibyendu Mandal; Katherine Klymko; Michael R. DeWeese

Active biological systems reside far from equilibrium, dissipating heat even in their steady state, thus requiring an extension of conventional equilibrium thermodynamics and statistical mechanics. In this Letter, we have extended the emerging framework of stochastic thermodynamics to active matter. In particular, for the active Ornstein-Uhlenbeck model, we have provided consistent definitions of thermodynamic quantities such as work, energy, heat, entropy, and entropy production at the level of single, stochastic trajectories and derived related fluctuation relations. We have developed a generalization of the Clausius inequality, which is valid even in the presence of the non-Hamiltonian dynamics underlying active matter systems. We have illustrated our results with explicit numerical studies.
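A minimal simulation of an active Ornstein-Uhlenbeck particle, a standard model of active matter, is sketched below. Parameter names and values are illustrative, not taken from the paper, and the sampled steady-state variance is compared against the known analytic result only up to sampling error.

```python
import math
import random

random.seed(0)

# Active Ornstein-Uhlenbeck particle (AOUP) in one dimension:
#   x' = -mu * k * x + v                    (overdamped motion in a harmonic trap)
#   v' = -v / tau_a + sqrt(2 D_a) * noise   (self-propulsion is itself OU)
# Illustrative parameters; translational thermal noise is omitted.
mu, k, tau_a, D_a = 1.0, 1.0, 1.0, 1.0
dt, steps = 1e-3, 100_000

x, v = 0.0, 0.0
second_moment = 0.0
for _ in range(steps):  # Euler-Maruyama integration
    x += (-mu * k * x + v) * dt
    v += (-v / tau_a) * dt + math.sqrt(2.0 * D_a * dt) * random.gauss(0.0, 1.0)
    second_moment += x * x

var_x = second_moment / steps
# Steady-state result for these equations:
#   <x^2> = D_a * tau_a**2 / (mu * k * (1 + mu * k * tau_a)) = 0.5 here,
# which var_x should approach up to sampling and discretization error.
print(f"sampled <x^2>: {var_x:.3f}")
```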


Journal of Statistical Physics | 2017

Leveraging Environmental Correlations: The Thermodynamics of Requisite Variety

Alexander B. Boyd; Dibyendu Mandal; James P. Crutchfield

Key to biological success, the requisite variety that confronts an adaptive organism is the set of detectable, accessible, and controllable states in its environment. We analyze its role in the thermodynamic functioning of information ratchets—a form of autonomous Maxwellian Demon capable of exploiting fluctuations in an external information reservoir to harvest useful work from a thermal bath. This establishes a quantitative paradigm for understanding how adaptive agents leverage structured thermal environments for their own thermodynamic benefit. General ratchets behave as memoryful communication channels, interacting with their environment sequentially and storing results to an output. The bulk of thermal ratchets analyzed to date, however, assume memoryless environments that generate input signals without temporal correlations. Employing computational mechanics and a new information-processing Second Law of Thermodynamics (IPSL) we remove these restrictions, analyzing general finite-state ratchets interacting with structured environments that generate correlated input signals. On the one hand, we demonstrate that a ratchet need not have memory to exploit an uncorrelated environment. On the other, and more appropriate to biological adaptation, we show that a ratchet must have memory to most effectively leverage structure and correlation in its environment. The lesson is that to optimally harvest work a ratchet’s memory must reflect the input generator’s memory. Finally, we investigate achieving the IPSL bounds on the amount of work a ratchet can extract from its environment, discovering that finite-state, optimal ratchets are unable to reach these bounds. In contrast, we show that infinite-state ratchets can go well beyond these bounds by utilizing their own infinite “negentropy”. We conclude with an outline of the collective thermodynamics of information-ratchet swarms.


Physical Review E | 2013

Nonequilibrium heat capacity.

Dibyendu Mandal

Development of steady state thermodynamics and statistical mechanics depends crucially on our ability to extend the notions of equilibrium thermodynamics to nonequilibrium steady states (NESS). The present paper considers the extension of heat capacity. A modified definition is proposed which continues to maintain the same relation to steady state Shannon entropy as in equilibrium, thus providing a thermodynamically consistent treatment of NESS heat capacity.
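In equilibrium, an entropy-based definition C = T dS/dT, with S the Gibbs-Shannon entropy of the Boltzmann distribution, reproduces the standard heat capacity. The sketch below (illustrative, in k_B = 1 units, not the paper's NESS calculation) verifies this for a two-level system against the exact Schottky formula:

```python
import math

eps = 1.0  # energy gap of a two-level system (k_B = 1 units)

def entropy(T: float) -> float:
    """Gibbs-Shannon entropy of the Boltzmann distribution, in nats."""
    b = eps / T
    p0 = 1.0 / (1.0 + math.exp(-b))  # ground-state probability
    p1 = 1.0 - p0
    return -(p0 * math.log(p0) + p1 * math.log(p1))

def heat_capacity(T: float, dT: float = 1e-5) -> float:
    """Entropy-based definition C = T * dS/dT, by central difference."""
    return T * (entropy(T + dT) - entropy(T - dT)) / (2.0 * dT)

def schottky(T: float) -> float:
    """Exact two-level (Schottky) heat capacity, for comparison."""
    b = eps / T
    return b * b * math.exp(b) / (1.0 + math.exp(b)) ** 2

T = 0.7
print(f"C from T*dS/dT: {heat_capacity(T):.6f}")
print(f"Schottky exact: {schottky(T):.6f}")
```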


Journal of Chemical Physics | 2013

Directed motion of periodically driven molecular motors: A graph-theoretical approach

Alexey V. Akimov; Dibyendu Mandal; Vladimir Y. Chernyak; Nikolai A. Sinitsyn



Journal of Chemical Physics | 2018

Phase separation and large deviations of lattice active matter

Stephen Whitelam; Katherine Klymko; Dibyendu Mandal


Collaboration


Dive into Dibyendu Mandal's collaborations.

Top Co-Authors


Jeffrey B. Weiss

University of Colorado Boulder


Alexey V. Akimov

Los Alamos National Laboratory


H. T. Quan

Los Alamos National Laboratory
