Featured Research

Disordered Systems And Neural Networks

Ising spin glass in a random network with a Gaussian random field

We investigate thermodynamic phase transitions in the joint presence of a spin glass (SG) and a random field (RF) using a random graph model that allows us to deal with the quenched disorder. The connectivity thereby becomes a controllable parameter in our theory, allowing us to ask how this description differs from the mean-field, i.e., fully connected, theory. We consider the random-network random field Ising model, in which both the spin exchange interactions and the RF are random variables following a Gaussian distribution. The results are obtained within the replica symmetric (RS) approximation, whose stability is analyzed using the two-replica method. This also places our work in the context of a broader discussion, namely RS stability as a function of the connectivity. In particular, our results show that for small connectivity there is a region at zero temperature where the RS solution remains stable above a given value of the magnetic field, regardless of the RF strength. Consequently, our results differ in important ways from the crossover between the RF and SG regimes predicted by the fully connected theory.
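
For orientation, a minimal sketch of the model (the abstract gives no explicit formulas, so the normalizations below are assumptions following standard conventions): Ising spins on a random graph of mean connectivity c, with Gaussian couplings on the edges and a Gaussian field on every site,

```latex
H = -\sum_{\langle ij \rangle} J_{ij}\, s_i s_j - \sum_i h_i\, s_i ,
\qquad s_i = \pm 1,
\qquad J_{ij} \sim \mathcal{N}(0, J^2/c), \quad h_i \sim \mathcal{N}(0, \Delta^2).
```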

Disordered Systems And Neural Networks

Isotope effects in x-ray absorption spectra of liquid water

The isotope effects in the x-ray absorption spectra of liquid water are studied by a many-body approach within electron-hole excitation theory. The molecular structures of both light and heavy water are modeled by path-integral molecular dynamics based on an advanced deep-learning technique, with the neural network trained on ab initio data obtained from SCAN density functional theory. The experimentally observed isotope effect in the x-ray absorption spectra is reproduced semiquantitatively by the theory. Compared to the spectrum of normal water, the blueshifted and less pronounced pre- and main-edges in heavy water reflect the fact that heavy water is more structured at short and intermediate range in the hydrogen-bond network. In contrast, the isotope effect on the spectrum is negligible at the post-edge, consistent with the identical long-range ordering of both liquids observed in diffraction experiments.
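
The abstract does not spell out how such spectra are assembled; as a generic, hedged illustration (all names and the broadening width are assumptions, not the paper's method), an absorption spectrum is typically built by broadening the computed electron-hole excitation energies weighted by their transition strengths:

```python
import numpy as np

def broaden(energies, strengths, grid, sigma=0.3):
    # Gaussian-broadened absorption spectrum from excitation energies (eV)
    # and squared transition amplitudes; sigma sets the broadening width.
    diff = grid[:, None] - energies[None, :]
    return (strengths * np.exp(-0.5 * (diff / sigma) ** 2)).sum(axis=1) / (
        sigma * np.sqrt(2 * np.pi)
    )
```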

Disordered Systems And Neural Networks

Kauzmann's paradox

The rapid decrease of the structural and vibrational entropy with decreasing temperature in undercooled liquids is explained in terms of the disappearance of local structural instabilities, which freeze in at the glass temperature as boson peak modes and low-temperature tunneling states. At the Kauzmann temperature, their density extrapolates to zero.
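
For context (standard background, not stated in the abstract): the Kauzmann temperature T_K is the point at which the excess entropy of the undercooled liquid over the crystal, extrapolated below the glass temperature, would vanish,

```latex
\Delta S(T_K) = S_{\text{liquid}}(T_K) - S_{\text{crystal}}(T_K) \longrightarrow 0 .
```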

Disordered Systems And Neural Networks

L² localization landscape for highly excited states

The localization landscape gives direct access to the localization of bottom-of-band eigenstates in non-interacting disordered systems. We generalize this approach to eigenstates at arbitrary energies in systems with or without internal degrees of freedom by introducing a modified L²-landscape, and we demonstrate its accuracy in a variety of archetypal models of Anderson localization in one and two dimensions. This L²-landscape function can be computed efficiently using hierarchical methods that allow one to evaluate the diagonal of a well-chosen Green function. We compare our approach to other landscape methods, bringing new insights into their strengths and limitations. Our approach is general and can in principle be applied to studies of both topological Anderson transitions and many-body localization.
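
The abstract does not give the precise definition of the modified landscape; as a hedged illustration of the underlying quantity (the diagonal of a well-chosen Green function), here is a dense-matrix sketch for the 1D Anderson model using the row norms of the resolvent (H − E)⁻¹ at a target energy E, a natural L²-type candidate rather than necessarily the paper's exact construction:

```python
import numpy as np

def anderson_hamiltonian(n, w, seed=0):
    # 1D Anderson model: nearest-neighbor hopping plus uniform on-site disorder.
    rng = np.random.default_rng(seed)
    h = np.diag(rng.uniform(-w / 2, w / 2, n))
    h += -np.eye(n, k=1) - np.eye(n, k=-1)
    return h

def l2_landscape(h, energy):
    # Row-wise L2 norm of the resolvent (H - E)^{-1}; large values flag sites
    # where eigenstates near E concentrate. Dense inversion is for illustration;
    # hierarchical solvers would target the diagonal of the squared resolvent.
    g = np.linalg.inv(h - energy * np.eye(h.shape[0]))
    return np.sqrt((g ** 2).sum(axis=1))

u = l2_landscape(anderson_hamiltonian(200, w=3.0), energy=0.5)
```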

Disordered Systems And Neural Networks

Large Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference, Activity Prediction, and Fluctuation-Induced Transitions

Here we unify the field-theoretical approach to neuronal networks with large deviation theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function and derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence, which enables data-driven inference of model parameters, Bayesian prediction of time series, and calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions.
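
A hedged sketch of the kind of prototypical random recurrent network with continuous-valued units the abstract refers to (the exact model and all parameters are assumptions): a standard rate network with Gaussian random couplings of variance g²/N and additive noise:

```python
import numpy as np

def simulate(n=200, g=1.5, sigma=0.1, dt=0.01, steps=5000, seed=0):
    # dx/dt = -x + J tanh(x) + noise, with couplings J_ij ~ N(0, g^2/N).
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(n), size=(n, n))
    x = rng.normal(size=n)
    traj = np.empty((steps, n))
    for t in range(steps):
        x = x + dt * (-x + J @ np.tanh(x)) + np.sqrt(dt) * sigma * rng.normal(size=n)
        traj[t] = x
    return traj
```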

Disordered Systems And Neural Networks

Large deviations of the Lyapunov exponent in 2D matrix Langevin dynamics with applications to one-dimensional Anderson Localization models

For the 2D matrix Langevin dynamics that corresponds to the continuous-time limit of products of 2×2 random matrices, the finite-time Lyapunov exponent can be written as an additive functional of the associated Riccati process, which follows its own Langevin dynamics on the infinite periodic ring. Its large deviation properties can thus be analyzed from two points of view that are ultimately equivalent by consistency but offer different perspectives. In the first approach, one starts from the level-2.5 large deviations for the joint probability of the empirical density and the empirical current of the Riccati process, and one performs the appropriate Euler-Lagrange optimization to compute the cumulant generating function of the Lyapunov exponent. In the second approach, this cumulant generating function is obtained from the spectral analysis of the appropriate tilted Fokker-Planck operator. The associated conditioned process, obtained via a generalization of Doob's h-transform, allows one to clarify the equivalence with the first approach. Finally, applications to one-dimensional Anderson localization models are described in order to obtain explicitly the first cumulants of the finite-size Lyapunov exponent.
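
As a concrete, hedged illustration of the discrete-time analogue (model details are assumptions): for the 1D Anderson model, the finite-size Lyapunov exponent is the average log-growth of a product of 2×2 transfer matrices, and the Riccati variable r_n = ψ_{n+1}/ψ_n turns it into an additive functional, λ = ⟨ln|r_n|⟩:

```python
import numpy as np

def lyapunov_anderson_1d(n=100_000, energy=0.0, w=1.0, seed=0):
    # Transfer-matrix recursion psi_{n+1} = (E - eps_n) psi_n - psi_{n-1};
    # the running log-norm of (psi_n, psi_{n-1}) gives the Lyapunov exponent.
    rng = np.random.default_rng(seed)
    v = np.array([1.0, 0.0])
    log_norm = 0.0
    for eps in rng.uniform(-w / 2, w / 2, n):
        v = np.array([(energy - eps) * v[0] - v[1], v[0]])
        s = np.hypot(v[0], v[1])
        log_norm += np.log(s)
        v /= s
    return log_norm / n  # finite-size Lyapunov exponent
```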

Disordered Systems And Neural Networks

Large systems of random linear equations with non-negative solutions: Characterizing the solvable and unsolvable phase

Large systems of linear equations are ubiquitous in science. Quite often, e.g., when considering population dynamics or chemical networks, the solutions must be non-negative. Recently, it has been shown that large systems of random linear equations exhibit a sharp transition from a phase where a non-negative solution exists with probability one to one where typically no such solution can be found. The critical line separating the two phases was determined by combining Farkas' lemma with the replica method. Here, we show that the same methods remain viable for characterizing the two phases away from criticality. To this end, we analytically determine the residual norm of the system in the unsolvable phase and a suitable measure of robustness of solutions in the solvable one. Our results are in very good agreement with numerical simulations.
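
A hedged numerical companion (the ensemble details are assumptions, not the paper's exact setup): for a random system Ax = b with the constraint x ≥ 0, the residual norm in the unsolvable phase can be probed with non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

def nonneg_residual(n=200, alpha=1.5, seed=0):
    # alpha = (number of equations) / (number of unknowns); when alpha is large,
    # a non-negative solution typically does not exist and the residual is positive.
    rng = np.random.default_rng(seed)
    m = int(alpha * n)
    A = rng.normal(size=(m, n))
    b = rng.normal(size=m)
    x, rnorm = nnls(A, b)  # minimizes ||Ax - b|| subject to x >= 0
    return rnorm
```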

Disordered Systems And Neural Networks

Learning DFT

We present an extension of reverse-engineered Kohn-Sham potentials, obtained from a density matrix renormalization group calculation, toward the construction of a density functional theory functional via deep learning. Instead of applying machine learning to the energy functional itself, we apply these techniques to the Kohn-Sham potentials. To this end, we develop a scheme to train a neural network to represent the mapping from local densities to Kohn-Sham potentials. Finally, we use the neural network to scale the simulation up to larger system sizes.
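
A minimal sketch of the kind of mapping described (network size, window width, and all names are assumptions for illustration): a small network that takes a local window of the density and predicts the Kohn-Sham potential at its center:

```python
import torch
import torch.nn as nn

# Hypothetical setup: each training sample is a window of 9 density values
# around a grid point; the target is the Kohn-Sham potential at that point.
model = nn.Sequential(
    nn.Linear(9, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(density_windows, vks_targets):
    # density_windows: (batch, 9), vks_targets: (batch, 1)
    optimizer.zero_grad()
    loss = loss_fn(model(density_windows), vks_targets)
    loss.backward()
    optimizer.step()
    return loss.item()
```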

Disordered Systems And Neural Networks

Learning Molecular Dynamics with Simple Language Model built upon Long Short-Term Memory Neural Network

Recurrent neural networks (RNNs) have led to breakthroughs in natural language processing and speech recognition, and hundreds of millions of people use such tools on a daily basis through smartphones, email servers, and other avenues. In this work, we show that such RNNs, specifically Long Short-Term Memory (LSTM) neural networks, can also be applied to capture the temporal evolution of typical trajectories arising in chemical and biological physics. Specifically, we use a character-level language model based on an LSTM, which learns a probabilistic model from one-dimensional stochastic trajectories generated by molecular dynamics simulations of a higher-dimensional system. We show that the model can not only capture the Boltzmann statistics of the system but also reproduce kinetics across a large spectrum of timescales. We demonstrate how the embedding layer, introduced originally for representing the contextual meaning of words or characters, exhibits here a nontrivial connectivity between different metastable states in the underlying physical system. We demonstrate the reliability of our model and interpretations through different benchmark systems and a single-molecule force spectroscopy trajectory for a multi-state riboswitch. We anticipate that our work represents a stepping stone in the understanding and use of RNNs for modeling and predicting the dynamics of complex stochastic molecular systems.
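
A hedged sketch of such a character-level model (layer sizes and all names are assumptions): the trajectory is discretized into state labels, which play the role of characters, and the network is trained to predict the next label:

```python
import torch
import torch.nn as nn

class TrajectoryLM(nn.Module):
    # Character-level language model over a discretized 1D trajectory:
    # each metastable-state label is one "character" (alphabet size n_states).
    def __init__(self, n_states, d_embed=16, d_hidden=64):
        super().__init__()
        self.embed = nn.Embedding(n_states, d_embed)
        self.lstm = nn.LSTM(d_embed, d_hidden, batch_first=True)
        self.head = nn.Linear(d_hidden, n_states)

    def forward(self, seq):
        # seq: (batch, time) integer state labels.
        h, _ = self.lstm(self.embed(seq))
        return self.head(h)  # logits for the next state at each time step
```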

Disordered Systems And Neural Networks

Learning What a Machine Learns in a Many-Body Localization Transition

We employ a convolutional neural network to explore the distinct phases in random spin systems, with the aim of understanding the specific features that the neural network chooses to identify the phases. With the energy spectrum normalized to the bandwidth as the input data, we demonstrate that a network with the smallest nontrivial kernel width selects the level spacing as the signature to distinguish the many-body localized phase from the thermal phase. We also study the performance of the neural network with an increased kernel width, based on which we find an alternative diagnostic for detecting phases from the raw energy spectrum of such a disordered interacting system.
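
As a hedged illustration (all sizes are assumptions): a minimal 1D convolutional classifier over the normalized energy spectrum; with kernel width 2, each filter sees pairs of consecutive levels, i.e., essentially the level spacings the abstract refers to:

```python
import torch.nn as nn

model = nn.Sequential(
    # Input: (batch, 1, n_levels), the spectrum normalized to the bandwidth.
    nn.Conv1d(1, 8, kernel_size=2),  # width-2 kernel: sees consecutive levels
    nn.ReLU(),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(8, 2),  # logits: many-body localized vs. thermal
)
```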
