Probing Criticality in Quantum Spin Chains with Neural Networks
A. Berezutskii, M. Beketov, D. Yudin, Z. Zimborás, and J. D. Biamonte

Moscow Institute of Physics and Technology, Dolgoprudny, Moscow Region 141700, Russia
Deep Quantum Laboratory, Skolkovo Institute of Science and Technology, Moscow 121205, Russia
Wigner Research Centre for Physics, Theoretical Physics Department, Budapest H-1525, Hungary
MTA-BME Lendület Quantum Information Theory Research Group, Budapest, Hungary
Mathematical Institute, Budapest University of Technology and Economics, Budapest, Hungary

E-mail: [email protected]
Abstract.
The numerical emulation of quantum systems often requires an exponential number of degrees of freedom, which translates to a computational bottleneck. Methods of machine learning have been used in adjacent fields for effective feature extraction and dimensionality reduction of high-dimensional datasets. Recent studies have revealed that neural networks are further suitable for the determination of macroscopic phases of matter and associated phase transitions, as well as for efficient quantum state representation. In this work, we address quantum phase transitions in quantum spin chains, namely the transverse field Ising chain and the anisotropic XY chain, and show that even neural networks with no hidden layers can be effectively trained to distinguish between magnetically ordered and disordered phases. Our neural networks predict the corresponding crossovers that finite-size systems undergo. Our results extend to a wide class of interacting quantum many-body systems and illustrate the wide applicability of neural networks to many-body quantum physics.
Submitted to:
Journal of Physics: Complexity
1. Introduction
The concept of deep learning [1] has attracted dramatic interest over the last decade. First applied in the domain of image and natural speech recognition, algorithms for machine learning have recently shown their utility in the statistical mechanics of interacting classical and quantum systems [2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]. Solving a quantum many-body problem often implies a coarse-graining procedure to remove redundant degrees of freedom from the short-range, or high-energy, sector of the theory. In this case, a proper elucidation of the low-energy properties of the system, or of the type of its long-range ordering, encodes the macroscopic behavior. In turn, the methodology of machine learning on multidimensional and typically non-structured datasets is inevitably linked to effective approaches to dimensionality reduction, thereby yielding a powerful technique for the detailed analysis of classical and quantum models in many-body physics [18, 19]. Practical application of neural networks in the context of both supervised and unsupervised machine learning has now become commonplace for testing thermal, quantum, and topological phase transitions [2, 3, 4, 5, 6, 7, 8, 9, 10, 11], as well as for formulating effective variational wave function ansätze [12, 13, 14, 15, 16, 17]. The application of machine learning to quantum-information problems has also received significant interest recently, promising to directly probe the entanglement entropy [20, 21, 22] as well as other properties. The utility of machine learning methods for quantum information purposes is driven by their great success in condensed matter physics [23, 24, 25, 26, 27, 28, 5, 29, 30, 31, 32, 33, 34, 35, 36] and computational many-body methods [37, 38, 39, 40, 41, 42]. In this study, we employ a specific machine learning technique to create a low-dimensional representation of microscopic states, relevant for macroscopic phase identification and for probing phase transitions.
More specifically, we explore phase transitions in the transverse field Ising chain and the anisotropic XY model and demonstrate that even the simplest possible neural network architecture, a perceptron binary classifier with no hidden neurons, is capable of keeping track of the macroscopic phases as a function of, e.g., the external magnetic field or the anisotropy parameter, without any prior knowledge.
2. Model systems
One-dimensional spin models represent strongly correlated quantum systems that can be rigorously approached at equilibrium [43]. Certain non-equilibrium properties can also be extracted [44]. In the following, we focus on the one-dimensional ferromagnetic transverse field Ising model (TFIM). The TFIM naturally appears upon solving a classical two-dimensional Ising model with ferromagnetic-type nearest-neighbor exchange coupling, and its exact solution dates back to the original works [45, 46, 47]. Generally, the TFIM of L spins on a chain with open boundary conditions is specified by

H = −J Σ_{i=1}^{L−1} σ^z_i σ^z_{i+1} − τ Σ_{i=1}^{L} σ^x_i,  (1)

which represents a 2^L × 2^L matrix, with σ^α_i (α = x, y, z) being a Pauli matrix acting on site i, and J and τ standing for the strength of the exchange coupling and the external magnetic field, respectively. Interestingly, despite its relative simplicity, this model has been used to describe intricate physics, e.g., the order-disorder transitions in ferroelectric crystals of KH2PO4. At zero temperature, quantum fluctuations may lead to a restructuring of the ground state, which is manifested by a certain non-analyticity in the ground state energy of the quantum Hamiltonian. For the case of the Hamiltonian (1), when there is no magnetic field present (τ = 0), the ground state configuration is purely determined by the exchange interaction, the first term in Equation (1), which favors collinear magnetic ordering. For J > 0, the ferromagnetic state is energetically preferable, meaning that all magnetic moments point in the same direction, ⟨σ^z_i⟩ = +1 (or −1). Increasing the field towards τ = τ_c makes the system susceptible to spin flips, with all the spins aligned in the x direction in the limit τ → ∞, i.e., disordered in the σ^z basis.

The one-dimensional TFIM can be worked out analytically by virtue of the Jordan-Wigner transformation that maps an interacting spin model onto that of free spin-polarized fermions [47, 48]. The exact solution unambiguously demonstrates a continuous quantum phase transition (QPT) upon passing through the critical field τ_c = 1 (in units of J), separating the magnetically ordered ferromagnetic (τ < τ_c) and disordered paramagnetic (τ > τ_c) states. Although there is no exact analytical solution in higher-dimensional systems, a quantum phase transition can be clearly detected [48]. It is worth noting that the phase diagram of the one-dimensional TFIM is very similar to that of a two-dimensional classical Ising model at finite temperature with a temperature-driven phase transition. Interestingly, this dualism has a strict mathematical form corresponding to the so-called Suzuki-Trotter decomposition, which maps a d-dimensional quantum model to a (d + 1)-dimensional classical one [49].

The XY model is yet another well-known quantum spin lattice model of magnetism. One can arrive at the isotropic version of this model by switching off the ZZ couplings in the Heisenberg Hamiltonian. In turn, the anisotropic XY model is a generalization of it in the sense that the interaction strength in the XY plane is no longer isotropic. In this study, we limit ourselves to the case when there is no field transverse to the interaction plane.
The Hamiltonian of the model is thus given by

H = −J Σ_{i=1}^{L−1} [ (1 + γ)/2 σ^x_i σ^x_{i+1} + (1 − γ)/2 σ^y_i σ^y_{i+1} ],  (2)

where γ is the anisotropy parameter, usually restricted to −1 ≤ γ ≤ 1, and J is the coupling strength, which we set to 1 hereafter. If one sets γ = 0, the fully isotropic case, which possesses an additional symmetry [H, Σ_i σ^z_i] = 0, is restored. On the other hand, it is also well known that in the opposite case, i.e., γ = 1, the ground state possesses long-range Néel order, which yields

σ^x_i |σ⟩ = (−1)^i |σ⟩  (3)

and

σ^y_i |σ⟩ = (−1)^i |σ⟩  (4)

for γ = −1. As γ decreases from 1 to −1, the x- and y-components begin to compete. The phase diagram is thus given by x- and y-ordered states for γ = 1 and γ = −1, respectively; the system is critical at γ = 0 and undergoes a second-order phase transition at this point, while the gap continuously vanishes [50, 51].
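For concreteness, Hamiltonians of the form (1) and (2) can be assembled numerically from Kronecker products of Pauli matrices. The sketch below is our own illustration, not the authors' code: it uses dense NumPy arrays for readability (although, as discussed later, the matrices are in fact sparse), and the function names are assumptions introduced here.

```python
import numpy as np

# Pauli matrices in the z-basis
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]])
SZ = np.array([[1, 0], [0, -1]], dtype=complex)
ID = np.eye(2, dtype=complex)

def op_at(op, i, L):
    """Kronecker product placing the single-site operator `op` on site i
    of an L-site chain, with identities on every other site."""
    out = np.array([[1.0 + 0j]])
    for k in range(L):
        out = np.kron(out, op if k == i else ID)
    return out

def tfim_hamiltonian(L, J=1.0, tau=1.0):
    """Eq. (1): H = -J sum_i sz_i sz_{i+1} - tau sum_i sx_i, open chain."""
    H = np.zeros((2**L, 2**L), dtype=complex)
    for i in range(L - 1):
        H -= J * op_at(SZ, i, L) @ op_at(SZ, i + 1, L)
    for i in range(L):
        H -= tau * op_at(SX, i, L)
    return H

def xy_hamiltonian(L, J=1.0, gamma=0.5):
    """Eq. (2): anisotropic XY chain with no transverse field."""
    H = np.zeros((2**L, 2**L), dtype=complex)
    for i in range(L - 1):
        H -= J * (0.5 * (1 + gamma) * op_at(SX, i, L) @ op_at(SX, i + 1, L)
                  + 0.5 * (1 - gamma) * op_at(SY, i, L) @ op_at(SY, i + 1, L))
    return H

def ground_state(H):
    """Lowest eigenvalue and eigenvector of a Hermitian matrix."""
    vals, vecs = np.linalg.eigh(H)
    return vals[0], vecs[:, 0]
```

For small L this reproduces the expected limits, e.g., the τ = 0 ferromagnetic ground-state energy −J(L − 1) of Equation (1).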
3. Methodology
The complexity of a generic quantum many-body problem grows exponentially with the size of the system (using the best known methods), making the available numerical routines computationally demanding. Since machine learning has been specifically designed to coarse-grain certain information while maintaining the relevant and unique features of a dataset (reminiscent of the renormalization group formalism in statistical and high-energy physics [52]), it appears to be perfectly suited for the identification of classical and quantum phases [25, 53, 54]. Indeed, sampled spin configurations can be mapped to either binary numbers or black-and-white pixels, which can be further classified in the form of macroscopic configurations, representing the class of problems that machine learning has been routinely used for. However, for quantum many-body systems we typically do not have predefined labels, so the use of unsupervised learning is favored. Within this paradigm we search for clusterization or associative rules that govern the behavior of a system. Unsupervised learning can also take measurement data and essentially reconstruct the wave function from individual images or snapshots. These reconstruction techniques based on machine learning are now being studied and compared to traditional techniques based on quantum state and quantum process tomography [8, 55, 56, 26, 57]. The advantage of using machine learning algorithms for the exploration of both classical and quantum phase transitions is associated with finding certain features related to symmetry breaking in microscopic configurations. In particular, phase transitions in magnetically ordered systems result in spin directions being randomized by the temperature, and the corresponding transition temperature can be detected as the point where the magnetization drops.
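To illustrate the last point, the drop of the magnetization across a transition can be read directly off sampled configurations. The helper below is a hypothetical sketch of such a diagnostic, not part of the authors' pipeline:

```python
import numpy as np

def magnetization_per_spin(configs):
    """Mean absolute z-magnetization per spin over a batch of sampled
    configurations; each row holds the +1/-1 spin projections of one chain."""
    return float(np.abs(configs.mean(axis=1)).mean())
```

Ordered samples give a value close to 1, while samples with randomized spin directions give a value close to 0.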
When considering quantum phase transitions, one typically tracks how the ground state changes as a function of the driving parameter. In this section, we briefly describe the sampling routine we used for the interacting spin models described by the Hamiltonians (1) and (2). Note that the Hamiltonians (1) and (2) are sparse matrices with most of the elements being zero, as schematically shown in Figure 1 for a system of L = 7 spins. Figure 1.
Heatmap of the matrix that corresponds to the one-dimensional quantum TFIM with the Hamiltonian (1) and L = 7 spins at criticality τ/J = 1.

For small systems, exact diagonalization of the Hamiltonians of Equations (1) and (2) is possible. Let the 2^L-dimensional vector |g⟩ be the ground state of the system. In the computational basis, the vector

|g⟩ = Σ_{i_1, i_2, ..., i_L = ↑,↓} α_{i_1 i_2 ... i_L} |i_1⟩|i_2⟩ ... |i_L⟩,  (5)

is purely determined by the 2^L complex-valued decomposition components α_{i_1 i_2 ... i_L} in the basis |i_k⟩ ∈ {|↑⟩, |↓⟩}, with k = 1, ..., L, which give the probability distribution p_{i_1 i_2 ... i_L} = |α_{i_1 i_2 ... i_L}|² of a particular spin configuration |i_1⟩|i_2⟩ ... |i_L⟩, which we refer to as a bitstring and later represent explicitly as strings of 0's and 1's. Thus, sampling the physical system specified by the Hamiltonian (1) amounts to sampling each bitstring with the corresponding probability p_{i_1 i_2 ... i_L}.

We use a neural network architecture that consists of an input layer and one output neuron, corresponding to a binary classifier; there are no hidden layers. The sampled bitstrings serve as input data. The output is prescribed to take the value 0 when an input spin configuration is drawn from the ground state at τ = 0.01 (γ = −0.99 for the XY model), while for configurations drawn from the ground state at τ_i (γ_i) the neuron is prescribed to take the value 1. The neural network architecture used is shown in Figure 2.
Figure 2.
The neural network design. W_i denotes the weights connecting the input layer neurons with the output neuron, σ_i denotes a spin value in the z-basis fed into the input layer, and the solid blue line denotes the sigmoid activation function used for the output neuron.

The linear combination of the spins' z-projections σ_i is fed into the neural network via the input layer, followed by a nonlinear activation of the output neuron,

O = σ( Σ_{i=1}^{L} W_i σ_i + b ),  (6)

with σ(x) = 1/(1 + e^{−x}) being the sigmoid function, and the binary cross-entropy

H(p) = − Σ_{i=1}^{N_train} [ y_i log(p(y_i)) + (1 − y_i) log(1 − p(y_i)) ],  (7)

serving as the loss function. Such a simple form of the neural network architecture results in high computational speed. The neural network outcome is the probability that a given input configuration belongs to one of the two phases; the network parameters are optimized with the RMSProp algorithm [62].

In our numerical simulations, for chains of L = 20 spins we explore the model described by Equation (1) throughout the region 0.01 J ≤ τ ≤ 2 J with D = 40 steps, τ ∈ {τ_i}_{i=1}^{D}, and N = 10 spin configurations to be sampled for each value of τ_i. Afterwards, a feed-forward neural network N_i is trained to classify the bitstrings sampled for τ = 0.01 and for τ_i. Finally, we end up with D − 1 pairs (P_i, τ_i), with P_i ∈ (0, 1) being the mean output of the neural network evaluated on the samples drawn from the probability distribution given by the ground state of H(τ_i). In what follows, we show that the value of P changes dramatically with respect to τ, signalling a phase transition. We apply a similar procedure to the anisotropic XY model with the anisotropy parameter −1 ≤ γ ≤ 1 and the reference point γ = −0.99. The result is then averaged over 40 runs to remove possible effects caused by the random initialization of the neural networks' parameters (displayed as shadows in the plots).
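The sampling and training protocol above can be sketched as follows. This is a simplified, single-classifier illustration under our own assumptions (plain NumPy; `sample_bitstrings`, `train_perceptron`, and `mean_output` are names we introduce here), not the authors' implementation:

```python
import numpy as np

def sample_bitstrings(probs, n, L, rng):
    """Draw n computational-basis states with probabilities p = |alpha|^2
    (Eq. 5) and return them as rows of z-projections; bit 0 maps to spin +1."""
    idx = rng.choice(len(probs), size=n, p=probs)
    bits = (idx[:, None] >> np.arange(L)[::-1]) & 1
    return 1.0 - 2.0 * bits

def train_perceptron(x0, x1, epochs=500, lr=0.05, decay=0.9, eps=1e-8):
    """Single-output-neuron classifier (Eq. 6) trained on the binary
    cross-entropy loss (Eq. 7) with an RMSProp-style update [62].
    x0: configurations labelled 0 (reference point); x1: labelled 1."""
    X = np.vstack([x0, x1])
    y = np.concatenate([np.zeros(len(x0)), np.ones(len(x1))])
    w, b = np.zeros(X.shape[1]), 0.0
    vw, vb = np.zeros_like(w), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid output O
        gw = X.T @ (p - y) / len(y)              # BCE gradient w.r.t. weights
        gb = float(np.mean(p - y))               # ... and w.r.t. the bias
        vw = decay * vw + (1 - decay) * gw**2    # RMSProp running averages
        vb = decay * vb + (1 - decay) * gb**2
        w -= lr * gw / (np.sqrt(vw) + eps)
        b -= lr * gb / (np.sqrt(vb) + eps)
    return w, b

def mean_output(w, b, x):
    """Mean network output P over a batch of sampled configurations."""
    return float(np.mean(1.0 / (1.0 + np.exp(-(x @ w + b)))))
```

Repeating this for samples drawn at each τ_i against the reference point, and recording `mean_output` on fresh samples, yields pairs (P_i, τ_i) of the kind plotted in the results section.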
4. Results
Below, we present and discuss the results of our numerical simulations, demonstrating how the neural network architecture and the corresponding algorithm described in the previous section are capable of probing the phase crossover point for the described models. In Figure 3, we show how our setup performs for a TFIM on an open chain of L = 20 spins. As expected, the neural network learns the order parameter, owing to the linearity of the latter as a function of spin projections. Note, however, that while the resulting curve is typical of a transverse magnetization curve for the TFIM, there was no information about the x-projections of the spin measurements in our setup, only the measurements in the z-basis.

Figure 3. The output of trained neural networks as a function of the transverse magnetic field τ, for L = 20 spins on a TFIM chain with open boundary conditions.

Unlike in previous studies, for example [63], the simplicity of the neural network used for the simulations makes direct visualization of the weights straightforward, owing to their vectorial nature. Figure 4 clearly displays the crossover in the neighborhood of criticality, making these results intuitively clear and interpretable, in contrast to usual deep learning routines [64, 65]. Each vertical column in Figure 4 corresponds to the set of coefficients by which the z-components of the spins are multiplied before the whole sum is passed to the activation function of the output neuron. Thus, the model effectively mimics the z-projections of the spin configurations for a given transverse magnetic field value τ. The latter explains why the columns in the heatmap are uniform in the ferromagnetic limit and take random values in the disordered phase. Note that the boundary coefficients differ because of the open boundary conditions.

Figure 4. Heatmap of the weights W_i of the neural networks for a TFIM chain of L = 20 spins with open boundary conditions, as a function of the magnetic field strength.

In Figure 5, we show the result for an anisotropic XY chain of L = 20 spins.

Figure 5. The output of trained neural networks as a function of the anisotropy parameter γ for L = 20 spins on an anisotropic XY chain with open boundary conditions.

In this plot, one can clearly see the phase crossover induced by the change of γ, which is a sign of the well-studied anisotropy-induced phase transition in an infinite system [66], similar to the phase transition induced by the critical value of the magnetic field. Again, while our algorithm is given information only about the z-components of the spins, it is capable of exposing a phase crossover induced by the anisotropy in the x-y plane.
5. Conclusion
In this paper, we have considered the simplest neural network architecture, with no hidden layers present, and applied it to study the finite-size phase crossovers in the quantum transverse field Ising model and the quantum anisotropic XY model on a one-dimensional chain. We were able to distinguish the regions of different phases using neural networks without prior knowledge of the phase diagram, by observing the corresponding phase boundary crossover in a finite-size system. The relative simplicity of the machine learning setup allowed us to visualize the weights of the corresponding neural network and unambiguously relate them to the configurations of the different spin orderings.
6. Acknowledgements
The authors are thankful to Anastasiia Pervishko and Sebastian Wetzel for fruitful discussions.
7. References

[1] Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436–444, 2015.
[2] W. Zhang, J. Liu, and T.-C. Wei. Machine learning of phase transitions in the percolation and XY models. Phys. Rev. E, 99:032142, Mar 2019.
[3] F. Schindler, N. Regnault, and T. Neupert. Probing many-body localization with neural networks.
Phys. Rev. B , 95:245134, Jun 2017.[4] D.-L. Deng, X. Li, and S. Das Sarma. Machine learning topological states.
Phys. Rev. B ,96:195145, Nov 2017.[5] P. Zhang, H. Shen, and H. Zhai. Machine learning topological invariants with neural networks.
Phys. Rev. Lett. , 120:066401, Feb 2018.[6] J. Venderley, V. Khemani, and E.-A. Kim. Machine learning out-of-equilibrium phases of matter.
Phys. Rev. Lett. , 120:257204, Jun 2018.[7] Ye-Hua Liu and Evert PL Van Nieuwenburg. Discriminative cooperative networks for detectingphase transitions.
Phys. Rev. Lett. , 120(17):176401, 2018.[8] L. Wang. Discovering phase transitions with unsupervised learning.
Phys. Rev. B , 94:195105,Nov 2016.[9] W. Hu, R. R. P. Singh, and R. T. Scalettar. Discovering phases, phase transitions, and crossoversthrough unsupervised machine learning: A critical examination.
Phys. Rev. E , 95:062122, Jun2017.[10] K. Ch’ng, J. Carrasquilla, R. G. Melko, and E. Khatami. Machine learning phases of stronglycorrelated fermions.
Phys. Rev. X , 7:031038, Aug 2017.[11] K. Ch’ng, N. Vazquez, and E. Khatami. Unsupervised machine learning account of magnetictransitions in the Hubbard model.
Phys. Rev. E , 97:013306, Jan 2018.[12] G. Carleo and M. Troyer. Solving the quantum many-body problem with artificial neural networks.
Science , 355(6325):602–606, 2017.[13] I. Glasser, N. Pancotti, M. August, I. D. Rodriguez, and J. I. Cirac. Neural-network quantumstates, string-bond states, and chiral topological states.
Phys. Rev. X , 8:011006, Jan 2018.[14] Z. Cai and J. Liu. Approximating quantum many-body wave functions using artificial neuralnetworks.
Phys. Rev. B, 97:035116, Jan 2018.
[15] Juan Carrasquilla, Giacomo Torlai, Roger G Melko, and Leandro Aolita. Reconstructing quantum states with generative models. Nature Machine Intelligence, 1(3):155–161, 2019.
[16] Mohamed Hibat-Allah, Martin Ganahl, Lauren E Hayward, Roger G Melko, and Juan Carrasquilla. Recurrent neural network wavefunctions. arXiv preprint arXiv:2002.02973, 2020.
[17] Matthew JS Beach, Isaac De Vlugt, Anna Golubeva, Patrick Huembeli, Bohdan Kulchytskyy, Xiuzhe Luo, Roger G Melko, Ejaaz Merali, and Giacomo Torlai. QuCumber: wavefunction reconstruction with neural networks.
SciPost Physics, 7(1):009, 2019.
[18] Giuseppe Carleo, Ignacio Cirac, Kyle Cranmer, Laurent Daudet, Maria Schuld, Naftali Tishby, Leslie Vogt-Maranto, and Lenka Zdeborová. Machine learning and the physical sciences.
Rev.Mod. Phys. , 91:045002, Dec 2019.[19] Juan Carrasquilla. Machine learning for quantum matter. arXiv preprint arXiv:2003.11040 , 2020.[20] G. Torlai, G. Mazzola, J. Carrasquilla, M. Troyer, R. Melko, and G. Carleo. Neural-networkquantum state tomography.
Nat. Phys. , 14(5):447–450, 2018.[21] M. Koch-Janusz and Z. Ringel. Mutual information, neural networks and the renormalizationgroup.
Nat. Phys. , 14(6):578–582, 2018.[22] A. Rocchetto, E. Grant, S. Strelchuk, G. Carleo, and S. Severini. Learning hard quantumdistributions with variational autoencoders. npj Quantum Inf. , 4(1):28, 2018.[23] L.-F. Arsenault, A. Lopez-Bezanilla, O. A. von Lilienfeld, and A. J. Millis. Machine learning formany-body physics: The case of the Anderson impurity model.
Phys. Rev. B , 90:155136, Oct2014.[24] G. Torlai and R. G. Melko. Learning thermodynamics with Boltzmann machines.
Phys. Rev. B ,94:165134, Oct 2016.[25] J. Carrasquilla and R. G. Melko. Machine learning phases of matter.
Nat. Phys. , 13(5):431, 2017.[26] E. P. L. van Nieuwenburg, Y.-H. Liu, and S. D. Huber. Learning phase transitions by confusion.
Nat. Phys. , 13(5):435, 2017.[27] S. J. Wetzel. Unsupervised learning of phase transitions: From principal component analysis tovariational autoencoders.
Phys. Rev. E , 96:022140, Aug 2017.[28] Hiroki Saito. Solving the Bose-Hubbard model with machine learning.
Journal of the PhysicalSociety of Japan , 86(9):093001, 2017.[29] K. Mills and I. Tamblyn. Deep neural networks for direct, featureless learning through observation:The case of two-dimensional spin models.
Phys. Rev. E , 97:032119, Mar 2018.[30] K. Choo, G. Carleo, N. Regnault, and T. Neupert. Symmetries and many-body excitations withneural-network quantum states.
Phys. Rev. Lett. , 121:167204, Oct 2018.[31] M. Bukov, A. G. R. Day, D. Sels, P. Weinberg, A. Polkovnikov, and P. Mehta. Reinforcementlearning in different phases of quantum control.
Phys. Rev. X , 8:031086, Sep 2018.[32] Y.-H. Liu and E. P. L. van Nieuwenburg. Discriminative cooperative networks for detecting phasetransitions.
Phys. Rev. Lett. , 120:176401, Apr 2018.[33] A. A. Shirinyan, V. K. Kozin, J. Hellsvik, M. Pereiro, O. Eriksson, and D. Yudin. Self-organizingmaps as a method for detecting phase transitions and phase identification.
Phys. Rev. B ,99:041108, Jan 2019.[34] L. Burzawa, S. Liu, and E. W. Carlson. Classifying surface probe images in strongly correlatedelectronic systems via machine learning.
Phys. Rev. Materials , 3:033805, Mar 2019.[35] Tom Westerhout, Nikita Astrakhantsev, Konstantin S Tikhonov, Mikhail I Katsnelson, andAndrey A Bagrov. Generalization properties of neural network approximations to frustratedmagnet ground states.
Nature Communications , 11(1):1–8, 2020.[36] Alexey Uvarov, Jacob Biamonte, and Dmitry Yudin. Variational Quantum Eigensolver forFrustrated Quantum Systems. arXiv e-prints , page arXiv:2005.00544, May 2020.[37] L. Huang and L. Wang. Accelerated Monte Carlo simulations with restricted Boltzmann machines.
Phys. Rev. B , 95:035105, Jan 2017.[38] Y. Nagai, H. Shen, Y. Qi, J. Liu, and L. Fu. Self-learning Monte Carlo method: Continuous-timealgorithm.
Phys. Rev. B, 96:161102, Oct 2017.
[39] H. Suwa, J. S. Smith, N. Lubbers, C. D. Batista, G.-W. Chern, and K. Barros. Machine learning for molecular dynamics with strongly correlated electrons. Phys. Rev. B, 99:161107, Apr 2019.
[40] Isaac JS De Vlugt, Dmitri Iouchtchenko, Ejaaz Merali, Pierre-Nicholas Roy, and Roger G Melko. Reconstructing quantum molecular rotor ground states. arXiv preprint arXiv:2003.14273, 2020.
[41] EM Inack, GE Santoro, L Dell'Anna, and S Pilati. Projective quantum Monte Carlo simulations guided by unrestricted neural network states.
Physical Review B, 98(23):235145, 2018.
[42] B McNaughton, MV Milošević, A Perali, and S Pilati. Boosting Monte Carlo simulations of spin glasses using autoregressive neural networks. arXiv preprint arXiv:2002.04292, 2020.
[43] S. A. Pikin and V. M. Tsukernik. The thermodynamics of linear spin chains in a transverse magnetic field.
J. Exp. Theor. Phys. , 23(5), 1966.[44] U. Brandt and K. Jacoby. The transverse correlation function of anisotropic X − Y -chains: Exactresults at T = ∞ . Z. Phys. B , 26(3):245–252, 1977.[45] S. Katsura. Statistical mechanics of the anisotropic linear Heisenberg model.
Phys. Rev. ,127(5):1508, 1962.[46] Theodore D Schultz, Daniel C Mattis, and Elliott H Lieb. Two-dimensional ising model as asoluble problem of many fermions.
Reviews of Modern Physics , 36(3):856, 1964.[47] Pierre Pfeuty. The one-dimensional ising model with a transverse field.
ANNALS of Physics ,57(1):79–90, 1970.[48] S. Sachdev.
Quantum phase transitions . Cambridge University Press, 2011.[49] M. Suzuki. Generalized Trotter’s formula and systematic approximants of exponential operatorsand inner derivations with applications to many-body problems.
Commun. Math. Phys. ,51(2):183–190, 1976.[50] Elliott Lieb, Theodore Schultz, and Daniel Mattis. Two soluble models of an antiferromagneticchain.
Annals of Physics , 16(3):407–466, 1961.[51] Qiang Luo, Jize Zhao, and Xiaoqun Wang. Fidelity susceptibility of the anisotropic xy model:The exact solution.
Physical Review E , 98(2):022106, 2018.[52] Pankaj Mehta and David J Schwab. An exact mapping between the variational renormalizationgroup and deep learning. arXiv preprint arXiv:1410.3831 , 2014.[53] A. Tanaka and A. Tomiya. Detection of phase transition via convolutional neural networks.
J.Phys. Soc. Jpn. , 86(6):063001, 2017.[54] A. Morningstar and R. G Melko. Deep learning the Ising model near criticality.
J. Mach. Learn.Res. , 18:5975, 2018.[55] P. Broecker, F. F. Assaad, and S. Trebst. Quantum phase recognition via unsupervised machinelearning. arXiv preprint arXiv:1707.00663 , 2017.[56] P. Huembeli, A. Dauphin, and P. Wittek. Identifying quantum phase transitions with adversarialneural networks.
Phys. Rev. B , 97(13):134109, 2018.[57] Adriano Macarone Palmieri, Egor Kovlakov, Federico Bianchi, Dmitry Yudin, Stanislav Straupe,Jacob D. Biamonte, and Sergei Kulik. Experimental neural network enhanced quantumtomography. npj Quantum Information , 6(1), Feb 2020.[58] M. Vojta. Quantum phase transitions.
Rep. Prog. Phys. , 66(12):2069, 2003.[59] J. Tsuda, Y. Yamanaka, and H. Nishimori. Energy gap at first-order quantum phase transitions:An anomalous case.
J. Phys. Soc. Jpn. , 82(11):114004, 2013.[60] L. C. Venuti and P. Zanardi. Quantum critical scaling of the geometric tensors.
Phys. Rev. Lett. ,99(9):095701, 2007.[61] B. Damski. Fidelity approach to quantum phase transitions in quantum Ising model. In
QuantumCriticality in Condensed Matter: Phenomena, Materials and Ideas in Theory and Experiment ,pages 159–182. World Scientific, 2016.[62] Geoffrey Hinton, Nitish Srivastava, and Kevin Swersky. Neural networks for machine learninglecture 6a overview of mini-batch gradient descent.
Coursera lecture slides , 2012.[63] S. Arai, M. Ohzeki, and K. Tanaka. Deep neural network detects quantum phase transition.
J. Phys. Soc. Jpn., 87(3):033001, 2018.
[64] G. Montavon, W. Samek, and K.-R. Müller. Methods for interpreting and understanding deep neural networks. Digital Signal Processing, 73:1–15, 2018.
[65] K. Liu, J. Greitemann, and L. Pollet. Learning multiple order parameters with interpretable machines.
Phys. Rev. B, 99:104410, Mar 2019.
[66] HT Quan. Finite-temperature scaling of magnetic susceptibility and the geometric phase in the XY spin chain.