Information Entropy of Liquid Metals
M. C. Gao† and M. Widom∗,‡
†National Energy Technology Laboratory, Albany, OR 97321, USA; AECOM, P.O. Box 1959, Albany, OR 97321, USA
‡Department of Physics, Carnegie Mellon University, Pittsburgh, PA 15217, USA
E-mail: [email protected]
Phone: 412-268-7645

Abstract
Correlations reduce the configurational entropies of liquids below their ideal gas limits. By means of first principles molecular dynamics simulations, we obtain accurate pair correlation functions of liquid metals, then subtract the mutual information content of these correlations from the ideal gas entropies to predict the absolute entropies over a broad range of temperatures. We apply this method to liquid aluminum and copper and demonstrate good agreement with experimental measurements, then we apply it to predict the entropy of a liquid aluminum-copper alloy. Corrections due to electronic entropy and many-body correlations are discussed.
Introduction
The remarkable equivalence of information and entropy, as recognized by Shannon and Jaynes, implies that the atomic coordinates of a solid or liquid contain all the information that is needed to calculate its configurational entropy. Qualitatively, ordered structures are fully described with little information. For example, specifying a crystal lattice and its atomic basis uniquely determines the positions of infinitely many atoms in a crystallographic structure using a finite amount of information, so the entropy per atom vanishes. Meanwhile, a disordered structure requires separately specifying information about each atom, which implies a finite entropy per atom. For example, specifying the distribution of chemical species in a random equiatomic binary solid solution requires ln 2 per atom, independent of the structure and the interactions.

Given a distribution of discrete states i with probabilities p_i, the expected information required to specify the actual state is

S/k_B = −Σ_i p_i ln p_i.   (1)

By choosing the natural logarithm and assigning units of k_B we identify the information as entropy, as suggested by von Neumann. In quantum statistical mechanics we take the Boltzmann probability distribution, p_i = exp(−E_i/k_B T)/Z, with the partition function Z as the normalizing factor. Classically, the distribution becomes continuous. In the canonical ensemble the N-particle entropy in volume V becomes

S_N/k_B = −(1/N!) ∫ ∏_i dr_i dp_i f_N ln(h^{3N} f_N),   (2)

where f_N(r_1, p_1, ..., r_N, p_N) is the N-body probability density, as a function of the atomic positions r_i and momenta p_i. This expression, including the factors of Planck's constant h, can be derived as the high temperature limit of the quantum expression Eq. (1).

Applying Eq. (2) to an uncorrelated fluid of density ρ = N/V yields the entropy per atom of the classical ideal gas,

S_ideal/k_B = 5/2 − ln(ρΛ³).   (3)

The 5/2 includes 3/2 from the momentum integrals of the Maxwell-Boltzmann distribution f(p) = ρ(2πmk_B T)^{−3/2} exp(−p²/2mk_B T), plus an additional 1 from the Stirling approximation ln N! ≈ N ln N − N.
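The quantities just defined can be checked numerically. The sketch below evaluates the discrete entropy of Eq. (1) for an equiatomic binary distribution, and the ideal gas entropy of Eq. (3) for aluminum; the number density and temperature are illustrative assumed values, not results taken from this paper.

```python
import numpy as np

# Physical constants (SI)
h = 6.62607015e-34       # Planck constant, J s
kB = 1.380649e-23        # Boltzmann constant, J/K
amu = 1.66053906660e-27  # atomic mass unit, kg

def shannon_entropy(p):
    """Eq. (1): expected information of a discrete distribution, in units of kB."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p, where=p > 0, out=np.zeros_like(p))))

# A random equiatomic binary solid solution costs ln 2 per atom.
s_mix = shannon_entropy([0.5, 0.5])

def s_ideal_per_atom(rho_per_A3, mass_amu, T):
    """Eq. (3): S_ideal/kB = 5/2 - ln(rho * Lambda^3)."""
    lam = h / np.sqrt(2 * np.pi * mass_amu * amu * kB * T)  # de Broglie wavelength, m
    lam_A = lam * 1e10                                      # convert to Angstrom
    return 2.5 - np.log(rho_per_A3 * lam_A**3)

# Assumed illustrative values for liquid Al near 1000 K:
# number density ~0.053 atoms/Angstrom^3, atomic mass 26.98 amu.
s_al = s_ideal_per_atom(0.053, 26.98, 1000.0)
print(f"mixing entropy per atom = {s_mix:.3f} kB")
print(f"S_ideal/kB for Al(l) at 1000 K = {s_al:.2f}")
```

Multiplying the dimensionless result by the gas constant R = 8.314 J/K/mol converts it to the molar units used in the figures below.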
The quantum de Broglie wavelength Λ = h/√(2πmk_B T) diverges at low T, so this classical S_ideal approaches −∞. However, the quantization of energy levels in a finite volume yields the low temperature limit S → 0 as T → 0. Thus S_ideal is an absolute entropy, consistent with the conventional choice of S = 0 at T = 0.∗

∗Still, S_ideal → −∞ when we take the thermodynamic limit of infinite volume prior to the low temperature limit T → 0.
However, the ideal gas is not a suitable model for real matter at low temperature: models with a low density of states at low energy (e.g. harmonic solids) exhibit vanishing low temperature entropy, consistent with the usual third law of thermodynamics. Correlations in real liquids can be expressed through more realistic n-body distribution functions, g^(n)_N, writing the entropy as

S_N/Nk_B = s_1 + s_2 + s_3 + ...,   (4)

with the n-body terms

s_1 = 3/2 − ln(ρΛ³),   (5)

s_2 = −(ρ/2) ∫ dr g^(2)_N ln g^(2)_N,   (6)

s_3 = −(ρ²/6) ∫ dr dr′ g^(3)_N ln(g^(3)_N / g^(2)_N g^(2)_N g^(2)_N).   (7)

The subscripts N indicate that the correlation functions are defined in the canonical ensemble with fixed number of atoms N. Equations (5-7) appear superficially similar to a virial-type low density expansion. However, we use correlation functions that are nominally exact, not their low density virial approximations, so in fact the series is an expansion in cumulants of the many-body probability distribution. Truncation of the series is accurate if a higher many-body correlation function can be approximated by the products of fewer-body correlations. For example, the Kirkwood superposition approximation g^(3)(1,2,3) ≈ g^(2)(1,2) g^(2)(2,3) g^(2)(1,3) causes s_3 to vanish.

Mutual information measures how similar a joint probability distribution is to the product of its marginal distributions. In the case of a liquid structure, we may compare the two-body joint probability density ρ^(2)(r_1, r_2) = ρ² g(|r_1 − r_2|) with its single-body marginal, ρ(r). The mutual information

I[ρ^(2)(r_1, r_2)] = (1/N) ∫ dr_1 dr_2 ρ^(2)(r_1, r_2) ln(ρ^(2)(r_1, r_2)/ρ(r_1)ρ(r_2))   (8)

tells us how much information g(r) gives us concerning the positions of atoms at a distance r from another atom. Mutual information is nonnegative definite. We recognize the term s_2 in Eq. (6) as the negative of the mutual information, with the factor of 1/N making it intensive. Thus s_2 reduces the liquid state entropy relative to s_1 by the mutual information content of the radial distribution function g(r).

Pair correlation functions for liquid metals obtained through ab-initio molecular dynamics (AIMD) simulation can predict the configurational entropy through Eqs. (5-7), truncated at the two-body level. We demonstrate this method for liquid aluminum and copper, showing good agreement with experimentally measured absolute entropies over broad ranges of temperature. Corrections to the entropy due to electronic excitations and three-body correlations are discussed. Finally, we apply the method to a liquid aluminum-copper alloy.

Theoretical methods
Entropy expansion
Direct application of the formalism Eqs. (5-7) is inhibited by constraints such as

ρⁿ ∫ ∏_i dr_i g^(n)_N = N!/(N − n)!   (9)

that lead to long-range (large r) contributions to the two- and three-body integrals. Nettleton and Green, and Raveche, recast the distribution function expansion in the grand canonical ensemble and obtained expressions that are better convergent. We follow Baranyai and Evans and utilize the constraint (9) to rewrite the two-body term as

s_2 = 1/2 + (ρ/2) ∫ dr [g^(2) − 1] − (ρ/2) ∫ dr g^(2) ln g^(2).   (10)

The combined integrand {[g^(2)(r) − 1] − g^(2)(r) ln g^(2)(r)} falls off rapidly, so that the sum of the two integrals converges rapidly as the range of integration extends to large r. Furthermore, the combined integral is ensemble invariant, which allows us to substitute the grand canonical ensemble radial distribution function g(r) in place of the canonical g^(2)_N. The same trick applies to the three-body term,

s_3 = 1/6 + (ρ²/6) ∫ dr dr′ [g^(3) − 3g^(2)g^(2) + 3g^(2) − 1] − (ρ²/6) ∫ dr dr′ g^(3) ln(g^(3)/g^(2)g^(2)g^(2)).   (11)

In the grand canonical ensemble, the first two terms in Eq. (10) arise from fluctuations in the number of atoms, N, and can be evaluated in terms of the isothermal compressibility κ_T. We define

ΔS_fluct[g(r)]/k_B ≡ 1/2 + (ρ/2) ∫ dr [g(r) − 1] = (1/2) ρ k_B T κ_T,   (12)

and note that it is positive definite. The remaining term is the entropy reduction due to the two-body correlation. As noted above, the mutual information content of the radial distribution function g(r) reduces the entropy by

ΔS_info[g(r)]/k_B ≡ −(ρ/2) ∫ dr g(r) ln g(r).   (13)

The complete two-body term is now s_2 = ΔS_fluct/k_B + ΔS_info/k_B. The corresponding three-body term in Eq. (11) reduces to a difference of three- and two-body entropies, and its sign is not determined.

Notice that the constant term 5/2 appears in S_ideal (Eq. (3)), while the one-body entropy, s_1 (Eq. (5)), instead contains 3/2. The contribution of 1/2 to s_2 as given by Eq. (10), together with an added 1/6 + ··· = 1/2 from the higher-order terms, makes up the difference. To make connection with the ideal gas, we could add the entire series 1/2 + 1/6 + ··· = 1 to s_1 and write

S_N/Nk_B = S_ideal/k_B + (s_2 − 1/2) + (s_3 − 1/6) + ···   (14)

which is equivalent to Eq. (4).
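The radial integrals in Eqs. (12) and (13) can be sketched numerically. The g(r) below is a toy analytic curve (hard core plus one damped oscillation) with assumed parameters chosen only to mimic a liquid-metal-like shape; it is not simulation output, and it is not constrained to reproduce a realistic compressibility, so its ΔS_fluct is not near zero as it would be for a real liquid metal.

```python
import numpy as np

rho = 0.053    # assumed number density, atoms/Angstrom^3
sigma = 2.5    # assumed hard-core radius, Angstrom
r = np.linspace(1e-3, 12.0, 4000)

# Toy radial distribution function: excluded core, then a damped
# oscillation decaying toward g -> 1 at large r.
t = r - sigma
g = np.where(r < sigma, 0.0,
             1.0 + 1.5 * np.exp(-t / 1.5) * np.cos(2 * np.pi * t / 2.6))

def trapz(y, x):
    """Simple trapezoid rule (avoids the numpy trapz/trapezoid rename)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# g*ln(g) with the 0*ln(0) = 0 convention inside the excluded core.
glng = np.where(g > 0, g * np.log(np.where(g > 0, g, 1.0)), 0.0)

# Eq. (12), radial form: the excluded core contributes negatively.
ds_fluct = 0.5 + 0.5 * rho * trapz(4 * np.pi * r**2 * (g - 1.0), r)
# Eq. (13): the mutual-information term; the peaks dominate and reduce entropy.
ds_info = -0.5 * rho * trapz(4 * np.pi * r**2 * glng, r)

print(f"dS_fluct/kB = {ds_fluct:+.2f}")
print(f"dS_info/kB  = {ds_info:+.2f}")
```

Varying the peak amplitude or decay length shows the behavior discussed below for liquid Al: stronger structure increases the mutual information and drives ΔS_info more negative.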
Ab-initio molecular dynamics simulation
To provide the liquid state correlation functions needed for our study we perform ab-initio molecular dynamics (AIMD) simulations based on energies and forces calculated from first principles electronic density functional theory (DFT). AIMD provides a suitable compromise between accuracy and computational efficiency. It accurately predicts liquid state densities and pair correlation functions with no adjustable parameters or empirical interatomic interactions. We apply the plane-wave code VASP in the PBEsol generalized gradient approximation, utilizing a single k-point in a simulation cell of 200 atoms.

Simulations are performed at fixed volume for each temperature. In the case of Cu we fix the volumes at the experimental values. Because experimental values are not available over the needed temperature range for Al, and are not available at all for AlCu, we determined these volumes by the condition that the average total pressure (including the kinetic term) vanishes. The predicted volumes for Al are insensitive to the energy cutoff of our plane-wave basis set, so we use the default value of 240 eV. Over the temperature range where volumes are available for Al, we reproduce the experimental values to within 0.5%. For AlCu an elevated energy cutoff was required: we found 342 eV, which is 25% above the Cu default of 273 eV, to be sufficient to achieve convergence. Given a suitable volume, the energy cutoff has minimal impact on our simulated correlation functions and predicted entropies.

Pair correlation functions are collected as histograms in 0.01 Å bins and subsequently smeared with a Gaussian of width σ = 0.025 Å. Our run durations for data collection were 50 ps for Al and Cu, and 20 ps for AlCu. All structures were thoroughly equilibrated prior to data collection.

Electronic entropy

The electronic density of states D(E), which comes as a byproduct of first principles calculation, determines the electronic entropy.
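The electronic-entropy integral can be evaluated numerically. As a sketch, the code below uses an assumed free-electron-like density of states D(E) ∝ √E (an illustrative toy, not the simulated liquid-metal DOS) and compares the full Fermi-Dirac entropy integral against its low-temperature Sommerfeld form (π²/3) D(E_F) k_B² T.

```python
import numpy as np

kB_eV = 8.617333262e-5  # Boltzmann constant, eV/K

def trapz(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def electronic_entropy(E, D, mu, T):
    """-integral dE D(E) [f ln f + (1-f) ln(1-f)], in units of kB."""
    arg = np.clip((E - mu) / (kB_eV * T), -500.0, 500.0)  # avoid exp overflow
    f = 1.0 / (np.exp(arg) + 1.0)
    f = np.clip(f, 1e-15, 1.0 - 1e-15)  # endpoints contribute ~0 after clipping
    integrand = -D * (f * np.log(f) + (1.0 - f) * np.log(1.0 - f))
    return trapz(integrand, E)

# Assumed toy DOS: D(E) = C sqrt(E), normalized so D(E_F) = 0.3 states/eV.
EF = 7.0  # illustrative Fermi energy, eV
E = np.linspace(0.0, 2.0 * EF, 200001)
D = 0.3 * np.sqrt(E / EF)

T = 300.0
full = electronic_entropy(E, D, EF, T)            # full Fermi-Dirac integral
sommerfeld = (np.pi**2 / 3.0) * 0.3 * kB_eV * T   # low-T approximation

print(f"full integral      : {full:.5f} kB")
print(f"Sommerfeld estimate: {sommerfeld:.5f} kB")
```

At 300 K the two agree closely; at liquid-metal temperatures of 1000 K and above the smearing window widens and the full integral samples the energy dependence of D(E), which is why the text below insists on the full integral rather than the Sommerfeld approximation.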
At low temperatures, all states below the Fermi energy E_F are filled and all states above are empty. At finite temperature, single electron excitations vacate states below E_F and occupy states above, resulting in the Fermi-Dirac occupation function

f_T(E) = 1/(exp[(E − µ)/k_B T] + 1)   (15)

(µ is the electron chemical potential). Fractional occupation probability creates an electronic contribution to the entropy,

ΔS_elec/k_B = −∫ dE D(E) [f_T(E) ln f_T(E) + (1 − f_T(E)) ln(1 − f_T(E))].   (16)

We apply this equation to representative configurations drawn from our liquid metal simulations, with increased k-point density (a 2 × 2 × 2 grid). The low temperature limit of Eq. (16) is the Sommerfeld expression (π²/3) D(E_F) k_B² T, which depends only on the density of states at the Fermi level. However, at the high temperatures of liquid metals, the electronic entropy requires the full integral as given in Eq. (16), rather than its low temperature approximation.

Results and discussion
Application to pure liquid metals
Figure 1a displays a simulated radial distribution function g(r) for liquid Al at T = 1000 K. Integrated contributions to the entropy are shown in Fig. 1b. The excluded volume region below 2 Å, where g(r) vanishes, does not contribute to ΔS_info, but it does contribute, negatively, to ΔS_fluct. Strong peaks with g(r) > 1 contribute positively to ΔS_fluct. They also contribute positively to mutual information, and hence negatively to ΔS_info, reducing the entropy. Minima with g(r) < 1 contribute negatively to ΔS_fluct. The net ΔS_fluct is close to zero, as is expected for a liquid metal with low compressibility (for liquid Al, values of ρk_B T κ_T ranging up to ∼0.04 are reported). In contrast, the entropy loss due to mutual information is more than 2 k_B.
Figure 1: (a) Radial distribution function g(r) of liquid Al at T = 1000 K. (b) Contributions to the entropy of liquid Al integrated from r = 0 up to R.

Repeating this calculation at several temperatures, and choosing the values of ΔS_fluct and ΔS_info obtained at R = 12 Å, we predict the absolute entropy as a function of temperature as displayed in Fig. 2. Our predictions lie close to experimental values over the entire simulated temperature range; however, there are systematic discrepancies. Our value is too high at low temperatures, and too low at high temperatures. Including a further correction due to electronic entropy (not shown) improves the agreement at high temperature while worsening it at low. As noted in the discussion surrounding Eq. (14), we have arbitrarily included the constants 1/2, 1/6, ..., belonging to the fluctuation terms such as Eq. (12), in the ideal gas entropy. Removing those terms, and instead plotting (s_1 + s_2)k_B + ΔS_elec, we find excellent agreement at low T and slight underestimation at high T.
Figure 2: Entropy of liquid Al, comparing the experimental values with various approximations: the ideal gas (Eq. (3)); ideal gas with pair corrections (Eq. (14)); single-body entropy with pair correction and electronic entropy, (s_1 + s_2)k_B + ΔS_elec.

We find rather similar behavior in the case of liquid copper (see Fig. 3). Here, the agreement with experiment is less close, especially at low temperatures. Presumably we must include many-body corrections such as s_3 or higher that are likely to be stronger at low temperatures. The d-orbitals of copper lie close to the Fermi surface, possibly causing deviations from the Kirkwood superposition approximation that increase the value of s_3. Excitations of d-electrons also contribute significantly to ΔS_elec, causing a faster than linear increase with T.
Figure 3: Entropy of liquid Cu, comparing the experimental values with various approximations: the ideal gas (Eq. (3)); ideal gas with pair corrections (Eq. (14)); single-body entropy with pair correction and electronic entropy, (s_1 + s_2)k_B + ΔS_elec.

Application to binary AlCu liquid alloy
Finally, we turn to a liquid aluminum-copper alloy. As demonstrated by Hernando and applied by Laird and Haymet, Eqs. (12) and (13) generalize naturally to multicomponent systems, with mole fraction x_α for species α and with partial pair distribution functions g_αβ(r) between species α and β:

ΔS_fluct[g_αβ(r)]/k_B = 1/2 + (ρ/2) Σ_αβ x_α x_β ∫ dr [g_αβ(r) − 1],   (17)

and

ΔS_info[g_αβ(r)]/k_B = −(ρ/2) Σ_αβ x_α x_β ∫ dr g_αβ(r) ln g_αβ(r).   (18)

We set s_2 = ΔS_fluct/k_B + ΔS_info/k_B as before. We also need to revise the ideal gas entropy:

S_ideal/k_B = 5/2 − Σ_α x_α ln(ρ x_α Λ_α³).   (19)

Notice that the ideal mixing entropy −k_B Σ_α x_α ln x_α is included in this expression for S_ideal.

Simulated distribution functions and their integrals are displayed in Fig. 4, at T = 1373 K. Note the first peak of the interspecies correlation g_AlCu(r) is much stronger than the intraspecies correlations, indicating strong chemical order. The correlations g_αβ reduce the entropy by more than 2 k_B, with the interspecies Al-Cu dominating because (i) it exhibits the strongest oscillations, and (ii) it enters twice into Eqs. (17) and (18). We can isolate the contribution of the average liquid structure by defining ḡ(r) = Σ_αβ x_α x_β g_αβ(r) and setting

ΔS_ave ≡ ΔS_fluct[ḡ] + ΔS_info[ḡ],   (20)

which converges quickly to its limiting value. Meanwhile, the contribution due to chemical order is obtained by integrating the information content contained in the relative frequencies of αβ pairs at every separation r,

ΔS_chem/k_B ≡ −(ρ/2) Σ_αβ x_α x_β ∫ dr g_αβ(r) ln(g_αβ(r)/ḡ(r)).   (21)

This sum converges quickly to a value of ΔS_chem that roughly counteracts the ideal entropy of mixing, k_B ln 2.

Figure 4: (a) Partial radial distribution functions g_αβ(r) of liquid AlCu alloy at T = 1373 K. (b) Contributions to the entropy of liquid AlCu integrated from r = 0 up to R.
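The decomposition of Eq. (21) can be sketched numerically. The partial g_αβ(r) below are toy analytic curves (assumed shapes, not the simulated AlCu data), with the Al-Cu first peak deliberately strengthened to mimic the chemical order described above; the chemical-order term then comes out negative, opposing the ideal mixing entropy.

```python
import numpy as np

rho = 0.053                   # assumed number density, atoms/Angstrom^3
x = {"Al": 0.5, "Cu": 0.5}    # equiatomic composition
sigma = 2.4                   # assumed hard-core radius, Angstrom
r = np.linspace(1e-3, 12.0, 4000)

def trapz(y, xx):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(xx)))

def toy_g(amplitude):
    """Toy partial pair correlation: hard core plus one damped oscillation."""
    t = r - sigma
    return np.where(r < sigma, 0.0,
                    1.0 + amplitude * np.exp(-t / 1.5) * np.cos(2 * np.pi * t / 2.6))

# Stronger interspecies first peak mimics chemical order.
g = {("Al", "Al"): toy_g(0.8), ("Cu", "Cu"): toy_g(0.8), ("Al", "Cu"): toy_g(1.8)}
g[("Cu", "Al")] = g[("Al", "Cu")]

pairs = [(a, b) for a in x for b in x]
gbar = sum(x[a] * x[b] * g[(a, b)] for a, b in pairs)  # average structure, Eq. (20)

def xlog(y, ref):
    """y * ln(y/ref) with the 0 ln 0 = 0 convention."""
    safe = (y > 0) & (ref > 0)
    return np.where(safe, y * np.log(np.where(safe, y / ref, 1.0)), 0.0)

# Eq. (21): entropy change due to chemical order among alpha-beta pairs.
ds_chem = -0.5 * rho * sum(
    x[a] * x[b] * trapz(4 * np.pi * r**2 * xlog(g[(a, b)], gbar), r)
    for a, b in pairs
)

print(f"dS_chem/kB = {ds_chem:+.3f}  (ideal mixing ln 2 = {np.log(2):.3f})")
```

Because Σ_αβ x_α x_β g_αβ(r) = ḡ(r) at every r, the integrand in Eq. (21) is ḡ times a Kullback-Leibler divergence, so ΔS_chem can never be positive: chemical order always costs entropy relative to the average structure.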
Figure 5: Entropy of liquid AlCu, comparing the experimental values of elemental Al and Cu with various approximations: the ideal gas (Eq. (3)); ideal gas with pair corrections (Eq. (14)); single-body entropy with pair correction and electronic entropy, (s_1 + s_2)k_B + ΔS_elec.

Notice the identity ΔS_ave + ΔS_chem = ΔS_fluct + ΔS_info.

Beyond the entropy losses ΔS_ave and ΔS_chem, there is a small additional loss of electronic entropy associated with the chemical bonding of Al and Cu, which depresses the electronic density of states at the Fermi level. At T = 1373 K we find values of 4.3, 3.2, and 3.1 states/eV/atom for liquid Al, AlCu, and Cu, respectively. This results in a small negative electronic entropy of mixing.

The entropy of the liquid alloy (see Fig. 5) lies rather close to the average entropies of Al and Cu individually. We do not have experimental values to compare with.

Conclusions
This study demonstrates the feasibility of absolute entropy calculation based on ab-initio simulated pair correlation functions. Given the absolute entropy, we could use the ab-initio total energies to calculate absolute free energies. Here, we focus on the reduction of entropy from the ideal gas value by the mutual information content of the pair radial distribution function. In comparison with experimental values for pure elements, we show good agreement in the case of Al and slightly worse agreement in the case of Cu. We also applied the method to the case of a liquid AlCu alloy, and found that strong chemical order counteracts the ideal mixing entropy.

Two implementations of the distribution function expansion were compared, both of them truncated at the pair level. Equation (14) adds the series 1/2 + 1/6 + ··· = 1 to the single particle entropy s_1 to reach S_ideal, but then must subtract 1/2 from ΔS_fluct, while Eq. (4) keeps the 1/2 in ΔS_fluct. In the case of liquid Al, the latter approach yields improved agreement, as shown in Fig. 2. However, in the case of liquid Cu, the former approach is favorable at low T, while the latter is best at high T. This temperature dependence is possibly due to angular correlations created by anisotropic Cu d-orbitals leading to a breakdown of the Kirkwood superposition approximation at low T.

Keeping the 1/2 in ΔS_fluct results in this term nearly vanishing (it is positive definite but numerically small). The fluctuation term contained in s_3 would likewise be small. Thus, in the approach of Eq. (4), ΔS_fluct serves to improve convergence of the sum of integrals in Eqs. (12) and (13), but ultimately the entropy is primarily determined by the mutual information.

This method has been previously applied to model systems such as hard sphere and Lennard-Jones fluids, and to the one component plasma, as well as to simulations of real fluids using embedded atom potentials. It can also be applied with experimentally determined correlation functions.
An analogous expansion exists for the lattice gas. Dzugutov utilized the method in a study reporting an empirical scaling relation between excess entropy and diffusion coefficients. However, it has only rarely been applied in conjunction with ab-initio molecular dynamics. It is clear from the example of liquid Cu, as well as from the work of others, that many-body terms are required to achieve high accuracy in some cases. Fortunately these are available, in principle, from AIMD.

Beyond assessing the impact of many-body terms, certain other details remain to be optimized in our calculations. We apply the PBEsol generalized gradient approximation for the exchange correlation functional because it predicts good atomic volumes (tested for solids), but we have not tested the sensitivity of our results to other choices of functional. We consistently use systems of 200 atoms, but we have not tested the convergence of the entropy with respect to the number of atoms. After further testing and optimization, our methods could be used to develop a database of calculated liquid state entropies, both for pure metals and for alloys of interest.

Application to fluids in external fields and at interfaces is possible by generalizing the series in Eqs. (5-7) to allow for spatially varying local density ρ(r). In this case we must set

s_1 = 3/2 − (1/N) ∫ dr ρ(r) ln(ρ(r)Λ³).   (22)

Similarly, the full, spatially varying and anisotropic, two-body density ρ^(2)(r_1, r_2) (see Eq. (8)) is required in place of the translation-invariant radial distribution function g(r).

Acknowledgement
MCG was supported by NETL's Research and Innovation Center's Innovative Process Technologies (IPT) Field Work Proposal and under the RES contract DE-FE0004000. MW was supported by the Department of Energy under grant DE-SC0014506. We thank R. B. Griffiths, V. Siddhu, M. Deserno and C. J. Langmead for discussions on mutual information, and R. H. Swendsen and B. Widom for discussions on liquid and gas state statistical mechanics.†

†Disclaimer: This project was funded by the Department of Energy, National Energy Technology Laboratory, an agency of the United States Government, through a support contract with AECOM. Neither the United States Government nor any agency thereof, nor any of their employees, nor AECOM, nor any of their employees, makes any warranty, expressed or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof.

References

(1) Shannon, C. E. A Mathematical Theory of Communication. Bell System Technical Journal 1948, 27, 379–423.
(2) Jaynes, E. T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106, 620–630.
(3) Petz, D. In John von Neumann and the Foundations of Quantum Physics; Rédei, M., Stöltzner, M., Eds.; Springer Netherlands: Dordrecht, 2001; pp 83–96.
(4) von Neumann, J. Thermodynamik quantenmechanischer Gesamtheiten. Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen, Mathematisch-Physikalische Klasse 1927, 273–291.
(5) Green, H. S. The Molecular Theory of Fluids; North-Holland: Amsterdam, 1952.
(6) McQuarrie, D. A. Statistical Mechanics; Harper & Row: New York, 1973; Chapters 5-3, 10-7, and 13-2.
(7) Kirkwood, J. G. Quantum Statistics of Almost Classical Assemblies. Phys. Rev. 1933, 44, 31–37.
(8) Kirkwood, J. G. Quantum Statistics of Almost Classical Assemblies. Phys. Rev. 1934, 45, 116–117.
(9) Griffiths, R. B. Microcanonical Ensemble in Quantum Statistical Mechanics. J. Math. Phys. 1965, 6, 1447–1461.
(10) Widom, B. Statistical Mechanics: A Concise Introduction for Chemists; Cambridge, 2002; Chapter 5.
(11) Yvon, J. Correlations and Entropy in Classical Statistical Mechanics; Pergamon: Oxford, 1969.
(12) Baranyai, A.; Evans, D. J. Direct Entropy Calculation from Computer Simulation of Liquids. Phys. Rev. A 1989, 40, 3817–3822.
(13) Cover, T. M.; Thomas, J. A. Elements of Information Theory; Wiley: New Jersey, 2006.
(14) Rowlinson, J. S.; Widom, B. Molecular Theory of Capillarity; Oxford, 1982.
(15) Nettleton, R. E.; Green, M. S. Expression in Terms of Molecular Distribution Functions for the Entropy Density in an Infinite System. J. Chem. Phys. 1958, 29, 1365–1370.
(16) Raveche, H. J. Entropy and Molecular Correlation Functions in Open Systems. I. Derivation. J. Chem. Phys. 1971, 55, 2242–2250.
(17) Raveche, H. J.; Mountain, R. D. Entropy and Molecular Correlation Functions in Open Systems. II. Two- and Three-Body Correlations. J. Chem. Phys. 1971, 55, 2250–2255.
(18) Kresse, G.; Joubert, D. From Ultrasoft Pseudopotentials to the Projector Augmented-Wave Method. Phys. Rev. B 1999, 59, 1758–1775.
(19) Perdew, J. P.; Ruzsinszky, A.; Csonka, G. I.; Vydrov, O. A.; Scuseria, G. E.; Constantin, L. A.; Zhou, X.; Burke, K. Restoring the Density-Gradient Expansion for Exchange in Solids and Surfaces. Phys. Rev. Lett. 2008, 100, 136406.
(20) Haynes, W. M., Ed. CRC Handbook of Chemistry and Physics, 92nd ed.; CRC Press, 2011.
(21) Jakse, N.; Pasturel, A. Liquid Aluminum: Atomic Diffusion and Viscosity from Ab Initio Molecular Dynamics. Sci. Rep. 2013, 3, 3135.
(22) Hultgren, R. R. Selected Values of the Thermodynamic Properties of the Elements; American Society for Metals: Ohio, 1973.
(23) Chase, M.; Davies, C.; Downey, J.; Frurip, D.; McDonald, R.; Syverud, A. NIST-JANAF Thermochemical Tables, http://kinetics.nist.gov/janaf/, 1985.
(24) Hernando, J. A. Thermodynamic Potentials and Distribution Functions. Mol. Phys. 1990, 69, 327–336.
(25) Laird, B. B.; Haymet, A. D. J. Calculation of the Entropy of Binary Hard Sphere Mixtures from Pair Correlation Functions. J. Chem. Phys. 1992, 97, 2153–2155.
(26) Widom, M. Entropy and Diffuse Scattering: Comparison of NbTiVZr and CrMoNbV. Met. Mat. Trans. A 2016, 47, 3306–3311.
(27) Laird, B. B.; Haymet, A. D. J. Calculation of the Entropy from Multiparticle Correlation Functions. Phys. Rev. A 1992, 45, 5680–5689.
(28) Hoyt, J. J.; Asta, M.; Sadigh, B. Test of the Universal Scaling Law for the Diffusion Coefficient in Liquid Metals. Phys. Rev. Lett. 2000, 85, 594–597.
(29) Wilson, S. R.; Mendelev, M. I. Dependence of Solid-Liquid Interface Free Energy on Liquid Structure. Model. Simul. Mater. Sci. Eng. 2016, 24, 065004.
(30) Wallace, D. C. On the Role of Density Fluctuations in the Entropy of a Fluid. J. Chem. Phys. 1987, 87, 2282–2284.
(31) Yokoyama, I.; Tsuchiya, S. Excess Entropy, Diffusion Coefficient, Viscosity Coefficient and Surface Tension of Liquid Simple Metals from Diffraction Data. Mater. Trans. 2002, 43, 67–72.
(32) Gangopadhyay, A. K.; Kelton, K. F. Recent Progress in Understanding High Temperature Dynamical Properties and Fragility in Metallic Liquids, and their Connection with Atomic Structure. J. Mater. Res. 2017, 32, 2638–2657.
(33) Prestipino, S.; Giaquinta, P. V. Statistical Entropy of a Lattice-Gas Model: Multiparticle Correlation Expansion. J. Stat. Phys. 1999, 96, 135–167.
(34) Dzugutov, M. A Universal Scaling Law for Atomic Diffusion in Condensed Matter. Nature 1996, 381, 137–139.
(35) Baranyai, A.; Evans, D. J. Three-Particle Contribution to the Configurational Entropy of Simple Fluids. Phys. Rev. A 1990, 42, 849–857.
(36) Csonka, G. I.; Perdew, J. P.; Ruzsinszky, A.; Philipsen, P. H. T.; Lebègue, S.; Paier, J.; Vydrov, O. A.; Ángyán, J. G. Assessing the Performance of Recent Density Functionals for Bulk Solids. Phys. Rev. B 2009, 79, 155107.
(37) Ewing, R. H. The Free Energy of the Crystal-Melt Interface from the Radial Distribution Function. Phil. Mag. 1972, 25, 779–784.