An overview of generalized entropic forms
V. Ilić
Mathematical Institute of the Serbian Academy of Sciences and Arts, Kneza Mihaila 36, 11000 Beograd, Serbia

J. Korbel
Section for Science of Complex Systems, CeMSIIS, Medical University of Vienna, Vienna, Austria; Complexity Science Hub Vienna, Josefstädterstrasse 39, 1080 Vienna, Austria; Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague, Břehová 7, 115 19 Prague, Czech Republic

S. Gupta
Department of Physics, Ramakrishna Mission Vivekananda Educational and Research Institute, Belur Math, India

A.M. Scarfone
Istituto dei Sistemi Complessi (ISC-CNR) c/o Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino, Italy. E-mail: [email protected]
Abstract.
The aim of this focus letter is to present a comprehensive classification of the main entropic forms introduced in the last fifty years in the framework of statistical physics and information theory. Most of them can be grouped into three families, each characterized by two deformation parameters, introduced respectively by Sharma, Taneja, and Mittal (entropies of degree (α, β)), by Sharma and Mittal (entropies of order (α, β)), and by Hanel and Thurner (entropies of class (c, d)). Many of the entropic forms examined are characterized systematically by means of important concepts such as their axiomatic foundations à la Shannon-Khinchin and the consequent composability rule for statistically independent systems. Other critical aspects, related to the Lesche stability of information measures and to their consistency with the Shore-Johnson axioms, are briefly discussed on general grounds.

PACS numbers: 05.20.-y, 89.70.+c, 05.90.+m
1. Historical introduction
The history of entropy begins around the nineteenth century in the then-emerging theory of thermodynamics, following the studies of Carnot aimed at optimizing the efficiency of the conversion of heat into mechanical work. The concept was formalized by Clausius [1], who introduced the word entropy, whose meaning derives from "transformation produced from within": a physical quantity whose variation, for any reversible thermodynamic transformation, is defined by

$$\Delta S = \int \frac{\delta Q_{\rm rev}}{T}.$$

The function S, implicitly introduced in this way, is named thermodynamic entropy, and the validity of this relation has never been questioned.

In the same period, Boltzmann began the development of the kinetic theory of gases by introducing the idea of monads in a modern key, highlighting in this way the necessity of employing statistical methods in physics. Boltzmann's studies on the approach to equilibrium of a system led to the introduction of the so-called H-functional

$$H[f] = -\int f(\mathbf{v}, t)\, \ln\big(f(\mathbf{v}, t)\big)\, d\mathbf{v},$$

for a single-particle probability distribution function $f(\mathbf{v}, t)$ which, together with the relation $dH/dt \geq 0$, states the celebrated H-theorem [2].

Boltzmann's results were then generalized by Gibbs [3] to the case of a canonical ensemble, i.e., a collection of W microstates with a discrete set of energy levels $E_i$ belonging to the same macrostate $E = \sum_i E_i\, p_i$, where $p_i$ is the probability that the i-th microstate occurs during the system's fluctuations. In this context, he introduced the well-known expression

$$S[p] = -k_B \sum_{i=1}^W p_i \ln(p_i), \qquad (1)$$

with $k_B$ the Boltzmann constant, today known as the Boltzmann-Gibbs entropy, recognized in statistical mechanics as a measure of the microscopic disorder or randomness of a system with a large number of constituents. The legitimacy of this last statement finds validity in the expression

$$S = k_B \ln W,$$

written explicitly in this form by Planck [4] during his studies on black-body radiation. Nowadays, this last relation is known as the Boltzmann-Planck formula of entropy and it is a pillar of thermostatistics.

Roughly half a century after the Boltzmann-Gibbs developments, entropy was further conceptualized by Shannon [5] who, to quantify the information carried by a message, introduced the functional

$$H[p] = -K \sum_{i=1}^W p_i \ln(p_i), \qquad (2)$$

where the constant K is fixed by the choice of the unit of measure and $p_i$ is the probability of occurrence of the i-th symbol of a W-symbol alphabet. The functional (2) is named information entropy in analogy with expression (1) and is recognized in information theory as a quantity measuring the uncertainty contained in an encoded message. Shannon entropy has been systematically characterized by Shannon himself and successively by Khinchin [6] through the introduction of four basic requirements, nowadays known as the Shannon-Khinchin axioms, which fix univocally the expression of the information functional.

About ten years after the appearance of the Shannon entropy, by replacing the standard linear average with the nonlinear (or quasi-linear) average introduced by Kolmogorov and Nagumo [7, 8], Rényi [9] proposed the α-order entropic form

$$S_\alpha[p] = \frac{1}{1-\alpha} \ln \sum_{i=1}^W p_i^\alpha, \qquad (3)$$

a generalization of (2) which is recovered in the α → 1 limit. In parallel, the entropic form

$$S_\beta[p] = \frac{1}{1-\beta} \left(\sum_{i=1}^W p_i^\beta - 1\right), \qquad (4)$$

for a real deformation parameter β > 0, was introduced in the late sixties of the twentieth century [10, 11].
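Since (2), (3), and (4) will serve as reference points for everything that follows, a minimal numerical sketch may help fix ideas. The implementation below is ours, not from the literature reviewed here; the function names and the test distribution are chosen purely for illustration, and we set $k_B = K = 1$.

```python
# A minimal sketch (ours) of the Shannon entropy (2), the Renyi entropy (3)
# and the entropy (4) for a discrete distribution, with k_B = K = 1.
import numpy as np

def shannon(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # null events do not contribute
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):        # alpha -> 1 recovers Shannon
        return shannon(p)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def s_beta(p, beta):
    p = np.asarray(p, dtype=float)
    if np.isclose(beta, 1.0):         # beta -> 1 recovers Shannon
        return shannon(p)
    return (np.sum(p ** beta) - 1.0) / (1.0 - beta)

p = np.array([0.5, 0.25, 0.125, 0.125])
print(shannon(p))                                # 1.2130...
print(renyi(p, 1.0001), s_beta(p, 1.0001))       # both close to 1.2130
```

Both generalized forms return the Shannon value as their deformation parameter approaches 1, in agreement with the limits quoted above.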
Nearly twenty years later, this entropic form was employed in statistical physics [12] to obtain an alternative formulation of classical statistical mechanics, giving rise to a new borderline research field in statistical physics named nonextensive statistical mechanics. One of the main reasons to replace the Shannon-Boltzmann-Gibbs entropy (in the following, Shannon entropy) with its generalized version, although not fully accepted by the statistical physics community, is the loss of ergodicity observed in complex systems, often not at the thermodynamic limit, governed by strong interactions and correlations, which show statistical properties that are hardly captured by the orthodox theory of statistical mechanics.

Today, entropy is undoubtedly one of the most general and important concepts in statistical physics and information theory, and it appears with different meanings in different fields. Many of the generalized expressions have found interesting applications in coding theory, cryptography, statistical inference theory, non-ergodic systems, fractal dynamics, stochastic thermodynamics, complex systems, and others. As for its importance, entropy is the protagonist of the second law of thermodynamics, associated with the arrow of time.
2. A plethora of generalized entropic forms
As is known, the Shannon entropy is characterized by a set of four axioms that univocally define its form [5, 6]. The Shannon-Khinchin (SK) axioms read as:

• A1: Entropy must be an analytical, continuous function depending only on the probabilities $[p] = (p_1, p_2, \ldots, p_W)$.
• A2: Entropy must be maximal for the uniform distribution $[p] = (1/W, 1/W, \ldots, 1/W)$.
• A3: Entropy must be invariant under the inclusion of null events with zero probability.
• A4: Entropy must be strongly additive under the composition of subsystems, that is $S(A \cap B) = S(A) + S(B/A)$.

Among these, the last one is the most relevant in fixing the form of the entropy. In fact, let $p_{ij}$ be the joint probability distribution of the composed system $A \cap B$, $p_i = \sum_j p_{ij}$ the marginal probability distribution of system A, and $p_j = \sum_i p_{ij}$ that of system B; then, for a trace-form entropy $S = \sum_i s(p_i)$, axiom (A4) becomes

$$S(A \cap B) = S(A) + \sum_{i=1}^W p_i\, S(B/A_i), \qquad (5)$$

where $S(A \cap B) = \sum_{ij} s(p_{ij})$ and $S(B/A_i) = \sum_j s(p_{ij}/p_i)$. Together with the other axioms, (5) has the unique solution $s(x) = x \ln(1/x)$, modulo a multiplicative constant. In this way, Shannon entropy (2) [or (1)] is obtained. In particular, for statistically independent (SI) systems, axiom (A4) simplifies to

$$S(A \cap B) = S(A) + S(B), \qquad (6)$$

which states the additivity property of the Boltzmann-Gibbs entropy (and, in a certain sense, its extensivity).

In the presence of correlations, (5) can be relaxed in a way that allows the introduction of other possible expressions for the entropic functional. In [13] the following relation has been advanced in place of (5):

$$S(A \cap B) = S(A) + \sum_{i=1}^W p_i^\beta\, S(B/A_i), \qquad (7)$$

which, together with the other axioms, has the unique solution $s(x) = x \ln_\beta(1/x)$, where

$$\ln_\beta(x) = \frac{x^{1-\beta} - 1}{1-\beta} \qquad (8)$$

is a generalized version of the logarithm depending on the deformation parameter β, such that it reduces to the standard logarithm in the β → 1 limit: $\ln_1(x) \equiv \ln(x)$. The corresponding entropic form coincides with (4). For SI systems, (7) becomes

$$S(A \cap B) = S(A) + S(B) + (1-\beta)\, S(A)\, S(B), \qquad (9)$$

stating, in this case, the nonadditivity of the entropy (4); this has been one of the main reasons for calling nonextensive statistical mechanics the physical theory based on the entropic form (4).

A step further in generalizing axiom (A4) has been proposed in [14] and reads

$$S(A \cap B) = \sum_{i=1}^W s(p_i) \sum_{j=1}^{W_i} \left(\frac{p_{ij}}{p_i}\right)^{\beta} + \sum_{i=1}^W p_i^\alpha\, S(B/A_i), \qquad (10)$$

which, together with the other axioms, has the unique solution $s(x) = x \ln_{\alpha,\beta}(1/x)$, where

$$\ln_{\alpha,\beta}(x) = \frac{x^{1-\beta} - x^{1-\alpha}}{\alpha - \beta}$$

is another generalized version of the logarithm, by means of two deformation parameters α and β. It reduces to the deformed logarithm (8) in the α → 1 limit, $\ln_{1,\beta}(x) \equiv \ln_\beta(x)$, and to the standard logarithm in the (α, β) → (1, 1) limit: $\ln_{1,1}(x) \equiv \ln(x)$.
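The composability rule (9) is easy to verify numerically. The following sketch (ours, under the stated definitions of the deformed logarithm (8) and the trace form $s(x) = x \ln_\beta(1/x)$) builds the joint distribution of two statistically independent systems and checks (9); the distributions and names are illustrative.

```python
# A small check (ours) of the deformed logarithm (8) and of the composition
# rule (9) for the entropy (4) on statistically independent systems.
import numpy as np

def ln_beta(x, beta):
    return (x ** (1.0 - beta) - 1.0) / (1.0 - beta)

def S_beta(p, beta):
    p = np.asarray(p, dtype=float)
    return np.sum(p * ln_beta(1.0 / p, beta))     # trace form s(x) = x ln_beta(1/x)

beta = 0.7
pA = np.array([0.6, 0.4])
pB = np.array([0.2, 0.3, 0.5])
pAB = np.outer(pA, pB).ravel()                    # joint distribution of SI systems

lhs = S_beta(pAB, beta)
rhs = (S_beta(pA, beta) + S_beta(pB, beta)
       + (1.0 - beta) * S_beta(pA, beta) * S_beta(pB, beta))
print(np.isclose(lhs, rhs))                       # True
```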
The resulting entropic form

$$S_{\alpha,\beta}[p] = \sum_{i=1}^W \frac{p_i^\beta - p_i^\alpha}{\alpha - \beta} \qquad (11)$$

has been introduced independently by Sharma and Taneja [15], and by Mittal [16], as the unique solution of the relation

$$S(A \cap B) = S(A) \sum_{j=1}^{W'} p_j^\beta + \sum_{i=1}^W p_i^\alpha\, S(B),$$

which follows from (10) for SI systems and dictates, in this case, the composition law of the entropy (11). The Sharma-Taneja-Mittal entropy, also named entropy of degree (α, β), captures some interesting one-parameter entropic forms obtained by suitably fixing the parameters α and β, as reported in Table 1.

In general, trace-form entropies like the ones introduced above can be viewed as the linear average of an appropriate Hartley function I(x), representing the elementary information gained, according to

$$S[p] = E_{\rm lin}(I[p]), \qquad (12)$$

with $E_{\rm lin}(x) = \sum_i x_i\, p_i$. In particular, the family of Sharma-Taneja-Mittal entropies follows from $I(x) = \ln_{\alpha,\beta}(1/x)$.
Table 1. Entropic forms of degree (α, β)

  Parameters             Entropy                                                        Ref.
  α = 1, β = 1           $-\sum_i p_i \ln(p_i)$                                         [5]
  α = 1                  $\frac{\sum_i p_i^\beta - 1}{1-\beta}$                         [10, 11, 12]
  α = 1+κ, β = 1−κ       $-\sum_i \frac{p_i^{1+\kappa} - p_i^{1-\kappa}}{2\kappa}$      [17]
  α = β                  $-\sum_i p_i^\alpha \ln(p_i)$                                  [18]
  α = q, β = 1/q         $-\sum_i \frac{p_i^q - p_i^{1/q}}{q - 1/q}$                    [19]

A different approach to deriving generalized entropies can be followed after Rényi. In the seminal work [9], searching for the most general expression of a functional that satisfies axioms (A1)-(A3) and the softer composability condition given by (6), he replaced (12) with the quasi-linear average

$$S[p] = E_{\rm KN}(I[p]), \qquad (13)$$

introduced by Kolmogorov and Nagumo, where $E_{\rm KN}(x) = f^{-1}\big(E_{\rm lin}(f(x))\big)$ for an arbitrary strictly monotonic and continuous function f(x). Rényi entropy follows for $f(x) = \ln_\alpha(e^x)$ with $I(x) = \ln(1/x)$, which gives the entropy of order α given in (3). Actually, Rényi entropy can be derived from the SK axioms by posing in (A4)

$$S(B/A) = f^{-1}\left(\frac{\sum_{i=1}^W p_i^\alpha\, f(S(B/A_i))}{\sum_{i=1}^W p_i^\alpha}\right),$$

as shown in [20], and (6) is a direct consequence for SI systems.

By following the same road, we can introduce more general expressions for a different choice of f(x) and/or I(x), and by replacing the linear composability condition (6) with the nonlinear one given in (9). A possibility follows by posing $f(x) = \ln_\alpha(\exp_\beta(x))$ and $I(x) = \ln_\beta(1/x)$, where

$$\exp_\beta(x) = [1 + (1-\beta)x]^{\frac{1}{1-\beta}}$$

is the inverse function of $\ln_\beta(x)$. In this way we obtain the two-parameter entropy

$$S_{\alpha,\beta}[p] = \frac{1}{1-\beta}\left[\left(\sum_{i=1}^W p_i^\alpha\right)^{\frac{1-\beta}{1-\alpha}} - 1\right], \qquad (14)$$

introduced by Sharma and Mittal [21], also named entropy of order (α, β). Quite interestingly, (14) follows from the SK axioms by replacing (A4) with the relation

$$S(A \cap B) = S(A) + \left(\sum_{i=1}^W p_i^\alpha\right)^{\frac{1-\beta}{1-\alpha}} S(B/A),$$

where the conditional entropy S(B/A) is now defined by

$$S(B/A) = f^{-1}\left(\frac{\sum_{i=1}^W p_i^\alpha\, f(S(B/A_i))}{\sum_{i=1}^W p_i^\alpha}\right),$$

so that (9) is recovered in the case of SI systems. Again, several entropic forms introduced in the literature in different contexts belong to the Sharma-Mittal family, as shown in Table 2 below.
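Before listing them, the Kolmogorov-Nagumo construction (13) can itself be checked numerically. The sketch below (ours) implements $E_{\rm KN}$ for the two choices of f and I quoted above and compares the results with the closed forms (3) and (14); `ln_d` and `exp_d` denote the deformed logarithm (8) and its inverse, and all names are illustrative.

```python
# A sketch (ours) of the Kolmogorov-Nagumo average (13): f(x) = ln_alpha(e^x)
# with I(p) = ln(1/p) reproduces the Renyi entropy (3), while
# f(x) = ln_alpha(exp_beta(x)) with I(p) = ln_beta(1/p) reproduces (14).
import numpy as np

def ln_d(x, a):      # deformed logarithm (8)
    return np.log(x) if np.isclose(a, 1) else (x**(1 - a) - 1) / (1 - a)

def exp_d(x, a):     # its inverse
    return np.exp(x) if np.isclose(a, 1) else (1 + (1 - a) * x)**(1 / (1 - a))

def kn_entropy(p, f, finv, I):
    p = np.asarray(p, dtype=float)
    return finv(np.sum(p * f(I(p))))          # E_KN(I) = f^{-1}( E_lin(f(I)) )

p = np.array([0.5, 0.3, 0.2])
alpha, beta = 2.0, 0.5

renyi = kn_entropy(p, lambda x: ln_d(np.exp(x), alpha),
                      lambda y: np.log(exp_d(y, alpha)),
                      lambda q: np.log(1 / q))
print(np.isclose(renyi, np.log(np.sum(p**alpha)) / (1 - alpha)))   # True

sm = kn_entropy(p, lambda x: ln_d(exp_d(x, beta), alpha),
                   lambda y: ln_d(exp_d(y, alpha), beta),
                   lambda q: ln_d(1 / q, beta))
closed = ((np.sum(p**alpha))**((1 - beta) / (1 - alpha)) - 1) / (1 - beta)
print(np.isclose(sm, closed))                                      # True
```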
Table 2. Entropic forms of order (α, β)

  Parameters             Entropy                                                                      Ref.
  α = 1, β = 1           $-\sum_i p_i \ln(p_i)$                                                       [5]
  β = 1                  $\frac{1}{1-\alpha} \ln \sum_i p_i^\alpha$                                   [9]
  α = β                  $\frac{\sum_i p_i^\beta - 1}{1-\beta}$                                       [10, 11, 12]
  α = 1                  $\frac{1}{1-\beta}\left[e^{(\beta-1)\sum_i p_i \ln(p_i)} - 1\right]$         [21]
  α = r−m+1, β = 1       $\frac{1}{m-r} \ln\left(\sum_i p_i^{\,r-m+1}\right)$                         [22]
  α = 1/t, β = 2−t       $\frac{1}{t-1}\left[\left(\sum_i p_i^{1/t}\right)^{t} - 1\right]$            [23]
  α = 1/β                $\frac{1}{1-\beta}\left[\left(\sum_i p_i^{1/\beta}\right)^{-\beta} - 1\right]$   [24]
  α = 2−β                $\frac{1}{1-\alpha}\left(1 - \frac{1}{\sum_i p_i^\alpha}\right)$             [25]

By using the so-called escort average instead of the other average prescriptions, we can obtain new families of entropic forms defined as certain averages of a given information function. They can formally be written as

$$S[p] = E^\varphi(I[p]), \qquad (15)$$

or also

$$S[p] = E^\varphi_{\rm KN}(I[p]), \qquad (16)$$

where

$$E^\varphi(x) = \frac{\sum_{i=1}^W x_i\, \varphi(p_i)}{\sum_{i=1}^W \varphi(p_i)}$$
Entropic forms as averages of information
Table 3. Entropic forms as averages of information

  Parameters                  Entropy                                                                                   Ref.
  α = 2−1/β, q = 1/β          $\frac{1}{1-\beta}\left[\left(\sum_i p_i^{1/\beta}\right)^{-\beta} - 1\right]$            [24]
  α = 2−q, β = 2−q            $\frac{1}{1-q}\left(1 - \frac{1}{\sum_i p_i^q}\right)$                                    [25]
  α = 1, β = 1                $-\frac{\sum_i p_i^q \ln(p_i)}{\sum_i p_i^q}$                                             [26]
  α = r−q+1, β = 1            $\frac{1}{q-r} \ln\left(\frac{\sum_i p_i^{r}}{\sum_i p_i^{q}}\right)$                      [26]
  α = 1, β = q                $\frac{1}{1-q}\left[e^{(q-1)\sum_i p_i^q \ln(p_i)/\sum_i p_i^q} - 1\right]$                [27]
  β = 1, q = s_i              $\frac{1}{1-\alpha} \ln\left(\frac{\sum_i p_i^{\,\alpha+s_i-1}}{\sum_i p_i^{\,s_i}}\right)$   [28]

In addition, if $I(x) = h(-\ln(x))$ and $f(h(x)) = \exp_\alpha(x)$, for an increasing, continuous function h(x) such that h(0) = 0, (16) reduces to the class of strongly pseudo-additive entropies $h(S_\alpha[p])$, introduced from generalized Shannon-Khinchin axioms in [29] and considered later in [30] under the name of Z-entropies. Furthermore, if entropy and information content in (16) decompose according to the same pseudo-additivity rule,

$$S(A \cap B) = h\big(h^{-1}(S(A)) + h^{-1}(S(B))\big), \qquad (17)$$
$$I(x\, y) = h\big(h^{-1}(I(x)) + h^{-1}(I(y))\big),$$

then we obtain the class of weakly pseudo-additive entropies introduced in [31], which contains a number of the previously listed entropic forms (see Table I in [31]).

It is worth citing a more general approach proposed in [32], where it is suggested to replace axiom (A4) with the sole composability rule $S(A \cap B) = \Phi(S(A), S(B))$ for a given function Φ(x, y) that is symmetric, Φ(x, y) = Φ(y, x), associative, Φ(x, Φ(y, z)) = Φ(Φ(x, y), z), and admits a null element, Φ(x, 0) = Φ(0, x) = x.
In this way, for suitably chosen functions Φ(x, y), a wide class of generalized entropic forms can be obtained. Clearly, relation (17) implies the existence of an underlying algebraic structure that, under certain assumptions, can be derived starting from the expression of the entropy itself [33].

In [34], by completely relaxing axiom (A4) and following scaling arguments for the asymptotic behavior of the entropy, summarized by

$$\frac{S(\lambda W)}{S(W)} \sim \lambda^{1-c} \qquad {\rm and} \qquad W^{a(c-1)}\, \frac{S(W^{1+a})}{S(W)} \sim (1+a)^d,$$

for W → ∞, a new family of two-parameter entropies has been proposed:

$$S_{c,d}[p] = \frac{e \sum_{i=1}^W \Gamma\big(1+d,\, 1 - c \ln(p_i)\big) - c}{1 - c + c\, d}. \qquad (18)$$

The pair of numbers (c, d), which characterizes the asymptotic scaling behavior of the entropy, univocally defines an equivalence class of entropies in the thermodynamic limit. Once more, several generalized entropies, obtained independently in other contexts of statistical physics, have an asymptotic scaling that can be found within the $S_{c,d}$ family for particular values of the scaling parameters, as reported in Table 4 below.
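A short numerical sketch (ours, assuming the reconstruction of (18) given above) shows how the special cases are recovered; note that with this normalization the familiar forms appear up to an additive constant, which is immaterial for both the maximization and the asymptotic classification. The incomplete gamma function is obtained from `scipy.special`; the remaining names are illustrative.

```python
# A numerical sketch (ours) of the Hanel-Thurner entropy (18).
import numpy as np
from scipy.special import gammaincc, gamma

def upper_gamma(a, x):
    return gammaincc(a, x) * gamma(a)     # unregularized Gamma(a, x), a > 0

def S_cd(p, c, d):
    p = np.asarray(p, dtype=float)
    num = np.e * np.sum(upper_gamma(1 + d, 1 - c * np.log(p))) - c
    return num / (1 - c + c * d)

p = np.array([0.5, 0.3, 0.2])
shannon = -np.sum(p * np.log(p))
tsallis = lambda q: (np.sum(p**q) - 1) / (1 - q)
print(np.isclose(S_cd(p, 1.0, 1.0), 1 + shannon))       # (c,d) = (1,1): Shannon
print(np.isclose(S_cd(p, 0.6, 0.0), 1 + tsallis(0.6)))  # (c,d) = (q,0): entropy (4)
```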
Table 4. Entropic forms of class (c, d)

  Parameters           Entropy                                                                                                   Ref.
  c = 1, d = 1         $-\sum_i p_i \ln(p_i)$                                                                                    [5]
  c = β, d = 0         $\frac{\sum_i p_i^\beta - 1}{1-\beta}$                                                                    [10, 11, 12]
  c = 1−κ, d = 0       $-\sum_i \frac{p_i^{1+\kappa} - p_i^{1-\kappa}}{2\kappa}$                                                 [17]
  c = α, d = 1         $-\sum_i p_i^\alpha \ln(p_i)$                                                                             [18]
  c = 1, d = β         $\sum_i p_i \left(-\ln(p_i)\right)^\beta$                                                                 [35]
  c = α, d = β         $\sum_i p_i^\alpha \left(-\ln(p_i)\right)^\beta$                                                          [36]
  c = 1, d = 1         $\pm\sum_i \left(1 - p_i^{\pm p_i}\right)$                                                                [37]
  c = 1, d = 0         $\sum_i p_i \left(1 - e^{1 - 1/p_i}\right)$                                                               [38]
  c = 1, d = 0         $\sum_i \left(1 - e^{-b\, p_i}\right) + e^{-b} - 1$                                                       [39]
  c = r, d = 0         $\frac{1}{r-1}\left[1 - \sum_i e^{\,r\, W(p_i^{p_i} - 1)}\right]$                                          [40]
  c = 1, d = 1/η       $\sum_i \left[\Gamma\left(\frac{\eta+1}{\eta}, -\ln(p_i)\right) - p_i\, \Gamma\left(\frac{\eta+1}{\eta}\right)\right]$   [41]

Entropic forms of class (c, d), as well as those of degree (α, β), take into account a sub-exponential asymptotic behavior of the system, where the number of possible configurations W grows according to a certain power law of the system size N. However, complex systems may also be characterized by a super-exponential asymptotic trend. In this case, a statistical description based on the entropic forms (11) or (18) fails to make a correct prediction. To overcome this limitation, a generalization of (18) has been advanced in [42, 43]:

$$S[p] = \sum_{i=1}^W \int_0^{p_i} \ln_{c,d}\big(\mu_l(x)\big)\, dx,$$

where

$$\ln_{c,d}(x) = r\left[x^{c} \left(1 - \frac{1-c}{r\, d} \ln(x)\right)^{d} - 1\right]$$

and $\mu_l(x)$ is the nested logarithm defined by $\mu_l(x) = [1 + \ln]^{(l)}(x)$. The l = 0 case reproduces the entropic forms of class (c, d).
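Coming back to the two scaling laws that define the pair (c, d), they can be read off numerically. The sketch below (ours, assuming the scaling relations as reconstructed above) estimates c and d for the entropy (4) on the uniform distribution and recovers the class (q, 0) quoted in Table 4; the parameter values are illustrative.

```python
# A quick numerical illustration (ours) of the scaling laws behind the (c,d)
# classification: estimate c from S(lambda W)/S(W) ~ lambda^(1-c) and d from
# W^{a(c-1)} S(W^{1+a})/S(W) ~ (1+a)^d, for the entropy (4) at p_i = 1/W.
import numpy as np

def S_uniform(W, q):
    return (W**(1 - q) - 1) / (1 - q)     # entropy (4) on the uniform state

q, lam, a = 0.5, 2.0, 0.5
for W in [10**4, 10**6, 10**8]:
    c = 1 - np.log(S_uniform(lam * W, q) / S_uniform(W, q)) / np.log(lam)
    ratio = W**(a * (c - 1)) * S_uniform(W**(1 + a), q) / S_uniform(W, q)
    d = np.log(ratio) / np.log(1 + a)
    print(W, round(c, 4), round(d, 4))    # c -> q = 0.5 and d -> 0 as W grows
```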
3. Final comments
On the basis of the previous analysis, it emerges that most of the entropic forms introduced in the literature can be grouped into two large families.

The first is formed by the trace-form entropies. It includes the entropies of degree (α, β) and the entropies of class (c, d), obtained starting from certain considerations on the decomposition rule or on the asymptotic behavior of the entropy in the thermodynamic limit. Clearly, the (α, β) and (c, d) families are not exhaustive of this group, and many other trace-form entropies, not yet fully characterized, can be found in the literature relevant to physics or statistics. For instance, in [44], in the framework of the basic algebra, the entropy $S_q[p] = -\sum_{i=1}^W p_i\, {\rm Ln}_q(p_i)$ has been proposed, where ${\rm Ln}_q(x)$ is the inverse function of the well-known basic exponential; it does not find a place in the two main families discussed in this review. For the sake of completeness, trace-form entropies, including several examples not listed in the previous tables, are reported in Table 5.

Table 5. Trace-form entropies

  Entropy                                                                                                                       Ref.
  $-\sum_i p_i \ln(p_i)$                                                                                                        [5]
  $\sum_i \frac{p_i^\beta - p_i^\alpha}{\alpha - \beta}$                                                                        [15, 16]
  $\frac{1}{1-c+cd}\left(e \sum_i \Gamma\big(1+d,\, 1 - c\ln(p_i)\big) - c\right)$                                              [34]
  $\frac{1}{q}\sum_i p_i \arctan\left(p_i^{q/2}\right) - \frac{\pi q}{4}$                                                       [38]
  $\sum_i \int_0^{p_i} \ln_{c,d}\big(\mu_l(x)\big)\, dx$                                                                        [42, 43]
  $-\sum_i p_i\, {\rm Ln}_q(p_i)$                                                                                               [44]
  $-\frac{1}{\sin(s)}\sum_i p_i^r \sin\left(s \ln(p_i)\right)$                                                                  [45]
  $-\sum_i p_i \ln\left(\frac{\sin(s\, p_i)}{2\sin(s/2)}\right)$                                                                [46]
  $-\sum_i \frac{\sin(s\, p_i)}{2\sin(s/2)} \ln\left(\frac{\sin(s\, p_i)}{2\sin(s/2)}\right)$                                   [46]
  $\sum_i \frac{\sin(s\, p_i)}{2\sin(s/2)}$                                                                                     [46]
  $-\frac{1}{\lambda}\sum_i (1+\lambda p_i)\ln(1+\lambda p_i) + \left(\frac{1+\lambda}{\lambda}\right)\ln(1+\lambda)$           [47]
  $-\sum_i \left[p_i \ln(p_i) + \left(\frac{1}{\lambda}+p_i\right)\ln(1+\lambda p_i)\right] + \left(\frac{1+\lambda}{\lambda}\right)\ln(1+\lambda)$   [48]
  $\sum_i \left(p_i + \ln\left(2 - p_i^{p_i}\right)\right)$                                                                     [49]
  $-\sum_i \ln\left(\Gamma(1+p_i)\right)$                                                                                       [50]
  $\frac{1}{1-q'}\sum_i p_i \left[e^{\frac{1-q'}{1-q}\left(p_i^{q-1}-1\right)} - 1\right]$                                       [51]
  $\sum_i p_i \left(-\ln_\beta p_i\right)^\delta$                                                                               [52]
  $\sum_i p_i\, (p_i^{r}-1)\, \frac{-a(p_i^{r}-1) \pm \sqrt{a^2+4b}\,(p_i^{r}+1)}{2}$                                            [53]

The second group is given by the kernel-form entropies, obtained starting from certain considerations on the average prescription of the information content. This group includes the entropies of order (α, β) as well as the strongly and weakly pseudo-additive entropies. In general, kernel entropies are expressed by a given analytical composition of different kernel blocks, each one formed by a certain function of the information content. The simplest case is given by the (h, Φ)-entropies [54], defined as

$$S[p] = h\left(\sum_{i=1}^W \Phi(p_i)\right), \qquad (19)$$

with a single kernel block given by the quantity $\sum_i \Phi(p_i)$. Clearly, (19) is completely equivalent, in form, to (13), which follows for $h(x) = f^{-1}(x)$ and $\Phi(x) \equiv x\, f(I(x))$. However, while for certain "exotic" entropies the pair of functions (h, Φ) is readily determinable, it is not so immediate to derive the corresponding pair of functions (f, I). More general is the class of entropies proposed in [55],

$$S[p] = h\left(\frac{\sum_{i=1}^W \Phi_1(p_i)}{\sum_{i=1}^W \Phi_2(p_i)}\right),$$

which is a generalized kernel entropy with two kernel blocks. It includes the entropic forms (15) and (16) as particular cases. Examples of kernel-like entropies are shown in Table 6 below.

In conclusion, it is worth observing that, in general, an entropy must respect further additional criteria that may pose several restrictions on the form of the functional S[p]. For instance, the Lesche inequality [59] is a necessary requirement that an entropic functional must satisfy to make physical sense. In short, it requires that a small perturbation of the set of probabilities to a new set, $[p] \to [p']$, should have only a small effect on the value of the entropy relative to that of the uniform distribution, i.e.,

$$\sum_i |p_i - p'_i| \leq \delta \quad \Rightarrow \quad \frac{|S[p] - S[p']|}{S_{\rm max}} \leq \epsilon.$$

This should, in particular, be true in the thermodynamic limit W → ∞. It is known that trace-form entropies like the Sharma-Taneja-Mittal family or the Hanel-Thurner family pass the Lesche inequality [60, 34], while the question turns out to be more problematic for the Sharma-Mittal family, since some of its members, like Rényi's and others, seem not to be Lesche stable [61], although this problem has still to be fully clarified [62, 63].
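The Lesche criterion can be probed numerically. The sketch below (ours, with an illustrative perturbation of the uniform state and illustrative names) tracks the ratio $|S[p]-S[p']|/S_{\rm max}$ as W grows: for the Shannon entropy it stays of order δ, while for the Rényi entropy with q = 2 it keeps growing with W regardless of how small δ is, in line with the instability reported in [61].

```python
# A numerical probe (ours) of the Lesche criterion on a perturbed uniform state.
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, q):
    return np.log(np.sum(p ** q)) / (1.0 - q)

delta = 0.01
for W in [10**3, 10**5, 10**7]:
    p = np.full(W, 1.0 / W)                   # uniform state
    pp = (1 - delta / 2) * p                  # perturbed state with
    pp[0] += delta / 2                        # ||p - p'||_1 ~ delta
    Smax = np.log(W)                          # maximum of both entropies
    print(W,
          abs(shannon(p) - shannon(pp)) / Smax,    # stays O(delta)
          abs(renyi(p, 2) - renyi(pp, 2)) / Smax)  # grows with W
```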
Table 6. Kernel-like entropic forms

  Entropy                                                                                                          Ref.
  $\frac{1}{1-s}\left[\left(\sum_i p_i^r\right)^{\frac{1-s}{1-r}} - 1\right]$                                      [21]
  $\frac{1}{s}\arctan\left(\frac{\sum_i p_i^r \sin(s \ln(p_i))}{\sum_i p_i^r \cos(s \ln(p_i))}\right)$             [26]
  $\exp\left(\sum_i \ln\left(2 - p_i^{p_i}\right)\right)$                                                          [49]
  $\frac{1}{1-s}\left\{\left[1 - \frac{1-r}{1-s}\ln\left(\sum_i p_i^s\right)\right]^{\frac{1-s}{1-r}} - 1\right\}$  [56]
  $\exp\left(L\left(\frac{\ln \sum_i p_i^r}{\gamma(1-r)}\right)\right) - 1$                                        [57]
  $\frac{1}{r-s}\ln\left(\frac{\left(\sum_i p_i^s\right)^{r}}{\left(\sum_i p_i^r\right)^{s}}\right)$                [58]
  $\frac{1}{r-s}\left[\frac{\left(\sum_i p_i^s\right)^{r}}{\left(\sum_i p_i^r\right)^{s}} - 1\right]$               [58]

Further relationships can be traced to the Shore-Johnson axioms [64] which, differently from the SK axioms, rooted in information theory, concern statistical estimation theory and seem to pose stringent limitations on the form the entropy may have. The question is strictly related to the maximum entropy principle introduced in [65], which is the main bridge between information theory, statistical physics, and statistical inference. It is a powerful method, widely employed in the statistical sciences, to derive the probability distribution of a system described by a given entropy, subject to certain constraints given by the prior information on the system itself. With the introduction of new entropic forms, it has been natural to extend the maximum entropy principle to these cases, to obtain distributions different from the Boltzmann-Gibbs ones. However, several criticisms of the consistency of the maximum entropy principle with generalized entropic forms have recently been advanced [66], since in the original paper Shore and Johnson conclude that their axioms yield only one admissible measure, namely the Shannon entropy.

It has been shown in [67] that the Shore-Johnson axiomatization of the inference rule actually accounts for a substantially wider class of entropic functionals than just the Shannon entropy. In particular, at least the Uffink class of entropies [68],

$$S[p] = f^{-1}\left[\left(\sum_{i=1}^W p_i^\alpha\right)^{1/(1-\alpha)}\right],$$

which corresponds to the strongly pseudo-additive entropies presented above, is compatible with the conditions stated in the Shore-Johnson axioms. In [69] the Uffink class of entropic functionals has been characterized by means of a suitable generalization of the SK axioms, re-establishing in this way, at least in part, the "broken" entropic parallelism between information theory and statistical inference.
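A self-contained sketch (ours) of the maximum entropy principle for the Shannon case closes the discussion; the energy levels, the prescribed mean, and all names are illustrative. Swapping the Shannon functional for a generalized one, e.g. (4), would return the corresponding generalized (power-law-like) maximizer instead of the Gibbs state.

```python
# A sketch (ours) of the maximum entropy principle of [65]: maximize the
# Shannon entropy subject to normalization and a fixed mean energy.
import numpy as np
from scipy.optimize import minimize

E = np.array([0.0, 1.0, 2.0, 3.0])            # toy energy levels (ours)
U = 1.2                                       # prescribed mean energy

def neg_shannon(p):
    p = np.maximum(p, 1e-300)                 # guard the log at the boundary
    return np.sum(p * np.log(p))

constraints = ({'type': 'eq', 'fun': lambda p: np.sum(p) - 1.0},
               {'type': 'eq', 'fun': lambda p: p @ E - U})
res = minimize(neg_shannon, x0=np.full(E.size, 1.0 / E.size),
               bounds=[(0.0, 1.0)] * E.size, constraints=constraints,
               method='SLSQP')
p = res.x
# The maximizer is the Gibbs state p_i ~ exp(-lam E_i), so ln p_i is affine
# in E_i and, with equally spaced levels, its second differences vanish.
print(np.allclose(np.diff(np.log(p), 2), 0.0, atol=1e-3))   # True
```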
Acknowledgments

V.I. was supported by the Serbian Ministry of Education, Science and Technological Development through the Mathematical Institute of the Serbian Academy of Sciences and Arts. J.K. acknowledges support from the Czech Science Foundation (GAČR), Grant No. 19-16066S, and from the Austrian Science Fund (FWF) under Project No. I3073.
References

[1] Clausius R., The Mechanical Theory of Heat: With Its Application to the Steam Engine and to the Physical Properties of Bodies (J. Van Voorst, London, UK) 1867.
[2] Boltzmann L., Sitz. Ber. Akad. Wiss. Wien (II), (1872) 275; (1877) 67.
[3] Gibbs J.W., Elementary Principles in Statistical Mechanics (Charles Scribner's Sons, New York, US) 1902.
[4] Planck M., Ann. Phys., (1901) 553.
[5] Shannon C.E., Bell Syst. Tech. J., (1948) 379.
[6] Khinchin A.I., Mathematical Foundations of Information Theory (Dover Pub., New York, US) 1957.
[7] Kolmogorov A.N., Atti R. Accad. Naz. Lincei, (1930) 388.
[8] Nagumo M., Jpn. J. Math., (1930) 71.
[9] Rényi A., Proc. Fourth Berk. Symp. Math. Stat. Prob., (1961) 547.
[10] Havrda J. and Charvát F., Kybernetika, (1967) 30.
[11] Daróczy Z., Inf. Contr., (1970) 36.
[12] Tsallis C., J. Stat. Phys., (1988) 479.
[13] Suyari H., IEEE Trans. Inf. Theor., (2004) 1783.
[14] Wada T. and Suyari H., Phys. Lett. A, (2007) 199.
[15] Sharma B.D. and Taneja I.J., Metrika, (1975) 205.
[16] Mittal D.P., Metrika, (1975) 35.
[17] Kaniadakis G., Physica A, (2001) 405.
[18] Shafee F., IMA J. Appl. Math., (2007) 785.
[19] Abe S., Phys. Lett. A, (1997) 326.
[20] Jizba P. and Arimitsu T., Ann. Phys., (2004) 17.
[21] Sharma B.D. and Mittal D.P., J. Math. Sci., (1975) 122.
[22] Varma R.S., J. Math. Sci., (1966) 34.
[23] Arimoto S., Inf. Contr., (1971) 181.
[24] Tsallis C., Mendes R.S. and Plastino A.R., Physica A, (1998) 534.
[25] Landsberg P.T. and Vedral V., Phys. Lett. A, (1998) 211.
[26] Aczél J. and Daróczy Z., Publ. Math. Debrecen, (1963) 171.
[27] Jizba P. and Korbel J., Physica A, (2016) 808.
[28] Rathie P.N., J. Appl. Prob., (1970) 124.
[29] Ilić V.M. and Stanković M.S., Physica A, (2014) 138.
[30] Tempesta P., Proc. R. Soc. A: Math. Phys. Eng. Sci., (2016) 20160143.
[31] Ilić V.M. and Stanković M.S., Physica A, (2014) 229.
[32] Tempesta P. and Jensen H.J., Sci. Rep., (2020) 5951.
[33] Scarfone A.M., Entropy, (2013) 624.
[34] Hanel R. and Thurner S., Eur. Phys. Lett., (2011) 20006.
[35] Ubriaco M.R., Phys. Lett. A, (2009) 2516.
[36] Radhakrishnan C., Chinnarasu R. and Jambulingam S., Int. J. Stat. Mech., (2014) 460364.
[37] Obregón O., Entropy, (2010) 2067.
[38] Tsekouras G.A. and Tsallis C., Phys. Rev. E, (2005) 046144.
[39] Curado E.M.F. and Nobre F.D., Physica A, (2004) 94.
[40] Bizet N.C., Fuentes J. and Obregón O., Eur. Phys. Lett., (2019) 60004.
[41] Anteneodo C. and Plastino A.R., J. Phys. A: Math. Gen., (1999) 1089.
[42] Korbel J., Hanel R. and Thurner S., New J. Phys., (2018) 093007.
[43] Korbel J., Hanel R. and Thurner S., Eur. Phys. J. ST, (2020) 787.
[44] Lavagno A., Scarfone A.M. and Swamy P.N., J. Phys. A: Math. Theor., (2007) 8635.
[45] Sharma B.D. and Taneja I.J., Elec. Inform. Kybern., (1977) 419.
[46] Sant'anna A.P. and Taneja I.J., Inf. Sci., (1985) 145.
[47] Ferrari C., Statistica, (1980) 155.
[48] Kaniadakis G., Lavagno A. and Quarati P., Nucl. Phys. B, (1996) 527.
[49] Amigó J.M., Balogh S.G. and Hernández S., Entropy, (2018) 813.
[50] Kapur J.N., Boll. U.M.I., 7, 2-B (1988) 253.
[51] Schwämmle V. and Tsallis C., J. Math. Phys., (2007) 113301.
[52] Tsallis C. and Cirto L.J.L., Eur. Phys. J. C, (2013) 2487.
[53] Curado E.M.F., Tempesta P. and Tsallis C., Ann. Phys., (2016) 22.
[54] Salicrú M., Menéndez M.L., Morales D. and Pardo L., Comm. Stat., (1993) 2015.
[55] Esteban M.D. and Morales D., Kybernetika, (1995) 337.
[56] Masi M., Phys. Lett. A, (2005) 217.
[57] Jensen H.J., Pazuki R.H., Pruessner G. and Tempesta P., J. Phys. A: Math. Gen., (2018) 375002.
[58] Bercher J.-F., Phys. Lett. A, (2011) 2969.
[59] Lesche B., J. Stat. Phys., (1982) 419.
[60] Kaniadakis G., Lissia M. and Scarfone A.M., Phys. Rev. E, (2005) 046128.
[61] Tsallis C. and Brigatti E., Cont. Mech. Thermodyn., (2004) 223.
[62] Jizba P. and Arimitsu T., Phys. Rev. E, (2004) 026128.
[63] Bashkirov A.G., Phys. Rev. E, (2005) 028101.
[64] Shore J.E. and Johnson R.W., IEEE Trans. Inf. Theor., (1980) 26.
[65] Jaynes E.T., Phys. Rev., (1957) 171.
[66] Pressé S., Ghosh K., Lee J. and Dill K.A., Phys. Rev. Lett., (2013) 180604.
[67] Jizba P. and Korbel J., Phys. Rev. Lett., (2019) 120601.
[68] Uffink J., Stud. Hist. Phil. Mod. Phys., (1995) 223.
[69] Jizba P. and Korbel J., Phys. Rev. E, 101 (2020) 042126.