Beyond the Shannon-Khinchin Formulation: The Composability Axiom and the Universal Group Entropy
Piergiulio Tempesta^{a,b}

^a Departamento de Física Teórica II (Métodos Matemáticos de la Física), Facultad de Físicas, Universidad Complutense de Madrid, 28040 Madrid, Spain
^b Instituto de Ciencias Matemáticas, C/ Nicolás Cabrera, No 13–15, 28049 Madrid, Spain
Abstract
The notion of entropy is ubiquitous both in natural and social sciences. In the last two decades, a considerable effort has been devoted to the study of new entropic forms, which generalize the standard Boltzmann-Gibbs (BG) entropy and are widely applicable in thermodynamics, quantum mechanics and information theory. In [25], by extending previous ideas of Shannon [40, 41], Khinchin proposed a characterization of the BG entropy, based on four requirements, nowadays known as the Shannon-Khinchin (SK) axioms.

The purpose of this paper is twofold. First, we show that there exists an intrinsic group-theoretical structure behind the notion of entropy. It comes from the requirement of composability of an entropy with respect to the union of two statistically independent systems, that we propose in an axiomatic formulation. Second, we show that there exists a simple universal family of trace-form entropies. This class contains many well known examples of entropies and infinitely many new ones, a priori multi-parametric. Due to its specific relation with Lazard's universal formal group of algebraic topology, the new general entropy introduced in this work will be called the universal-group entropy. A new example of multi-parametric entropy is explicitly constructed.
Email address: [email protected], [email protected] (Piergiulio Tempesta)

Contents

1 Introduction
2 The group-theoretical content of the notion of entropy
3 Formal group laws
4 The universal-group entropy
5 Main properties of the S_U entropy
6 A new three-parametric group entropy: the S_{q,α,β} entropy
7 Distribution functions and thermodynamic properties
8 On the asymptotic behaviour of generalized entropies
9 Open problems and future perspectives
Appendix A The Shannon-Khinchin axioms

1. Introduction

Entropy is a fundamental notion, at the heart of modern science. In the second half of the twentieth century, its range of applicability has been extended from the traditional context of classical thermodynamics to new areas such as social sciences, economics, biology, quantum information theory, linguistics, etc. More recently, the role of entropy in the theory of complex systems has been actively investigated. On the one side, several studies were devoted to axiomatic formulations, aiming at clarifying the foundational aspects of the notion of entropy. On the other side, many researchers pursued the idea of generalizing the classical Boltzmann-Gibbs statistical mechanics. Consequently, a plethora of new entropic forms, designed for extending the applicability of the BG entropy to new contexts, was introduced.

The first research line was started by the seminal works of Shannon [40, 41] and Khinchin [25]. A set of axioms, nowadays called the SK axioms, characterizing uniquely the BG entropy, was introduced (we shall make reference to the formulation of the axioms reported in Appendix A). The axioms (SK1)-(SK3) represent natural requirements (continuity, maximum principle, independence from zero-probability events) that should be satisfied by any functional playing the role of an entropy. Instead, the axiom (SK4) simply characterizes the behaviour of an entropy with respect to the composition of two subsystems, which reduces to additivity in the case of statistical independence of the subsystems.

For a long time, additivity was interpreted as the property that ensures extensivity, i.e. the linear dependence of entropy on the number of particles of a system. According to Clausius, extensivity is crucial for an entropy to be thermodynamically admissible. Surprisingly, the two concepts are completely independent: additivity does not imply, nor is implied by, extensivity. In addition, no entropy, irrespective of being additive or nonadditive, can be extensive in every dynamical regime. For instance, if $W(N)$ is the total number of states of a complex system as a function of the number of its particles $N$, it turns out that a (sufficient) condition for the BG entropy to be extensive is that $W(N) \sim k^N$, with $k \in \mathbb{R}^+$; however, if $W(N) \sim N^k$, it is not.

The second research line, i.e. the study of generalized entropies and thermostatistics, in which the additivity axiom is explicitly violated, has become an extremely active research area in the last three decades. Since the work of Tsallis [50], many new entropic functionals have been proposed in the literature (see e.g. [1], [4], [6], [9], [15], [17, 18], [31], [32], [39], [45], [53]). From the point of view of statistical mechanics, they may generalize the BG entropy in weakly chaotic regimes, when the ergodicity hypothesis is violated and the correlation functions exhibit a non-exponential decay, typically a power-law one [49]. In particular, Tsallis entropy, which is nonadditive, is extensive for special values of the parameter q in regimes where the BG entropy is not [54].

Another source of nonadditive entropies is Information Theory.
In this context, generalized entropies can provide more refined versions of Kullback-Leibler-type divergences [26], useful for constructing comparative tests of sets of data. In the study of entanglement, nonadditive entropies arise as necessary alternatives to the von Neumann entropy [56], [15]. As usual, a system composed of two subsystems A and B is said to be entangled, i.e. non-separable, if its density operator ρ cannot be written as a convex combination of uncorrelated densities, i.e. $\rho = \sum_k q_k\, \rho_k^A \otimes \rho_k^B$. As shown in [21], the direct maximization of the von Neumann entropy $S = -\mathrm{Tr}\, \rho \ln \rho$ does not avoid the detection of fake entanglement, even for the case of two spin systems. The need for generalized nonadditive entropies in order to design efficient criteria for separability has been advocated in [15]. Recently, the relevance of generalized entropies in processes with quantum memory has been recognized [7].

In the present analysis of the mathematical foundations of the concept of entropy, we wish to point out the centrality of the notion of composability. We shall say that an entropy is composable if the following requirements are satisfied. First, given two statistically independent systems A and B, the entropy of the composed system A ∪ B depends on the entropies of the two systems S(A) and S(B) only (apart possibly from a set of parameters). This is the original formulation of the concept, as in [52]. In addition, we require further properties: the symmetry of a given entropy in the composition of the systems A and B, the stability of the total entropy if one of the two systems is in a state of zero entropy, and the associativity of the composition of three independent systems (see axioms (C1)-(C4) below). With these assumptions, the composability property, in this new, broader sense, is equivalent to the existence of a group-theoretical structure underlying the notion of entropy, which guarantees the physical plausibility of the composition process.

All the previous properties ensure that a given entropy can be expressed just in terms of macroscopic configurations of a system, without the need for a microscopic description of the associated dynamics. An entropy should be coarse-grained [16]; if not so, the concept of entropy would simply be empty. Even the second law of thermodynamics loses any meaning if not referred to the evolution of macroscopic subsystems: the entropy would stay invariant if defined on microscopic configurations.

From the previous discussion, it emerges that keeping only the first three SK axioms is too weak a requirement from a thermodynamical perspective: the fourth axiom cannot simply be dropped. This situation closely recalls the role of Euclid's fifth postulate of geometry. The replacement (not the mere exclusion) of this postulate with different axiomatic formulations paved the way to non-Euclidean geometries.

Our proposal is to assume the property of composability, in its complete group-theoretical formulation, as the generalized form of the fourth axiom. This assumption is not at all obvious. From this point of view, there are several different perspectives.

a) If we require composability in a strict sense, i.e. for any possible choice of the probability distributions of the given systems, it emerges that the Boltzmann-Gibbs entropy and the Tsallis entropy are the only known cases of trace-form composable entropies (to the best of our knowledge).

b) At the same time, the notion of composability can be formulated in a weak sense.
Indeed, we can impose that the composability axiom be satisfied at least on the uniform distribution. This requirement, in thermodynamics, is a fundamental one. Indeed, the uniform distribution emerges when one deals with isolated physical systems at equilibrium (microcanonical ensemble), or in contact with a thermostat at very high temperature (canonical ensemble). For instance, this is the physical situation occurring when considering high-temperature astrophysical objects. It turns out that a large set of entropies introduced in the last decades are weakly composable, although not all of them. Again, our formulation unravels the intrinsic group-theoretical content of the concept of entropy: in all cases, the composability requirement amounts to the existence of a group law for the composition process, defined either over the whole set of probability distributions (strict composability) or just over the uniform one (weak composability).

In this work, we present two main results. First, we encode composability into a general axiomatic formulation, required for an entropy to be considered admissible.

Second, we introduce a very general trace-form entropic form. We shall call it the universal-group entropy, due to its relation with Lazard's universal formal group. This entropy offers infinitely many new cases of trace-form weakly composable entropies, therefore satisfying the first and third SK axioms. Also, all known entropies are directly related with our new entropic form.

The universal-group entropy depends a priori on a (possibly) infinite number of independent parameters. Special attention will be paid to the recently introduced S_{c,d} entropy [17] and to group entropies [45], respectively. We will show that both cases are related to the universal group-theoretical framework in a very direct way. Besides, multi-parametric entropies can be introduced, not related to any of the known entropies. As an interesting example, the three-parametric S_{α,β,q} entropy is proposed.

The paper is organized as follows. In Section 2, the group structure behind the notion of entropy is introduced. In Section 3, the theory of formal groups, and in particular that of the universal formal group, is sketched. In Section 4, the universal-group entropy is defined, and its main properties are studied in Section 5. In Section 6, a new example of three-parametric entropy is proposed. In Section 7, the Legendre-type structure and some thermodynamical aspects of the theory are discussed. In Section 8, the concept of thermodynamic limit of an entropy is discussed. In Section 9, some open problems and possible related research lines are briefly proposed. In Appendix A, the formulation of the SK axioms adopted in the paper is reported.

We expect that the present general formulation of the notion of entropy, apart from its intrinsic theoretical interest, can have further use both in classical and quantum information theory and beyond.
2. The group-theoretical content of the notion of entropy
The purpose of this section is to establish the existence of a group-theoretical structure underlying the concept of (generalized) entropy. To this aim, we propose first a new formulation of the notion of composability. We shall assume that the entropies we consider are sufficiently regular functions $S(p_1, \ldots, p_W)$, defined for any integer $W \geq 1$ on the set $\mathcal{P}_W$ of all discrete probability distributions with W entries, and taking values in $\mathbb{R}^+ \cup \{0\}$.

Definition 1. An entropy S is strongly (or strictly) composable if there exists a smooth function of two real variables $\Phi(x, y)$ such that

(C1)
$$S(A \cup B) = \Phi(S(A), S(B); \{\eta\}), \qquad (1)$$

where $A \subset X$ and $B \subset X$ are two statistically independent systems, each defined in terms of an arbitrary probability distribution $\{p_i\}_{i=1}^{W}$, and $\{\eta\}$ is a possible set of real continuous parameters. In addition, $\Phi(x, y)$ satisfies the following properties:

(C2) Symmetry:
$$\Phi(x, y) = \Phi(y, x); \qquad (2)$$

(C3) Associativity:
$$\Phi(x, \Phi(y, z)) = \Phi(\Phi(x, y), z); \qquad (3)$$

(C4) Null-composability:
$$\Phi(x, 0) = x. \qquad (4)$$

The symmetry property is an obvious requirement. The null-composability ensures that the composition of a system with another system in a state of zero entropy cannot affect thermodynamics. The associativity property is a new, essential point: it guarantees the composability of more than two systems.
Remark 1. Notice that, assuming $\Phi(x, y) = x + y + \sum_{k,l} c_{kl}\, x^k y^l$, the existence of a power series $\varphi(x)$ such that $\Phi(x, \varphi(x)) = 0$, i.e. playing the role of an inverse, is a direct consequence of the previous axioms (this observation is also valid when working with formal power series only). The previous requirements amount to saying that, from a mathematical point of view, $\Phi(x, y)$ defines a formal group law over the reals. For instance, in the case of the Boltzmann entropy it is nothing but the additive group over $\mathbb{R}$. However, when we restrict to values $x, y \in \mathbb{R}^+ \cup \{0\}$, as is the case when $\Phi(x, y)$ is computed over standard entropies, we do not have an inverse.
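As a simple numerical illustration of the requirements (C2)-(C4), consider a candidate composition law such as the Tsallis one, $\Phi(x, y) = x + y + (1-q)xy$, which will reappear below; a minimal Python sketch (the value q = 0.7 is an arbitrary illustrative choice) checks symmetry, associativity and null-composability on random samples:

```python
import numpy as np

q = 0.7                                            # illustrative entropic index
phi = lambda x, y: x + y + (1.0 - q) * x * y       # Tsallis composition law

rng = np.random.default_rng(0)
for x, y, z in rng.uniform(0.0, 5.0, size=(100, 3)):
    assert np.isclose(phi(x, y), phi(y, x))                  # (C2) symmetry
    assert np.isclose(phi(x, phi(y, z)), phi(phi(x, y), z))  # (C3) associativity
    assert np.isclose(phi(x, 0.0), x)                        # (C4) null-composability
print("(C2)-(C4) hold for the Tsallis composition law on the sampled points")
```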
Remark 2. Condition (1) can be formulated in more general terms. Consider the case of a composite system $A \cup B$ arising from two systems that are not statistically independent, with a conditional probability distribution $p_{ij}(B|A) := p_{ij}(A \cup B)/p_i(A)$. Here $p_{ij}(A \cup B)$, $i = 1, \ldots, W_A$, $j = 1, \ldots, W_B$, denotes the joint probability distribution for the composite system $A \cup B$, and $p_i(A)$ is the marginal probability distribution $p_i(A) = \sum_{j=1}^{W_B} p_{ij}(A \cup B)$. In this context we postulate the relation

$$S(A \cup B) = \Phi(S(A), S(B|A); \{\eta\}), \qquad (5)$$

where $S(B|A)$ denotes the conditional entropy associated with the conditional distribution $p_{ij}(B|A)$. Equation (5) reduces to the relation (1) in the case of statistically independent systems. The relation (5) generalizes the axiom (SK4).

We can also propose a weak formulation of composability.

Definition 2.
We shall say that an entropy is weakly composable if the properties (C1)–(C3) of Definition 1 are satisfied at least when the probability distributions of the two statistically independent systems A and B are both uniform, and property (C4) holds in general.

This weak formulation is in fact satisfied by infinitely many generalized entropies, as we shall prove. For instance, the function $\Phi(x, y)$, and consequently its group structure, can typically be constructed starting from generalized logarithms. This second point of view was first proposed in [45] for a class of logarithms coming from difference operators. Again, the weak composability requirement implies the existence of a group law.

2.2. On the relation between the SK axioms, composability and admissible trace-form entropies
The SK axioms and the composability axiom are strictly related. First observe that, for both formulations of composability, the continuity of a given entropy S with respect to its arguments, as required by the axiom (SK1), implies the continuity of $\Phi(x, y)$ at least for $x, y \in \mathbb{R}^+ \cup \{0\}$.

In the strong formulation of the notion of composability, the property (5), valid for the Boltzmann and Tsallis entropies, is equivalent to the axiom (SK4). However, in both formulations, we still have to impose the further requirement of strict concavity (to ensure that axiom (SK2) is fulfilled).

Motivated by the previous discussion, we analyze the requisites for an entropy to be considered admissible, i.e. relevant both from a physical and an information-theoretical point of view. In the recent literature, admissible entropies are usually considered to be those satisfying just the axioms (SK1)–(SK3). Our point of view is therefore more demanding: a necessary condition for an entropy S to be admissible is that it satisfies the axioms (SK1)–(SK3) and is (at least) weakly composable. Indeed, there exist entropies which do satisfy the axioms (SK1)–(SK3), but are not even weakly composable. Needless to say, the strong composability property is much more suitable for thermodynamical purposes than its weak formulation.

The general problem of classifying the admissible entropies will be discussed in the forthcoming sections.
3. Formal group laws
The theory of formal groups [8] offers the natural language for formulating our approach to the theory of generalized entropies. Formal groups have been intensively investigated in the last decades, especially for their prominent role in fields such as algebraic topology, the theory of elliptic curves, and arithmetic number theory [20], [44]. Here we briefly recall only some salient aspects, necessary in the subsequent discussion.

Let R be a commutative ring with identity, and $R\{x_1, x_2, \ldots\}$ be the ring of formal power series in the variables $x_1, x_2, \ldots$ with coefficients in R. A commutative one-dimensional formal group law over R [8] is a formal power series in two variables $\Psi(x, y) \in R\{x, y\}$ of the form $\Psi(x, y) = x + y + \text{terms of higher degree}$, such that

i) $\Psi(x, 0) = \Psi(0, x) = x$;
ii) $\Psi(\Psi(x, y), z) = \Psi(x, \Psi(y, z))$.

When $\Psi(x, y) = \Psi(y, x)$, the formal group law is said to be commutative (the existence of an inverse formal series $\varphi(x) \in R\{x\}$ such that $\Psi(x, \varphi(x)) = 0$ follows from the previous definition).

The simplest examples are the additive formal group law $\Psi(x, y) = x + y$ and the multiplicative one $\Psi(x, y) = x + y + xy$. The previous definition can be naturally extended to the case of n-dimensional formal group laws.

The relevance of formal groups relies first of all on their close connection with group theory. Precisely, a formal group law $\Psi(x, y)$ defines a functor $F: \mathbf{Alg}_R \to \mathbf{Group}$, where $\mathbf{Alg}_R$ denotes the category of commutative unitary algebras over R and $\mathbf{Group}$ denotes the category of groups [20]. The functor F is by definition the formal group (sometimes called the formal group scheme) associated to the formal group law Ψ.

As is well known, over a field of characteristic zero, there exists an equivalence of categories between Lie algebras and formal groups [38]. Any n-dimensional formal group law defines an n-dimensional Lie algebra over the same ring R by means of the identification

$$[x, y] = \Psi_2(x, y) - \Psi_2(y, x), \qquad (6)$$

where $\Psi_2(x, y)$ denotes the quadratic part of the formal group law $\Psi(x, y)$. This equivalence of categories is no longer true over a field of characteristic $p \neq 0$.

The main algebraic structure we need is provided by the following construction. Let $B = \mathbb{Z}[b_1, b_2, \ldots]$ be the ring of integral polynomials in infinitely many variables. We shall consider the series (formal group logarithm)

$$F(s) = \sum_{i=0}^{\infty} b_i \frac{s^{i+1}}{i+1}, \qquad (7)$$

with $b_0 = 1$. Let $G(t)$ be its compositional inverse (the formal group exponential):

$$G(t) = \sum_{k=0}^{\infty} a_k \frac{t^{k+1}}{k+1}, \qquad (8)$$

so that $F(G(t)) = t$. We have $a_0 = 1$, $a_1 = -b_1$, $a_2 = \frac{3}{2}b_1^2 - b_2$, .... The Lazard formal group law [20] is defined by the formal power series

$$\Phi(s_1, s_2) = G(F(s_1) + F(s_2)).$$

It has a great relevance in many branches of mathematics, such as algebraic topology, cobordism theory, etc. [20], [10], [11], [35]. The coefficients of the power series $G(F(s_1) + F(s_2))$ lie in the ring $B \otimes \mathbb{Q}$ and generate over $\mathbb{Z}$ a subring $A \subset B \otimes \mathbb{Q}$, called the Lazard ring L.

The following important results, due to Lazard, hold. First, for any commutative one-dimensional formal group law over any ring R, there exists a unique homomorphism $L \to R$ under which the Lazard group law is mapped into the given group law (the so-called universal property of the Lazard group). At the same time, for any commutative one-dimensional formal group law $\Psi(x, y)$ over any ring R, there exists a series $\psi(x) \in R[[x]] \otimes \mathbb{Q}$ such that $\psi(x) = x + O(x^2)$ and $\Psi(x, y) = \psi^{-1}(\psi(x) + \psi(y)) \in R[[x, y]] \otimes \mathbb{Q}$.
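The inversion $F \mapsto G$ and the resulting Lazard group law can be reproduced order by order with a computer algebra system. The following sympy sketch is an illustration truncated at third order; it recovers $a_1 = -b_1$, $a_2 = \tfrac{3}{2}b_1^2 - b_2$ and builds $\Phi(s_1, s_2) = G(F(s_1) + F(s_2))$:

```python
import sympy as sp

t, s1, s2, b1, b2, a1, a2 = sp.symbols('t s1 s2 b1 b2 a1 a2')

# formal group logarithm F and the ansatz for its compositional inverse G, both to third order
F = lambda s: s + b1 * s**2 / 2 + b2 * s**3 / 3
G = lambda x: x + a1 * x**2 / 2 + a2 * x**3 / 3

# impose F(G(t)) = t order by order in t
expr = sp.expand(F(G(t)) - t)
sol = sp.solve([expr.coeff(t, 2), expr.coeff(t, 3)], [a1, a2], dict=True)[0]
print(sol)                        # {a1: -b1, a2: 3*b1**2/2 - b2}

# Lazard formal group law Phi(s1, s2) = G(F(s1) + F(s2)), truncated at third order
G_sol = lambda x: x + sol[a1] * x**2 / 2 + sol[a2] * x**3 / 3
Phi = sp.expand(G_sol(F(s1) + F(s2)))
print(sp.expand(Phi - s1 - s2))   # correction terms beyond the additive law
```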
In [11] and [33], it has been pointed out that the Lazard universal group law is fundamental in the discussion of both the classical and modern theory of unitary cobordisms. A beautiful connection with combinatorial Hopf algebras and Rota's umbral calculus was established in [13]. A combinatorial approach has been proposed in [5].

In the papers [42], [44] a connection between the Lazard universal formal group and the theory of Dirichlet series has been established. In this context, the universal Bernoulli polynomials were also introduced, and their remarkable properties were studied in [46] (see also [43]). In [30], the universal Bernoulli polynomials have been related to the theory of hyperfunctions of one variable by means of an extension of the classical Lipschitz summation formula to negative powers.
4. The universal-group entropy.
We are now able to construct a very general family of trace-form entropies with relevant thermodynamical properties.
Definition 3.
Let $\{p_i\}_{i=1,\ldots,W}$, $W \geq 1$, with $\sum_{i=1}^{W} p_i = 1$, be a discrete probability distribution. Let

$$G(t) = \sum_{k=0}^{\infty} a_k \frac{t^{k+1}}{k+1} \qquad (9)$$

be a real analytic function, where $\{a_k\}_{k\in\mathbb{N}}$ is a sequence of parameters, with $a_0 \neq 0$, such that the function $S_U: \mathcal{P}_W \to \mathbb{R}^+ \cup \{0\}$, defined by

$$S_U(p_1, \ldots, p_W) := k_B \sum_{i=1}^{W} p_i\, G\!\left(\ln\frac{1}{p_i}\right), \qquad (10)$$

is a concave one. This function will be called the universal-group entropy.

The name of the entropy (10) is obviously reminiscent of its direct connection with Lazard's construction of a universal formal group law. Indeed, $G(t)$ is a group exponential. The function $G(t)$, or equivalently the real sequence $\{a_k\}_{k\in\mathbb{N}}$, encodes all the main features of the entropy (10). A simple way to construct an important subclass of concave entropies is provided by the following observation.

Remark 3.
The condition

$$a_k > (k+1)\, a_{k+1} \quad \forall k \in \mathbb{N}, \qquad (11)$$

with $\{a_k\}_{k\in\mathbb{N}} \geq 0$, ensures that the series defining $G(t)$ is absolutely and uniformly convergent with radius of convergence $r = \infty$, and that $S_U[p]$ is a strictly concave functional (see the proof of Theorem 2). Although certainly restrictive, condition (11) is satisfied by many of the entropies known in the literature.

Remark 4. The universal-group entropy depends on the infinite set of parameters $a_k$, which are a priori independent, apart from the existence of specific constraints, as in eq. (11). To recover known cases of one-parametric or two-parametric entropies, depending, say, on the parameters $q_1$ and $q_2$, we shall have $a_k = f_k(q_1, q_2)$. Also, notice that the (apparently) more general case of an entropy of the form

$$S_U = k_B \sum_{i=1}^{W} p_i\, G\!\left(\ln\frac{1}{p_i^c}\right), \qquad (12)$$

where c is a positive constant, can easily be recast in the language of Definition 3. Indeed, it is sufficient to consider a formal group exponential $\widetilde{G}(t) = \sum_{k=0}^{\infty} \alpha_k \frac{t^{k+1}}{k+1}$, with $\alpha_k = a_k c^k$, or, equivalently, to require a modified condition (11) of the form $a_k > c\,(k+1)\, a_{k+1}$, with $\{a_k\}_{k\in\mathbb{N}} \geq 0$.

Remark 5.
The entropy (10) is trace-form. Although very general, the family of trace-form entropies does not include other functional forms, like Rényi's entropy [37], very useful in several applications. The fact that the entropy (10) is trace-form, with a group exponential at least piecewise differentiable, ensures Lesche stability [29], which is a desirable property of any entropic functional, and more generally of any physical observable. Essentially, Lesche stability is the property of uniform continuity of an entropic functional in the space of probability distributions. It guarantees that a small variation of the set of probabilities produces a small change of entropy: given two distributions $\{p_i\}_{i=1,\ldots,W}$ and $\{p'_i\}_{i=1,\ldots,W}$ whose values are slightly different,

$$\forall \epsilon > 0 \ \ \exists \delta > 0 \ \text{ s.t. } \ \|p - p'\|_1 < \delta \implies \left|\frac{S[p] - S[p']}{S_{max}}\right| < \epsilon, \qquad (13)$$

where, given a vector x, $\|x\|_1$ denotes the $l^1$ norm and $S_{max}$ denotes the maximum value of the entropy. This requirement, crucial for physical applications, is not so relevant in other contexts, such as, for instance, Information Theory.
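Definition 3 translates directly into a short numerical routine: once a group exponential G is fixed, $S_U[p]$ is a weighted sum over the distribution. The sketch below is purely illustrative; the truncated sequence $a_k$ is an arbitrary choice obeying condition (11), and the Tsallis exponential anticipates the example discussed in the next Section:

```python
import numpy as np

kB = 1.0

def S_U(p, G):
    """Universal-group entropy S_U[p] = kB * sum_i p_i G(ln(1/p_i));
    zero-probability entries are discarded, consistently with axiom (SK3)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return kB * np.sum(p * G(np.log(1.0 / p)))

# Tsallis group exponential (q < 1): S_U reproduces S_q
q = 0.8
G_tsallis = lambda t: (np.exp((1 - q) * t) - 1) / (1 - q)

# a generic truncated G built from a_k = (1, 0.4, 0.1, 0.02), which satisfies a_k > (k+1) a_{k+1}
a = [1.0, 0.4, 0.1, 0.02]
G_generic = lambda t: sum(ak * t**(k + 1) / (k + 1) for k, ak in enumerate(a))

p = [0.5, 0.3, 0.2]
print(S_U(p, G_tsallis))                          # universal-group form
print((np.sum(np.array(p)**q) - 1) / (1 - q))     # direct Tsallis formula, same value
print(S_U(p, G_generic))                          # a new multi-parametric entropy
```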
5. Main properties of the S_U entropy

In the following, we shall focus on the most relevant properties of the entropy (10), directly coming from its group-theoretical formulation, and on its relationship with other entropic functionals.
Theorem 1.
The entropy $S_U[p]$ is weakly composable.

Proof. For the sake of clarity, we shall formulate the proof in a general setting. Let $\{p_i^A\}_{i=1}^{W_A}$ and $\{p_j^B\}_{j=1}^{W_B}$ be two sets of probabilities associated with two statistically independent systems A and B. The joint probability is given by $p_{ij}^{A\cup B} = p_i^A \cdot p_j^B$, with $W_{A\cup B} = W_A W_B$. Writing $t_1 = \ln\frac{1}{p_i^A}$, $t_2 = \ln\frac{1}{p_j^B}$ and $s_1 = G(t_1)$, $s_2 = G(t_2)$, we have

$$
\begin{aligned}
S_U(A \cup B) &:= k_B \sum_{i=1}^{W_A}\sum_{j=1}^{W_B} p_{ij}^{A\cup B}\, G\!\left(\ln\frac{1}{p_{ij}^{A\cup B}}\right) \\
&= k_B \sum_{i=1}^{W_A}\sum_{j=1}^{W_B} p_i^A\, p_j^B\, G\!\left(\ln\frac{1}{p_i^A} + \ln\frac{1}{p_j^B}\right) \\
&= k_B \sum_{i=1}^{W_A}\sum_{j=1}^{W_B} p_i^A\, p_j^B\, G(t_1 + t_2) \\
&= k_B \sum_{i=1}^{W_A}\sum_{j=1}^{W_B} p_i^A\, p_j^B\, G(F(s_1) + F(s_2)) \\
&= k_B \sum_{i=1}^{W_A}\sum_{j=1}^{W_B} p_i^A\, p_j^B\, \Phi(s_1, s_2) \\
&= k_B \sum_{i=1}^{W_A}\sum_{j=1}^{W_B} p_i^A\, p_j^B \left( s_1 + s_2 + \sum_{k,m=1}^{\infty} c_{km}\, s_1^k s_2^m \right) \\
&= k_B \sum_{i=1}^{W_A}\sum_{j=1}^{W_B} p_i^A\, p_j^B \left[ G\!\left(\ln\frac{1}{p_i^A}\right) + G\!\left(\ln\frac{1}{p_j^B}\right) + \ldots \right]. \qquad (14)
\end{aligned}
$$

These equalities hold in full generality. Now, consider the Boltzmann and Tsallis composition laws, corresponding to the additive case (i.e. $c_{km} = 0$ for all $k, m$) and to the case $c_{11} \neq 0$, $c_{km} = 0$ for $(k, m) \neq (1, 1)$, respectively. In both cases we obtain

$$S_U(A \cup B) = \Phi(S_U(A), S_U(B)). \qquad (15)$$

Observe that, whenever $c_{km} \neq 0$ for some $(k, m) \neq (1, 1)$, if $\{p_i^A\}$ and $\{p_j^B\}$ are both the uniform distribution, formula (15) still holds for the whole family of entropies represented by (10).

The formal power series $\Phi(x, y) = G(F(x) + F(y))$, for any choice of G, defines a formal group law. It automatically verifies the conditions of symmetry, null-composability and associativity. The thesis follows. In the case of an entropy of the form (12), the same analysis holds.
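Theorem 1 can also be checked numerically. In the sketch below (an illustration; the coefficients $a_k$ are the same arbitrary choice used earlier) the composition law $\Phi(x, y) = G(F(x) + F(y))$ is built by inverting G numerically: on uniform distributions the identity (15) holds for any G, while for generic distributions it holds only in the additive and Tsallis cases:

```python
import numpy as np
from scipy.optimize import brentq

kB = 1.0

def S_U(p, G):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return kB * np.sum(p * G(np.log(1.0 / p)))

a = [1.0, 0.4, 0.1, 0.02]                                   # illustrative group exponential
G = lambda t: sum(ak * t**(k + 1) / (k + 1) for k, ak in enumerate(a))
F = lambda s: brentq(lambda t: G(t) - s, 0.0, 100.0)        # numerical inverse of G
Phi = lambda x, y: G(F(x) + F(y))                           # induced composition law

# weak composability: uniform distributions on W_A and W_B states
W_A, W_B = 4, 7
pA, pB = np.full(W_A, 1.0 / W_A), np.full(W_B, 1.0 / W_B)
pAB = np.outer(pA, pB).ravel()
print(S_U(pAB, G), Phi(S_U(pA, G), S_U(pB, G)))             # the two values coincide

# for non-uniform distributions a generic G does not compose exactly
pA, pB = np.array([0.7, 0.2, 0.1]), np.array([0.5, 0.5])
pAB = np.outer(pA, pB).ravel()
print(S_U(pAB, G), Phi(S_U(pA, G), S_U(pB, G)))             # generally different
```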
Theorem 2. The universal-group entropy satisfies the first three SK axioms.

Proof. (SK1). By hypothesis, $G(t)$ is a real analytic function of t. Consequently, the universal-group entropy is (at least) a continuous function of its arguments $(p_1, \ldots, p_W)$.

(SK2). The entropy $S_U[p]$ is supposed to be concave by definition. Nevertheless, we wish to prove here that condition (11), which allows one to construct a large subclass of entropies of the form (10), is sufficient to guarantee concavity. To this aim, consider the quantity $\sum_{k=0}^{\infty} \frac{a_k}{k+1}\, x \left(\ln\frac{1}{x}\right)^{k+1}$. By imposing strict concavity we get

$$-\frac{1}{x}\left\{ (a_0 - a_1) + (a_1 - 2a_2)\ln\frac{1}{x} + (a_2 - 3a_3)\left(\ln\frac{1}{x}\right)^2 + (a_3 - 4a_4)\left(\ln\frac{1}{x}\right)^3 + \ldots \right\} < 0. \qquad (16)$$

Condition $a_k > (k+1)\,a_{k+1}$ ensures that the inequality (16) is satisfied. Consequently, the associated entropy (10) is strictly concave in its space of parameters. Other choices of the sequence $\{a_k\}_{k\in\mathbb{N}}$ are clearly possible.

(SK3). Since by construction $G(0) = 0$, and $\lim_{x\to 0^+} x\left(\ln\frac{1}{x}\right)^k = 0$, it follows that $S_U(0) = 0$. Similarly, $S_U(1) = 0$.

We shall study here the extensivity properties of the universal-group entropy. Consider the uniform probability distribution (i.e. $p_i = 1/W$ for all $i = 1, \ldots, W$). We have that

$$S_U[W] = k_B\, G(\ln W) \sim N \iff W(N) \sim \exp(F(N)). \qquad (17)$$

The expression for $W(N)$ so determined can be computed explicitly. Indeed, as a formal series $F(s)$ is the compositional inverse of $G(t)$: it can be constructed by means of the Lagrange inversion principle.

From a physical point of view, one should also ensure that $W(N)$ be interpretable as an occupation law. A sufficient condition is that $W(N)$, as a real function, be defined for all $N \in \mathbb{N}$, with $\lim_{N\to\infty} W(N) = \infty$. These requirements usually restrict the space of allowed parameters. As an example, for the case of Tsallis entropy the corresponding $W(N)$ is a well-defined occupation law for the values of the entropic parameter $q < 1$. Given an occupation law $W(N)$ of the form (17), there exists a representation of the universal-group entropy which is extensive for all systems whose phase space grows according to the selected occupation law. This notable fact ensures the wide applicability of the entropy (10) in thermodynamical contexts.

We shall now discuss how the previous formalism applies to the classification of known entropies. Let us first observe that the universal entropy admits the following formal decomposition

$$S_U[p] = \sum_{k=1}^{\infty} a'_k\, S_k[p], \qquad (18)$$

with $a'_k = a_{k-1}/k$, in terms of a set of elementary functionals $S_k[p] := k_B \sum_{i=1}^{W} p_i \left(\ln\frac{1}{p_i}\right)^k$.

a) The Boltzmann-Gibbs entropy

$$S_B[p] = k_B \sum_{i=1}^{W} p_i \ln\frac{1}{p_i} \qquad (19)$$

is obtained by means of the choice $G(t) = t$, $c = 1$ (i.e. for $a_0 = 1$, $a_i = 0$ $\forall i = 1, 2, \ldots$).

b) The Tsallis entropy [50]

$$S_q[p] = k_B\, \frac{\sum_{i=1}^{W} p_i^q - 1}{1 - q} \qquad (20)$$

for $q < 1$ is obtained by means of the choice $G(t) = \frac{\exp[(1-q)t] - 1}{1-q}$, $c = 1$. It can be decomposed as

$$S_q[p] = k_B \sum_{i=1}^{W} p_i \left\{ \ln\frac{1}{p_i} + \frac{1}{2}(1-q)\left(\ln\frac{1}{p_i}\right)^2 + \frac{1}{6}(1-q)^2\left(\ln\frac{1}{p_i}\right)^3 + \ldots \right\}. \qquad (21)$$

When $q > 1$, one can consider $G(t) = \frac{\exp[(q-1)t] - 1}{q-1}$, and a very similar expansion holds:

$$S_q[p] = k_B \sum_{i=1}^{W} p_i \left\{ \ln\frac{1}{p_i} + \frac{1}{2}(q-1)\left(\ln\frac{1}{p_i}\right)^2 + \frac{1}{6}(q-1)^2\left(\ln\frac{1}{p_i}\right)^3 + \ldots \right\}. \qquad (22)$$

c) The Kaniadakis entropy [22]

$$S_\kappa[p] = k_B \sum_{i=1}^{W} p_i\, \frac{p_i^{-\kappa} - p_i^{\kappa}}{2\kappa} \qquad (23)$$

is obtained by means of the choice $G(t) = \frac{\exp(\kappa t) - \exp(-\kappa t)}{2\kappa}$, $c = 1$. The decomposition (18) of the Kaniadakis entropy is given by
$$S_\kappa[p] = k_B \sum_{i=1}^{W} p_i \left\{ \ln\frac{1}{p_i} + \frac{\kappa^2}{3!}\left(\ln\frac{1}{p_i}\right)^3 + \frac{\kappa^4}{5!}\left(\ln\frac{1}{p_i}\right)^5 + \frac{\kappa^6}{7!}\left(\ln\frac{1}{p_i}\right)^7 + \ldots \right\}, \qquad (24)$$

where $-1 < \kappa < 1$.

d) Another relevant example is the $S_{c,d}$ entropy, introduced in [17]. This beautiful entropic form was obtained by taking into account two scaling laws that emerge from the requirement of the first three SK axioms. It reads

$$S_{c,d} = \frac{e}{1 - c + cd} \sum_{i=1}^{W} \Gamma(1 + d,\, 1 - c\ln p_i) - \frac{c}{1 - c + cd}. \qquad (25)$$

Here $\Gamma(s, x)$ denotes the upper incomplete Gamma function (and $k_B = 1$), $c \in (0, 1]$, $d \in \mathbb{R}$.

The authors analyzed essentially all entropies known in the literature that satisfy the axioms (SK1)–(SK3), with the exception of group entropies, and observed that these entropies can be considered as particular cases of the $S_{c,d}$ entropy, for a suitable choice of $(c, d)$ (see also [19] for a recent discussion of the role of the $S_{c,d}$ entropy). These parameters appear as the exponents characterizing the two scaling laws. Consequently, in this Section we will not compare exhaustively all the other cases already considered in [17] (as e.g. the Anteneodo-Plastino [4] and Shafee [39] entropies). Instead, we shall discuss directly the relationship between the $S_{c,d}$ entropy and the universal-group entropy.

Consider the identity $\Gamma(s, x) = \Gamma(s) - \gamma(s, x)$, where $\gamma(s, x)$ is the lower incomplete Gamma function, as well as the series expansion of $\gamma(s, x)$ for real positive values of its arguments. We obtain that

$$\Gamma(1 + d,\, 1 - c\ln p_i) = \Gamma(1 + d) + t^d \sum_{k=0}^{\infty} \delta_k \frac{t^{k+1}}{k+1}, \qquad (26)$$

where $\Gamma(a)$ is the Euler Gamma function,

$$\delta_k = \frac{(-1)^{k+1}(k+1)}{k!\,(k+d+1)}, \qquad t = \ln\frac{e}{p_i^c}. \qquad (27)$$

In the subsequent considerations, we shall restrict to the case $d \in \mathbb{N}$. To perform our analysis we shall use the identity, valid for $d \in \mathbb{N}$:

$$\int_K^{\infty} t^d e^{-t}\, dt = e^{-K} \sum_{n=0}^{d} \frac{\prod_{j=0}^{n}(d - j + 1)}{d+1}\, K^{d-n}. \qquad (28)$$

This identity (new, to the best of our knowledge) can be proven by a direct computation. The identity (28) allows us to expand the entropy in terms of the set $\{S_k[p]\}$. In general, we have

$$S_{c,d} = \frac{1}{1 - c + cd} \sum_{i=1}^{W} p_i^c \sum_{k=0}^{d} \frac{\prod_{j=0}^{k}(d - j + 1)}{d+1} \sum_{n=0}^{d-k} \binom{d-k}{n} \left(\ln\frac{1}{p_i^c}\right)^n - \frac{c}{1 - c + cd}, \qquad d \in \mathbb{N}. \qquad (29)$$

Let us see some particular cases of the previous formula. We easily recover the two examples of [17], i.e.

$$S_{1,1}[p] = 1 + \sum_i p_i \ln\frac{1}{p_i}, \qquad S_{1,2}[p] = 2\sum_i p_i \ln\frac{1}{p_i} + \frac{1}{2}\sum_i p_i\left(\ln\frac{1}{p_i}\right)^2.$$

For c arbitrary, let us write explicitly, for instance, the functionals corresponding to $d = 3$:

$$S_{c,3}[p] = \frac{1}{1 + 2c} \sum_{i=1}^{W} p_i^c \left\{ 16 + 15\ln\frac{1}{p_i^c} + 6\left(\ln\frac{1}{p_i^c}\right)^2 + \left(\ln\frac{1}{p_i^c}\right)^3 \right\} - \frac{c}{1 + 2c}, \qquad (30)$$

and to $d = 5$:

$$S_{c,5}[p] = \frac{1}{1 + 4c} \sum_{i=1}^{W} p_i^c \left\{ 326 + 325\ln\frac{1}{p_i^c} + 160\left(\ln\frac{1}{p_i^c}\right)^2 + 50\left(\ln\frac{1}{p_i^c}\right)^3 + 10\left(\ln\frac{1}{p_i^c}\right)^4 + \left(\ln\frac{1}{p_i^c}\right)^5 \right\} - \frac{c}{1 + 4c}. \qquad (31)$$

From the previous analysis, it emerges that the $S_{c,d}$ entropy fits into the class (10) in the two cases ($c = 1$, $d \in \mathbb{N}$) (we can get rid of the constant term in the expansion) and ($c > 0$, $d = 0$).
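The identity (28) is easy to verify numerically for small integer d; a short sketch (illustration only) compares the finite sum with the upper incomplete Gamma function provided by scipy:

```python
import math
import numpy as np
from scipy.special import gamma, gammaincc

def rhs(d, K):
    """Right-hand side of identity (28)."""
    total = 0.0
    for n in range(d + 1):
        prod = 1.0
        for j in range(n + 1):
            prod *= (d - j + 1)
        total += prod / (d + 1) * K**(d - n)
    return math.exp(-K) * total

for d in (1, 3, 5):
    for K in (0.3, 1.0, 2.7):
        lhs = gamma(d + 1) * gammaincc(d + 1, K)   # Gamma(d+1, K) = int_K^inf t^d e^{-t} dt
        print(d, K, np.isclose(lhs, rhs(d, K)))    # True in every case
```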
e) The $S_\delta$ entropy has been introduced in [52] and independently in [55], and recently discussed in [53] in relation with black-hole thermodynamics:

$$S_\delta = k_B \sum_{i=1}^{W} p_i \left(\ln\frac{1}{p_i}\right)^\delta, \qquad 0 < \delta \leq 1 + \ln W. \qquad (32)$$

The underlying algebraic structure can be analyzed on the uniform distribution (weak composability). We easily get the associated function $\Phi(x, y) = \left[x^{1/\delta} + y^{1/\delta}\right]^\delta$. Notice that the function $\Phi(x, y)$ does not admit an expansion in terms of a formal power series around $(x, y) = (0, 0)$, except for $\delta = 1$, which is the only strictly composable case. Also, $\Phi(x, y)$ does not define a group law over the reals, but simply a monoid, except for $\delta \in \mathbb{N}$, $\delta$ odd. A similar analysis can be performed for the case of the entropic functional

$$S_{q,\delta} = k_B \sum_{i=1}^{W} p_i \left(\ln_q\frac{1}{p_i}\right)^\delta. \qquad (33)$$

It reduces to (20) for $q \in \mathbb{R}$ and $\delta = 1$.

f) The Borges-Roditi entropy is a two-parametric entropy, introduced in [9]. The associated generalized logarithm reads

$$\mathrm{Log}_{a,b}(x) = \frac{x^a - x^b}{a - b}, \qquad (34)$$

which reproduces Abe's entropy [1] for $a = \sigma - 1$, $b = \sigma^{-1} - 1$. It also generalizes the entropy (23). The corresponding entropy, for a suitable choice of a, b, satisfies the first three SK axioms (however, it is not related to the $S_{c,d}$ entropy). The related group exponential is

$$G_A(t) = \frac{e^{at} - e^{bt}}{a - b}. \qquad (35)$$

The formal group corresponding to (35), giving the interaction rule for this class, is known in the literature as the Abel formal group, defined by [10]

$$\Phi_A(x, y) = x + y + \beta_1\, xy + \sum_{i \geq 2} \beta_i \left(x y^i + x^i y\right). \qquad (36)$$

The coefficients $\beta_n$ in (36) can be expressed as polynomials in a and b (see Proposition 3.1 of [10]):

$$\beta_1 = a + b, \qquad \beta_n = \frac{(-1)^{n-1}}{n!\,(n-1)!}\prod_{\substack{i + j = n - 1 \\ i, j \geq 0}} (ia + jb), \quad n > 1.$$

The Borges-Roditi entropy possesses the following expansion:

$$S_{BR}[p] = k_B \sum_{i=1}^{W} p_i \left\{ \ln\frac{1}{p_i} + \frac{1}{2}(a + b)\left(\ln\frac{1}{p_i}\right)^2 + \frac{1}{6}\left(a^2 + ab + b^2\right)\left(\ln\frac{1}{p_i}\right)^3 + \ldots \right\}. \qquad (37)$$

g) The original notion of group entropies was introduced in [45] to describe an infinite family of weakly composable entropies. They are defined by

$$S_G(p) := k_B \sum_{i=1}^{W} p_i\, \mathrm{Log}_G\!\left(\frac{1}{p_i}\right). \qquad (38)$$

Here $\mathrm{Log}_G$ denotes the generalized logarithm

$$\mathrm{Log}_G(x) = \frac{1}{\sigma} \sum_{n=l}^{m} k_n\, x^{\sigma n}, \qquad l, m \in \mathbb{Z}, \quad m - l = r > 0, \quad x > 0, \qquad (39)$$

where the $k_n$ are real constants such that

$$\sum_{n=l}^{m} k_n = 0, \qquad \sum_{n=l}^{m} n\, k_n = 1, \qquad (40)$$

and $k_m \neq 0$, $k_l \neq 0$. Conditions (40) are sufficient to ensure that $\lim_{\sigma\to 0} \mathrm{Log}_G(x) = \ln x$. The class (38) is a subclass of (10), corresponding to the choice

$$G(t) = \frac{1}{\sigma} \sum_{n=l}^{m} k_n \exp(n\sigma t), \qquad l, m \in \mathbb{Z}, \quad m - l = r > 0, \qquad (41)$$

with the constraints (40). Notice that there is much freedom in choosing the coefficients $k_n$, since only two conditions are required to guarantee the correct limit.

Due to the specific choice of the formal group structure, the generalized logarithm (39) is intimately related to a family of discrete derivatives. Indeed, let us denote by T the shift operator, acting on a function f as $T f(x) = f(x + \sigma)$. The discrete derivatives of order r associated to the generalized logarithms (39) are defined to be

$$\Delta_r(\sigma) = \frac{1}{\sigma} \sum_{n=l}^{m} k_n\, T^n, \qquad r = m - l.$$
In the limit $\sigma \to 0$, $\Delta_r(\sigma) \to \partial_x$. For a suitable choice of the values of the parameter σ, each of the obtained entropies is concave and weakly composable.

The relation between group entropies and the class $S_{c,d}$ was an open question. Actually, the two classes are simply different. In other words, the entropies (38), with the exception of Tsallis entropy, cannot be obtained from the $S_{c,d}$ entropy by specializing the coefficients $(c, d)$. The first representative of the group-entropy class is indeed the Tsallis entropy, obtained for $\mathrm{Log}_G(x) = \frac{x^\sigma - 1}{\sigma} = \mathrm{Log}_T(x) = \frac{x^{1-q} - 1}{1-q}$ (here $\sigma = 1 - q$). The second representative of the class, the Kaniadakis entropy, is not in the $S_{c,d}$ family. Consider the logarithm

$$\mathrm{Log}_{III}(x) = \frac{x^{1-q} - 2\,x^{-(1-q)} + x^{-2(1-q)}}{1-q}$$

(associated to a third-order discrete derivative). The corresponding entropy is

$$S_{III} := \frac{k_B}{1-q} \sum_{i=1}^{W} p_i \left( p_i^{-(1-q)} - 2\,p_i^{(1-q)} + p_i^{2(1-q)} \right). \qquad (42)$$

This entropy is concave for $2/3 < q < 1$ (here $q = 1 - \sigma$). In the limit $q \to 1$, the entropy $S_{III}$ reduces to the Boltzmann-Gibbs entropy. Also, the entropy $S_{III}$ is weakly composable, since it corresponds to $G(t) = \frac{e^{(1-q)t} - 2\,e^{-(1-q)t} + e^{-2(1-q)t}}{1-q}$. However, it is obtainable neither from the Borges-Roditi entropy nor from the $S_{c,d}$ entropy. Its expansion is given by

$$S_{III}[p] = k_B \sum_{i=1}^{W} p_i \left\{ \ln\frac{1}{p_i} + \frac{3}{2}(1-q)\left(\ln\frac{1}{p_i}\right)^2 - \frac{5}{6}(1-q)^2\left(\ln\frac{1}{p_i}\right)^3 + \ldots \right\}.$$

Remark 6. Entropy (42) provides an interesting example of a functional which is in the universal class (10), but does not satisfy condition (11). Indeed, this condition, although sufficient, is not necessary to define an entropic functional with good thermodynamic properties. Another entropy, with very similar properties, is the following one, corresponding to a fourth-order discrete derivative:

$$S_{IV} := \frac{k_B}{1-q} \sum_{i=1}^{W} p_i \left( p_i^{-2(1-q)} - \frac{3}{2}\,p_i^{-(1-q)} + \frac{3}{2}\,p_i^{(1-q)} - p_i^{2(1-q)} \right).$$

By using the construction proposed in [45], infinitely many new entropies, each of them depending on a parameter q, and defined in a specific interval of values of q, can be constructed. They are not particular cases of any of the previously discussed entropies. At the same time, being trace-form and satisfying the axioms (SK1)–(SK3), they comply with the two scaling laws postulated in [17] (as is also easy to prove directly).

Both families, i.e. group entropies and the $S_{c,d}$ entropies, can be interpreted in the universal group-theoretical framework. Indeed, the $S_{c,d}$ entropy for specific choices of $(c, d)$ is expressible in terms of the entropy (10) by means of the general formula (29) and is consequently weakly composable. The group entropies (38) by construction are a subfamily of the entropy (10).
6. A new three-parametric group entropy: the $S_{q,\alpha,\beta}$ entropy

To illustrate the potential richness of the theory previously developed, we wish to present here a new nontrivial multi-parametric entropic functional, obtained as a special case of the construction sketched above. It reads

$$S_{\alpha,\beta,q}[p] := \frac{k_B}{1-q} \sum_{i=1}^{W} p_i \left( \alpha\, p_i^{-2(1-q)} + \frac{1}{2}(1 - 3\alpha + \beta)\, p_i^{-(1-q)} + \frac{1}{2}(\alpha - 1 - 3\beta)\, p_i^{(1-q)} + \beta\, p_i^{2(1-q)} \right). \qquad (43)$$

The main properties of this entropy are the following.

i) $\lim_{q\to 1} S_{\alpha,\beta,q}[p] = S_{BG}[p]$, irrespective of the choice of the parameters.

ii) The entropy (43) is trace-form.

iii) It satisfies the first three SK axioms; in particular, it is concave for a suitable range of the parameters q, α and β.

iv) The parameters α, β, q are independent; consequently, the entropy (43) is not reducible to the previously discussed two-parametric classes.

v) It is weakly composable, with expansion

$$S_{\alpha,\beta,q}[p] = k_B \sum_{i=1}^{W} p_i \left\{ \ln\frac{1}{p_i} + \frac{3}{2}(\alpha + \beta)(1-q)\left(\ln\frac{1}{p_i}\right)^2 + \frac{1}{6}(1 + 6\alpha - 6\beta)(1-q)^2\left(\ln\frac{1}{p_i}\right)^3 + \ldots \right\}. \qquad (44)$$

The entropy (43) is in the group entropy class, but it was not considered explicitly in [45]. Also, it is not a particular case of $S_{c,d}$ or $S_{BR}$. This example is just a representative of a large class of entropic functionals, each of them multi-parametric, that can be constructed by using the group-theoretical framework previously discussed. However, the complete analysis of this class, which is a specific realization of the notion of universal-group entropy, is outside the scope of this paper.
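Property i) can be verified numerically; the following sketch (with arbitrary illustrative values of α and β) evaluates the entropy (43) as written above and compares it with the Boltzmann-Gibbs value as q → 1:

```python
import numpy as np

kB = 1.0

def S_abq(p, alpha, beta, q):
    """Three-parametric entropy (43)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    s = 1.0 - q
    terms = (alpha * p**(-2 * s)
             + 0.5 * (1 - 3 * alpha + beta) * p**(-s)
             + 0.5 * (alpha - 1 - 3 * beta) * p**s
             + beta * p**(2 * s))
    return kB / s * np.sum(p * terms)

p = [0.5, 0.25, 0.15, 0.1]
alpha, beta = 0.2, -0.1                            # arbitrary illustrative parameters
S_BG = -kB * np.sum(np.array(p) * np.log(p))

for q in (0.9, 0.99, 0.999):
    print(q, S_abq(p, alpha, beta, q), S_BG)       # converges to S_BG as q -> 1
```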
7. Distribution functions and thermodynamic properties

We shall discuss the maximization of group entropies under appropriate constraints: we adopt a generalized maximum entropy principle (see e.g. [23], [2]). We will see that the Legendre structure of classical thermodynamics, at least in some aspects, is preserved in our group-theoretical framework. Precisely, let

$$\mathrm{Log}_U[\epsilon] = G(\ln \epsilon),$$

where $G(t)$ is the power series (9), with the constraint (11). Consider an isolated system in a stationary state (microcanonical ensemble). The optimization of $S_U$ leads to the equal-probability case, i.e. $p_i = 1/W$, $\forall i$. Therefore, we have

$$S_U[p] = k_B\, \mathrm{Log}_U W, \qquad (45)$$

which reduces to the celebrated Boltzmann formula $S_{BG} = k_B \ln W$ in the case of uncorrelated particles.

Let us consider a system in thermal contact with a reservoir (canonical ensemble). We introduce the numbers $\epsilon_i$, interpreted as the values of a physically relevant observable, typically the value of the energy of the system in its i-th state. Assume that $p_i(\epsilon_i)$ is a normalized and monotonically decreasing distribution function of $\epsilon_i$. The internal energy V in a given state is defined as $V = \sum_{i=1}^{W} \epsilon_i\, p_i(\epsilon_i)$.

As usual, we shall study the variational problem of the existence of a stationary distribution $\widetilde{p}_i(\epsilon)$. However, this analysis cannot be performed in direct analogy with the standard case. We introduce the functional

$$\mathcal{L} = S_G[p] - \alpha \left[\sum_i p(\epsilon_i) - 1\right] - \beta \left[\sum_{i=1}^{W} \epsilon_i\, p_i(\epsilon_i) - V\right], \qquad (46)$$

where α and β are Lagrange multipliers. The vanishing of the variational derivative of this functional with respect to the distribution $p_i$ provides the stationary solution

$$\widetilde{p}_i = \frac{\mathcal{E}(-\alpha - \beta\epsilon_i)}{Z}, \qquad (47)$$

with $Z = \sum_{i=1}^{W} \mathcal{E}(-\alpha - \beta\epsilon_i)$, where $\mathcal{E}(\cdot)$ is an invertible function. However, as already pointed out in [45], only in particular cases are we able to identify $\mathcal{E}$ with the inverse of a generalized logarithm $\mathrm{Log}_G$.

In [23], [24], the class of entropies allowing a treatment with the variational approach described above has been determined. In the Legendre-Massieu framework, one can derive the interesting relation

$$\mathrm{Log}_G(Z) + \beta V = S_G. \qquad (48)$$

The previous equation can be used to introduce a thermodynamic observable T, which has the interpretation of a local temperature for a non-equilibrium metastable state. Precisely, we can define it from $\frac{\partial S_U}{\partial V} = \frac{1}{T}$. Analogously, a generalized free energy can be introduced according to $F = V - T S_U$. However, in the context of Tsallis entropy, the variational approach leading to the definition of an effective temperature is based on more specific constraints (escort distributions [52]).

The determination of a group-theoretical procedure to construct the appropriate constraints for an entropy of the class (10) is an open problem.
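For a concrete, purely illustrative realization of the variational problem (46), one can bypass the analytic inversion leading to (47) and maximize a given entropy numerically under the normalization and energy constraints. The sketch below does this for the Tsallis entropy with standard linear constraints (the energy levels and the value of U are arbitrary choices), using scipy:

```python
import numpy as np
from scipy.optimize import minimize

kB, q = 1.0, 0.8
eps = np.array([0.0, 1.0, 2.0, 3.0])       # illustrative energy levels epsilon_i
U = 1.2                                    # prescribed internal energy V

S_q = lambda p: kB * (np.sum(p**q) - 1.0) / (1.0 - q)

# maximize S_q subject to sum_i p_i = 1 and sum_i eps_i p_i = U
cons = ({'type': 'eq', 'fun': lambda p: np.sum(p) - 1.0},
        {'type': 'eq', 'fun': lambda p: np.dot(eps, p) - U})
bounds = [(1e-9, 1.0)] * len(eps)
p0 = np.full(len(eps), 1.0 / len(eps))

res = minimize(lambda p: -S_q(p), p0, method='SLSQP', bounds=bounds, constraints=cons)
print(res.x, S_q(res.x))                   # stationary (maximum-entropy) distribution
```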
8. On the asymptotic behaviour of generalized entropies
From a mathematical point of view, the study of the asymptotic behaviour of a given entropy, in the limit of large-size systems, is not a well-defined task, even for the standard Boltzmann-Gibbs entropy.

Let W be the number of microscopic states admitted by a system. We shall focus on the case of large-size systems (under the hypothesis that $W \to \infty$ for $N \to \infty$). An entropy is then a functional $S = S[p]$ defined on the space $\mathcal{P}_\infty$. A simple argument proves that, depending on the choice of the distribution in $\mathcal{P}_\infty$, we can get infinitely many different limits even for the $S_{BG}$ case. Indeed, consider the probability distribution $\{p\} = (1, 0, 0, \ldots)$ with infinitely many entries. Then $S_{BG}[p] = 0$. If $\{p\} = (1/2, 1/2, 0, 0, \ldots)$, then $S_{BG}[p] = \ln 2$ (in units of $k_B$). Let $N = 10^{80}$ (i.e., the estimated number of atoms in the observable Universe), and $\{p\} = (\underbrace{1/N, 1/N, \ldots, 1/N}_{N\ \text{times}}, 0, 0, \ldots)$; then $S_{BG}[p] = 80 \ln 10 \simeq 184.2$. From the point of view of probability and information theory, there is no way to have uniqueness of the limit on the full space $\mathcal{P}_\infty$.

In the domain of classical thermodynamics, a priori the same objection applies. However, if we accept to restrict to the important case of the uniform distribution (which is not the only physically interesting case), then one can properly define a thermodynamic limit of an entropy on the uniform distribution. From a physical perspective, this would correspond to the case of a double limit, both of large system size and of large times.

The only meaningful way to compare the behaviour of different entropies is to compute them all in the same regime. Once we accept to restrict to uniform probabilities, then the known entropies assume the form of the following asymptotic functions: either $(\ln W)^a$ or $W^b$ (possibly multiplied by parameters), or the product of these forms. Presently, we cannot exclude that other forms could also be possible. All this is a consequence of the analysis of scaling laws performed in [17]. However, it must be noticed that these simple functions of W are not entropic functionals, and they cannot be obtained from a known entropy by specializing its parameters.

The problem of "comparing" entropies in the large-size limit finds its most simple answer in the regime, if it exists, where they are extensive. In classical thermodynamics, the only reason to consider generalized entropies is the fact that they can be extensive in regimes where the $S_{BG}$ entropy is not. This is the truly important limiting property. What is crucial is that all admissible entropies have the same behavior in the regime where they are extensive, i.e. proportional to the number N of particles of the system. Therefore, it is not surprising that different entropies could share the same asymptotic behavior.

When we are interested in more general contexts, such as probability theory and information theory, then all of the infinitely many possible distributions are a priori relevant. Therefore, according to the previous discussion, the comparison among asymptotic behaviors of different entropies loses its meaning.
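On the uniform distribution the comparison reduces to tabulating $S[W]$ for growing W; a small sketch (with arbitrary illustrative parameter values) contrasts the $\ln W$, $W^{1-q}$ and $(\ln W)^\delta$ behaviours mentioned above:

```python
import numpy as np

q, delta = 0.9, 1.5                                     # illustrative parameters
S_BG      = lambda W: np.log(W)                         # Boltzmann-Gibbs on the uniform distribution
S_tsallis = lambda W: (W**(1 - q) - 1) / (1 - q)        # Tsallis: grows like W^(1-q)
S_delta   = lambda W: np.log(W)**delta                  # S_delta: grows like (ln W)^delta

for W in (10, 10**3, 10**6, 10**12):
    print(W, S_BG(W), S_tsallis(W), S_delta(W))
```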
9. Open problems and future perspectives
The main message of this paper is that the notion of entropy is intimately related with group theory. Indeed, admissible entropies have an associated group law, which crucially controls their properties.

The universal-group entropy is a flexible tool, providing new insight in different contexts of the theory of complex systems. For instance, it could offer a more general approach to the description of the evolution of biological complexity [3]. The role of the universal-group entropy in geometric information theory should be properly clarified. New examples of Kullback-Leibler divergences are presently under investigation. They could provide refined tests of similarity, for instance, in the analysis of genomic sequences. A nonadditive entropy taken from the class (10) could a priori be more appropriate than the Shannon one, in order to take into account the so-called epistatic correlations between sites in a genomic chain.

An open problem is to find constructively classes of new entropies obtained as special cases of the general definition (10). New results along these lines are contained in [47].

Another important aspect that deserves to be clarified is the possible number-theoretical content of the notion of universal-group entropy. In [45], a connection between a set of entropic functionals and a class of Dirichlet series has been established. It has been proved that the formal group exponentials defining a specific family of entropies can be used to define L-functions. The most paradigmatic case is offered by Tsallis entropy, directly related to the Riemann zeta function. A construction of universal congruences and generalized Bernoulli polynomials connected with formal groups has been provided in [46] and in [43]. It would be interesting to realize explicitly this picture in the general group-theoretical framework developed in this work.

Appendix A. The Shannon-Khinchin axioms
In the original formulation due to Khinchin [25], the Shannon-Khinchin axioms were called "properties" that guarantee Khinchin's uniqueness theorem for the Boltzmann-Gibbs entropy S. In particular, the properties of continuity and maximum were unified into a unique statement. Here we propose a modern version, in terms of four requirements.

(SK1) (Continuity). The function $S(p_1, \ldots, p_W)$ is continuous with respect to all its arguments.

(SK2) (Maximum principle). The function $S(p_1, \ldots, p_W)$ takes its maximum value for the uniform distribution $p_i = 1/W$, $i = 1, \ldots, W$.

(SK3) (Expansibility). Adding an impossible event to a probability distribution does not change its entropy: $S(p_1, \ldots, p_W, 0) = S(p_1, \ldots, p_W)$.

(SK4) (Additivity). Given two subsystems A, B of a statistical system,

$$S(A \cup B) = S(A) + S(B|A);$$

here $S(B|A)$ denotes the conditional entropy associated with the conditional distribution $p_{ij}(B|A)$.

Acknowledgments. I wish to thank heartily prof. C. Tsallis for a careful reading of the manuscript and many useful discussions, and prof. G. Parisi for reading the manuscript and for encouragement. Interesting discussions with prof. A. González-López, R. A. Leo and M. A. Rodríguez are also gratefully acknowledged.

This work has been supported by the research project FIS2011–22566, Ministerio de Ciencia e Innovación, Spain.
References

[1] S. Abe, Generalized entropy optimized by a given arbitrary distribution, J. Phys. A: Math. Gen. (2008) 1004.
[3] C. Adami, C. Ofria and T. C. Collier, Evolution of biological complexity, Proc. Natl. Acad. Sci. USA (2000) 4463–4468.
[4] C. Anteneodo and A. R. Plastino, Maximum entropy approach to stretched exponential probability distributions, J. Phys. A: Math. Gen., 1089 (1999).
[5] A. Baker, Combinatorial and arithmetic identities based on formal group laws, Lect. Notes in Math., 17–34, Springer (1987).
[6] C. Beck and E. G. D. Cohen, Superstatistics, Physica A 322, 267–275 (2003).
[9] E. P. Borges and I. Roditi, A family of nonextensive entropies, Phys. Lett. A.
[10] V. M. Bukhshtaber and A. N. Kholodov, Formal groups, functional equations and generalized cohomology theories, Math. USSR Sbornik (1990) 75–94; English transl. Math. USSR Sb. (1991), 77–97.
[11] V. M. Bukhshtaber, A. S. Mishchenko and S. P. Novikov, Formal groups and their role in the apparatus of algebraic topology, Uspehi Mat. Nauk (1971), 2, 131–154; transl. Russ. Math. Surv., 63–90 (1971).
[12] V. M. Bukhshtaber and S. P. Novikov, Formal groups, power systems and Adams operators, Math. Sb., 116–153 (1971).
[13] A. Baker, F. Clarke, N. Ray, L. Schwartz, On the Kummer congruences and the stable homotopy of BU, Trans. Amer. Math. Soc., 385–432 (1989).
[14] H. B. Callen, Thermodynamics and an Introduction to Thermostatistics, II edition, John Wiley and Sons (1985).
[15] N. Canosa and R. Rossignoli, Generalized nonadditive entropies and quantum entanglement, Phys. Rev. Lett.
[16] M. Gell-Mann, The Quark and the Jaguar: Adventures in the Simple and the Complex, Macmillan (1995).
[17] R. Hanel and S. Thurner, A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions, Europhys. Lett., 6390–6394 (2012).
[19] R. Hanel, S. Thurner and M. Gell-Mann, How multiplicity determines entropy: derivation of the maximum entropy principle for complex systems, PNAS 111 (19), 6905–6910 (2014).
[20] M. Hazewinkel, Formal Groups and Applications, Academic Press, New York, 1978.
[21] R. Horodecki, P. Horodecki, M. Horodecki, and K. Horodecki, Quantum entanglement, Rev. Mod. Phys., 865 (2009).
[22] G. Kaniadakis, Statistical mechanics in the context of special relativity, Phys. Rev. E (2004) 41–49; Phys. Rev. E (2005) 046128.
[25] A. I. Khinchin, Mathematical Foundations of Information Theory, Dover, New York, 1957.
[26] S. Kullback and R. A. Leibler, On information and sufficiency, Ann. of Math. Stat., 79–86 (1951).
[27] J. L. W. V. Jensen, Sur les fonctions convexes et les inégalités entre les valeurs moyennes, Acta Math.
[28] Thermodynamics, Interscience, New York, 1961.
[29] B. Lesche, Instabilities of Rényi entropies, J. Stat. Phys., 419 (1982).
[30] S. Marmi and P. Tempesta, Hyperfunctions, formal groups and generalized Lipschitz summation formulas, Nonlinear Analysis, 1768–1777 (2012).
[31] J. Naudts, Deformed exponentials and logarithms in generalized thermostatistics, Physica A, 323–334 (2002).
[32] J. Naudts, Generalised exponential families and associated entropy functions, Entropy, 131–149 (2008).
[33] S. P. Novikov, The methods of algebraic topology from the point of view of cobordism theory, Izv. Akad. Nauk SSSR Ser. Mat. (1967), 885–951; transl. Math. USSR–Izv. (1967), 827–913.
[34] N. A. Peters, T.-C. Wei, P. G. Kwiat, Mixed state sensitivity of several quantum information benchmarks, Physical Review A (5), 052309 (2004).
[35] D. Quillen, On the formal group laws of unoriented and complex cobordism theory, Bull. Amer. Math. Soc., 1293–1298 (1969).
[36] N. Ray, Stirling and Bernoulli numbers for complex oriented homology theory, in Algebraic Topology, Lecture Notes in Math. 1370, pp. 362–373, G. Carlsson, R. L. Cohen, H. R. Miller and D. C. Ravenel (Eds.), Springer–Verlag, 1986.
[37] A. Rényi, Probability Theory, North–Holland, Amsterdam, 1970.
[38] J.-P. Serre, Lie Algebras and Lie Groups, Lecture Notes in Mathematics 1500, Springer–Verlag (1992).
[39] F. Shafee, Lambert function and a new nonextensive form of entropy, IMA J. Appl. Math., 785 (2007).
[40] C. E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. (1948) 379–423.
[41] C. E. Shannon, The Mathematical Theory of Communication, University of Illinois Press, Urbana, IL, 1949.
[42] P. Tempesta, Formal groups, Bernoulli-type polynomials and L-series, C. R. Math. Acad. Sci. Paris, Ser. I, 303–306 (2007).
[43] P. Tempesta, On Appell sequences of polynomials of Bernoulli and Euler type, J. Math. Anal. Appl., 1295–1310 (2008).
[44] P. Tempesta, L-series and Hurwitz zeta functions associated with the universal formal group, Annali Sc. Normale Superiore, Classe di Scienze, IX, 1–12 (2010).
[45] P. Tempesta, Group entropies, correlation laws and zeta functions, Phys. Rev. E, 021121 (2011).
[46] P. Tempesta, The Lazard formal group, universal congruences and special values of zeta functions, Transactions of the American Mathematical Society, 7015–7028 (2015).
[47] P. Tempesta, A theorem on the existence of generalized trace-form entropies, preprint (2015).
[48] P. Tempesta, A new family of composable entropies from group theory: the Z-entropies, arXiv:1507.07436 (2015).
[49] http://tsallis.cat.cbpf.br/TEMUCO.pdf
[50] C. Tsallis, Possible generalization of the Boltzmann–Gibbs statistics, J. Stat. Phys., Nos. 1/2, 479–487 (1988).
[51] C. Tsallis, Generalized entropy-based criterion for consistent testing, Phys. Rev. E, 1442–1445 (1998).
[52] C. Tsallis, Introduction to Nonextensive Statistical Mechanics – Approaching a Complex World, Springer, Berlin (2009).
[53] C. Tsallis and L. Cirto, Black hole thermodynamical entropy, Eur. Phys. J. C, 2487 (2013).
[54] C. Tsallis, M. Gell-Mann and Y. Sato, Asymptotically scale-invariant occupancy of phase space makes the entropy S_q extensive, Proc. Nat. Acad. Sci. USA, 15377–15382 (2005).
[55] M. R. Ubriaco, Entropies based on fractional calculus, Phys. Lett. A, 2516–2519 (2009).
[56] A. Wehrl, General properties of entropy, Rev. Mod. Phys. 50, 221 (1978).