Eventological H-theorem

Oleg Yu. Vorobyev
Abstract
We prove the eventological H-theorem, which complements the Boltzmann H-theorem from statistical mechanics [1] and serves as a mathematical excuse (mathematically no less convincing than the Boltzmann H-theorem is for the second law of thermodynamics) for what can be called "the second law of eventology". This law justifies the application of Gibbs and "anti-Gibbs" distributions [2] of sets of events minimizing relative entropy as statistical models of the behavior of a rational subject striving for an equilibrium eventological choice between perception and activity in various spheres of her/his co-being.
1. Eventological H-theorem on extreme properties of Gibbs and "anti-Gibbs" eventological distributions

Theorem (eventological H-theorem). Let $(\Omega, \mathcal{F}, \mathbf{P})$ be an eventological space, $\mathcal{X} \subseteq \mathcal{F}$ a finite set of events, $V(X)$ a nonnegative bounded function on subsets $X \subseteq \mathcal{X}$, and $p^*(X)$ some fixed eventological distribution on $\mathcal{X}$; and let the eventological distributions $p(X)$ on $\mathcal{X}$ keep the mean value of the function $V$ at the given level

$$\langle V \rangle = \sum_{X \subseteq \mathcal{X}} p(X)\, V(X). \qquad (V)$$

(In eventology, $V(X)$ is interpreted as a value, for a rational subject, of the occurrence of the set of events $X \subseteq \mathcal{X}$ on $(\Omega, \mathcal{F}, \mathbf{P})$.)

Then the minimum of relative entropy

$$H_{p p^*} = \sum_{X \subseteq \mathcal{X}} p(X) \ln \frac{p(X)}{p^*(X)} \to \min_p$$

among all such eventological distributions $p$ is achieved on the Gibbs and anti-Gibbs eventological distributions of the following form:

$$p(X) = \frac{1}{Z_{p^*}} \exp\{-\beta V(X)\}\, p^*(X), \quad X \subseteq \mathcal{X}, \ \beta \ge 0,$$

$$p(X) = \frac{1}{Z_{p^*}} \exp\{\gamma V(X)\}\, p^*(X), \quad X \subseteq \mathcal{X}, \ \gamma \ge 0,$$

which can be rewritten without the normalizing factor $1/Z_{p^*}$ in the equivalent form:

$$\frac{p(X)}{p(\varnothing)} = \exp\{-\beta (V(X) - V(\varnothing))\}\, \frac{p^*(X)}{p^*(\varnothing)}, \quad X \subseteq \mathcal{X},$$

$$\frac{p(X)}{p(\varnothing)} = \exp\{\gamma (V(X) - V(\varnothing))\}\, \frac{p^*(X)}{p^*(\varnothing)}, \quad X \subseteq \mathcal{X}.$$

© Oleg Vorobyev (ed.), Financial and Actuarial Mathematics and Related Fields, Proceedings of FAM'2008 Conference, Krasnoyarsk, Russia
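For a concrete sense of these distributions, the following sketch builds the Gibbs eventological distribution over all subsets of a finite event set. The doublet of events $\{x, y\}$, the uniform prior $p^*$, and the value function $V(X) = |X|$ are illustrative assumptions, not objects from the paper.

```python
from itertools import chain, combinations
from math import exp

def powerset(events):
    """All subsets X of the finite event set, as frozensets (including the empty set)."""
    s = list(events)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

def gibbs(events, V, p_star, beta):
    """Gibbs eventological distribution p(X) = exp{-beta V(X)} p*(X) / Z_{p*}."""
    subsets = powerset(events)
    w = {X: exp(-beta * V(X)) * p_star[X] for X in subsets}
    Z = sum(w.values())                      # normalizing factor Z_{p*}
    return {X: wx / Z for X, wx in w.items()}

# Toy data: a doublet of events {x, y}, uniform p*, and V(X) = |X|.
events = {"x", "y"}
p_star = {X: 0.25 for X in powerset(events)}
V = lambda X: float(len(X))
p = gibbs(events, V, p_star, beta=1.0)
print(round(sum(p.values()), 10))            # -> 1.0 (normalized)
```

Replacing $-\beta$ with $+\gamma$ in the exponent yields the anti-Gibbs distribution in exactly the same way.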
P r o o f. The proof uses the idea of the proof of one variant of the Boltzmann H-theorem from statistical mechanics (in the formulation taken from [4, p. 41]). As it turns out, this long-standing idea suffices to obtain much more general conclusions under the classical assumptions.

Let us compare the relative entropy for the Gibbs factor

$$f(X) = \exp\{-\beta V(X)\}\, p^*(X), \quad \beta \ge 0,$$

or for the Gibbs "anti-factor"

$$f(X) = \exp\{\gamma V(X)\}\, p^*(X), \quad \gamma \ge 0,$$

introducing for them the general notation $f(X) = \exp\{\alpha V(X)\}\, p^*(X)$ (where $\alpha \in \mathbb{R}$ is a real parameter of arbitrary sign), with the relative entropy for any function $\varphi(X)$ that is normalized by the same factor $Z_{p^*}$ as $f(X)$; in this case we always have $p(X) = f(X)/Z_{p^*}$. (The term "anti-factor" is proposed by me and has no analogs in statistical physics.)

Introducing $g(X)$ such that $\varphi(X) = f(X) \cdot g(X)$, we find

$$H_{f p^*} - H_{\varphi p^*} = \frac{1}{Z_{p^*}} \sum_{X \subseteq \mathcal{X}} \left[ f(X) \ln \frac{f(X)}{p^*(X)} - \varphi(X) \ln \frac{\varphi(X)}{p^*(X)} \right] = \frac{1}{Z_{p^*}} \sum_{X \subseteq \mathcal{X}} f(X) \left[ \ln \frac{f(X)}{p^*(X)} - g(X) \ln \frac{f(X)\, g(X)}{p^*(X)} \right]. \qquad (B1)$$

The normalization of probabilities gives

$$\sum_{X \subseteq \mathcal{X}} \big[ \varphi(X) - f(X) \big] = \sum_{X \subseteq \mathcal{X}} f(X) \big[ g(X) - 1 \big] = 0, \qquad (B2)$$

and the theorem condition (V) gives

$$\sum_{X \subseteq \mathcal{X}} \big[ \varphi(X) - f(X) \big]\, V(X) = 0. \qquad (V')$$

Given that $\alpha V(X) = \ln \frac{f(X)}{p^*(X)}$, we obtain from (V') and (B2)

$$\frac{1}{Z_{p^*}} \sum_{X \subseteq \mathcal{X}} \big[ \varphi(X) - f(X) \big] \ln \frac{f(X)}{p^*(X)} = \frac{1}{Z_{p^*}} \sum_{X \subseteq \mathcal{X}} f(X) \big[ g(X) - 1 \big] \ln \frac{f(X)}{p^*(X)} = 0. \qquad (B3)$$

Subtracting (B2) and (B3) from (B1), we obtain

$$H_{f p^*} - H_{\varphi p^*} = -\frac{1}{Z_{p^*}} \sum_{X \subseteq \mathcal{X}} f(X) \big[ g(X) \ln g(X) - g(X) + 1 \big].$$

By definition the function $f(X)$ is positive, and the quantity

$$g \ln g - g + 1 = \int_1^g \ln t \, dt$$

is non-negative for any positive $g$. Hence $H_{f p^*} - H_{\varphi p^*} \le 0$, i.e. the function $H_{\varphi p^*}$ is always not less than $H_{f p^*}$. The theorem is proved.
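The conclusion of the proof can be sanity-checked numerically: any perturbation of the Gibbs distribution that preserves both normalization and the mean $\langle V \rangle$ can only increase the relative entropy. A minimal sketch, in which the doublet of events, the toy value function, and the uniform $p^*$ are illustrative assumptions:

```python
from math import exp, log

# Subsets of a doublet {x, y}, indexed 0: empty set, 1: {x}, 2: {y}, 3: {x, y}
V = [0.0, 1.0, 1.0, 2.0]          # toy value function V(X) = |X|
p_star = [0.25] * 4               # uniform fixed distribution p*
beta = 0.7

w = [exp(-beta * v) * q for v, q in zip(V, p_star)]
p = [wi / sum(w) for wi in w]     # Gibbs eventological distribution

def H(p, q):
    """Relative entropy H_{p q} = sum_X p(X) ln(p(X)/q(X))."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Perturb along d = (0, +eps, -eps, 0): sum(d) = 0 and sum(d * V) = 0
# (since V[1] == V[2]), so p_pert keeps both normalization and <V>.
eps = 0.05
p_pert = [p[0], p[1] + eps, p[2] - eps, p[3]]

print(H(p, p_star) < H(p_pert, p_star))   # -> True
```

The inequality $g \ln g - g + 1 \ge 0$ used at the end of the proof is what makes the comparison one-sided for every admissible perturbation.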
2. Interpretations of the eventological H-theorem

When the relative entropy $H_{p p^*}$ of a physical system (with distribution $p$) relative to the environment (with distribution $p^*$) has a minimum value, in statistical thermodynamics the system is considered to be in equilibrium with the surrounding medium, and the decrease of $H_{p p^*}$ with time corresponds to approaching equilibrium with the given medium. (This principle of minimum relative entropy of the system, for a fixed level of entropy of the environment, is equivalent to the maximum entropy principle, the cornerstone of the second law of thermodynamics: "the increase in the entropy of the system as it approaches equilibrium". The change of the maximum to the minimum is explained by the formal difference in sign between the traditional definition of entropy, $-\sum_{X \subseteq \mathcal{X}} p(X) \ln p(X)$, and relative entropy, $\sum_{X \subseteq \mathcal{X}} p(X) \ln (p(X)/p^*(X))$.)

We use this physical analogy to construct a more general eventological model of the behavior of a rational subject, based on the idea of an equilibrium choice between perception and activity, to which the rational subject is doomed in the process of co-being. When the relative entropy $H_{p p^*}$ of a rational subject (with distribution $p$) relative to her/his own tastes and preferences (with distribution $p^*$) is minimal, in eventology the rational subject is considered to be in equilibrium with her/himself, and the decrease in relative entropy with time corresponds to the striving of the rational subject for balance with her/himself, "the second law of eventology".

Therefore, the proved eventological H-theorem serves as a mathematical excuse (mathematically no less convincing than the Boltzmann H-theorem is for the second law of thermodynamics) for what can be called "the second law of eventology", which justifies the use of Gibbs and anti-Gibbs distributions [2] of sets of events that minimize relative entropy as statistical models of the behavior of the rational subject, striving for an equilibrium eventological choice between perception and activity in various spheres of her/his co-being.

In the eventological H-theorem, as in its physical predecessor, it is assumed that we know the set-function of value $V(X), X \subseteq \mathcal{X}$, defined on subsets of events from $\mathcal{X}$, and the fixed eventological distribution $p^*(X), X \subseteq \mathcal{X}$, of the set of events $\mathcal{X}$. The range of considered eventological distributions of the set of events $\mathcal{X}$ is then limited to those distributions $\{p(X), X \subseteq \mathcal{X}\}$ that satisfy the restriction

$$\langle V \rangle = \sum_{X \subseteq \mathcal{X}} p(X)\, V(X).$$

In other words, these eventological distributions "keep" the $p$-mean value of the set-function of value $V$ at a fixed level $\langle V \rangle$:

$$\mathbf{E}_p(V) = \sum_{X \subseteq \mathcal{X}} p(X)\, V(X) = \langle V \rangle.$$

The theorem states that among these eventological distributions the relative entropy $H_{p p^*}$ reaches a minimum on Gibbs and anti-Gibbs eventological distributions of the form

$$\frac{p(X)}{p(\varnothing)} = \exp\{-\beta (V(X) - V(\varnothing))\}\, \frac{p^*(X)}{p^*(\varnothing)}, \quad X \subseteq \mathcal{X}, \qquad (G)$$

$$\frac{p(X)}{p(\varnothing)} = \exp\{\gamma (V(X) - V(\varnothing))\}\, \frac{p^*(X)}{p^*(\varnothing)}, \quad X \subseteq \mathcal{X}. \qquad (-G)$$

Relative entropy "measures" the deviation of one eventological distribution from another and reaches a minimum equal to zero when these distributions coincide. Therefore, the eventological H-theorem actually asserts that, among all eventological distributions which "keep" the mean value of the set-function of value $V$ at a given level, the Gibbs and anti-Gibbs eventological distributions lie "closest" to the fixed eventological distribution $p^*$.

Moreover, for the Gibbs and anti-Gibbs eventological distributions that minimize the relative entropy, the mean value $\langle V \rangle$ of the set-function $V$ is closely related to the relative entropy of these distributions relative to $p^*$.
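The equivalence of the normalized form of the theorem and the ratio form (G) can be checked numerically: renormalizing the ratios in (G) recovers the same distribution as dividing $\exp\{-\beta V(X)\}\, p^*(X)$ by $Z_{p^*}$. A sketch under illustrative assumptions (the subset values and the prior $p^*$ are invented for the example):

```python
from math import exp

# Subsets of a doublet, indexed 0: empty set, 1: {x}, 2: {y}, 3: {x, y}
V = [0.3, 1.0, 1.4, 2.1]             # hypothetical set-function of value
p_star = [0.4, 0.25, 0.25, 0.1]      # hypothetical fixed distribution p*
beta = 0.9

# Normalized form: p(X) = exp{-beta V(X)} p*(X) / Z_{p*}
w = [exp(-beta * v) * q for v, q in zip(V, p_star)]
p = [wi / sum(w) for wi in w]

# Ratio form (G): p(X)/p(empty) = exp{-beta (V(X) - V(empty))} p*(X)/p*(empty),
# then renormalize the ratios so that they sum to 1.
r = [exp(-beta * (v - V[0])) * q / p_star[0] for v, q in zip(V, p_star)]
p_from_ratios = [ri / sum(r) for ri in r]

print(max(abs(a - b) for a, b in zip(p, p_from_ratios)) < 1e-12)  # -> True
```

The ratios differ from the unnormalized weights only by the constant $\exp\{\beta V(\varnothing)\}/p^*(\varnothing)$, which cancels on renormalization; this is exactly why the $1/Z_{p^*}$-free form is equivalent.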
Indeed, from (G) and (−G) it follows that

$$V(X) = -\frac{1}{\beta} \ln \frac{p(X)}{p^*(X)} - \frac{1}{\beta} \ln \frac{p^*(\varnothing)}{p(\varnothing)} + V(\varnothing),$$

$$V(X) = \frac{1}{\gamma} \ln \frac{p(X)}{p^*(X)} + \frac{1}{\gamma} \ln \frac{p^*(\varnothing)}{p(\varnothing)} + V(\varnothing).$$

Therefore

$$\langle V \rangle = -\frac{1}{\beta} H_{p p^*} - \frac{1}{\beta} \ln \frac{p^*(\varnothing)}{p(\varnothing)} + V(\varnothing),$$

$$\langle V \rangle = \frac{1}{\gamma} H_{p p^*} + \frac{1}{\gamma} \ln \frac{p^*(\varnothing)}{p(\varnothing)} + V(\varnothing).$$

If we interpret:

• the set-function of value $V$ as a characteristic of the current "market" conjuncture, i.e. the "market" medium that surrounds the rational subject in the "market of perception and activity";

• the fixed eventological distribution $p^*$ as the past "market" experience of the rational subject;

• the sought eventological distribution $p$ as a result of the interaction of the past "market" experience of the rational subject with the current "market" environment,

then the Gibbs factor and/or anti-factor

$$\exp\{-\beta V(X)\}, \qquad \exp\{\gamma V(X)\}$$

should be interpreted as a conditional eventological distribution of the current behavior of a rational subject under the condition of her/his past "market" experience and the current "market" conjuncture.

Gibbs factor and anti-factor as conditional eventological distributions. The formulas for the Gibbs and anti-Gibbs eventological distributions containing the Gibbs factor and anti-factor can be viewed as formulas of conditional probability: $p(X) = p(X \mid Y)\, p^*(Y)$. For example,

$$p_{\downarrow}(X) = \exp\{-\beta(X, Y)\, V_{\downarrow}(X)\}\, p^*_{\downarrow}(Y),$$

$$p_{\uparrow}(X) = \exp\{\gamma(X, Y)\, V_{\uparrow}(X)\}\, p^*_{\uparrow}(Y).$$

In the language of the eventological H-theorem, this means that the conditional eventological distributions that minimize the relative entropy have the form of the Gibbs factor

$$p_{\downarrow}(X \mid Y) = \exp\{-\beta(X, Y)\, V_{\downarrow}(X)\}$$

for events of perception, and the form of the Gibbs anti-factor

$$p_{\uparrow}(X \mid Y) = \exp\{\gamma(X, Y)\, V_{\uparrow}(X)\}$$

for events of activity.
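The first identity relating $\langle V \rangle$ to $H_{p p^*}$ can be verified numerically in the Gibbs case. In the sketch below the four subset values and the fixed distribution $p^*$ are illustrative assumptions, not data from the paper:

```python
from math import exp, log

V = [0.0, 0.5, 1.5, 2.0]          # toy values V(X) on subsets: empty, {x}, {y}, {x, y}
p_star = [0.4, 0.3, 0.2, 0.1]     # hypothetical fixed distribution p*
beta = 1.2

w = [exp(-beta * v) * q for v, q in zip(V, p_star)]
p = [wi / sum(w) for wi in w]     # Gibbs eventological distribution

mean_V = sum(pi * vi for pi, vi in zip(p, V))                # <V> = E_p(V)
H = sum(pi * log(pi / qi) for pi, qi in zip(p, p_star))      # H_{p p*}

# <V> = -(1/beta) H_{p p*} - (1/beta) ln(p*(empty)/p(empty)) + V(empty)
rhs = -H / beta - log(p_star[0] / p[0]) / beta + V[0]
print(abs(mean_V - rhs) < 1e-12)  # -> True
```

The identity holds because $\ln(p/p^*) = -\beta V - \ln Z_{p^*}$ for the Gibbs distribution, so $H_{p p^*} = -\beta \langle V \rangle - \ln Z_{p^*}$, while $\ln(p^*(\varnothing)/p(\varnothing)) = \ln Z_{p^*} + \beta V(\varnothing)$; the $\ln Z_{p^*}$ and $V(\varnothing)$ terms cancel.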
We consider an example of the functions and eventological distributions that are present in the eventological H-theorem, for the monoplet of events $\mathcal{X} = \{x\}$:

• for events of perception:

$$V_{\downarrow}(\varnothing), \ V_{\downarrow}(x); \qquad \{p^*_{\downarrow}(\varnothing), p^*_{\downarrow}(x)\} = \{1 - p^*_{\downarrow}(x),\ p^*_{\downarrow}(x)\};$$

$$\langle V_{\downarrow} \rangle = V_{\downarrow}(\varnothing)\, p_{\downarrow}(\varnothing) + V_{\downarrow}(x)\, p_{\downarrow}(x),$$

$$\frac{p_{\downarrow}(x)}{p_{\downarrow}(\varnothing)} = \exp\{-\beta (V_{\downarrow}(x) - V_{\downarrow}(\varnothing))\}\, \frac{p^*_{\downarrow}(x)}{p^*_{\downarrow}(\varnothing)};$$

• for events of activity:

$$V_{\uparrow}(\varnothing), \ V_{\uparrow}(x); \qquad \{p^*_{\uparrow}(\varnothing), p^*_{\uparrow}(x)\} = \{1 - p^*_{\uparrow}(x),\ p^*_{\uparrow}(x)\};$$

$$\langle V_{\uparrow} \rangle = V_{\uparrow}(\varnothing)\, p_{\uparrow}(\varnothing) + V_{\uparrow}(x)\, p_{\uparrow}(x),$$

$$\frac{p_{\uparrow}(x)}{p_{\uparrow}(\varnothing)} = \exp\{\gamma (V_{\uparrow}(x) - V_{\uparrow}(\varnothing))\}\, \frac{p^*_{\uparrow}(x)}{p^*_{\uparrow}(\varnothing)}.$$

References

[1]
Boltzmann L. (1872) Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen.
Wiener Berichte, 66: 275–370.

[2]
Vorobyev O.Yu. (2007)
Eventology. — Krasnoyarsk: SFU, 435 p. (in Russian).

[3]
Shannon C.E. (1948) A Mathematical Theory of Communication.
Bell System Technical Journal, vol. 27, 379–423, 623–656.

[4]
Isihara A. (1971)
Statistical Physics. — New York, London: Academic Press.

[5]
Vorobyev A.O. (2002) Multicovariances and manypoint-dependent distributions of random sets.
Proceedings of the I All-Russian FAM'2002 Conference (Oleg Vorobyev ed.). Part I. — Krasnoyarsk: SFU, 21–24 (in Russian).

[6]
Vorobyev O.Yu. (2008) Multicovariances of events.