Publications


Featured research published by Adom Giffin.


arXiv: Data Analysis, Statistics and Probability | 2007

Updating Probabilities with Data and Moments

Adom Giffin; Ariel Caticha

We use the method of Maximum (relative) Entropy to process information in the form of observed data and moment constraints. The generic “canonical” form of the posterior distribution for the problem of simultaneous updating with data and moments is obtained. We discuss the general problem of non‐commuting constraints, when they should be processed sequentially and when simultaneously. As an illustration, the multinomial example of die tosses is solved in detail for two superficially similar but actually very different problems.
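
As a schematic sketch (our notation, offered as an assumption rather than a quotation of the paper), the canonical form referred to above can be written as follows: after observing data x' and imposing a moment constraint on f(theta), the updated distribution for the parameters takes an exponential-family form,

P_{\mathrm{new}}(\theta) \;\propto\; P_{\mathrm{old}}(\theta \mid x')\, e^{\beta f(\theta)},
\qquad
\int d\theta\, P_{\mathrm{new}}(\theta)\, f(\theta) = F,

where the Lagrange multiplier \beta is fixed by the moment constraint.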


Physica A: Statistical Mechanics and its Applications | 2009

From physics to economics: An econometric example using maximum relative entropy

Adom Giffin

Econophysics is based on the premise that some ideas and methods from physics can be applied to economic situations. We intend to show in this paper how a physics concept such as entropy can be applied to an economic problem. In so doing, we demonstrate how information in the form of observable data and moment constraints is introduced into the method of Maximum relative Entropy (MrE). A general example of updating with data and moments is shown. Two specific econometric examples are solved in detail, which can then be used as templates for real-world problems. A numerical example is compared to a large deviation solution, which illustrates some of the advantages of the MrE method.
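
A minimal numerical sketch of this kind of MrE update, with all numbers hypothetical and a uniform prior over die faces standing in for the econometric prior, is the exponential tilting above solved numerically for its Lagrange multiplier:

import numpy as np
from scipy.optimize import brentq

# Hypothetical illustration: update a uniform prior over die faces
# subject to a moment constraint <f> = F, using the canonical
# (exponential-family) form of the maximum relative entropy posterior.
faces = np.arange(1, 7)            # f(theta) = face value
prior = np.full(6, 1.0 / 6.0)      # uniform prior
F = 4.5                            # required mean (moment constraint)

def mean_gap(beta):
    """Mean of f under the tilted posterior p ∝ prior * exp(beta*f), minus F."""
    w = prior * np.exp(beta * faces)
    return np.dot(faces, w / w.sum()) - F

beta = brentq(mean_gap, -10.0, 10.0)   # solve <f> = F for beta
posterior = prior * np.exp(beta * faces)
posterior /= posterior.sum()
print(beta, posterior)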


Applied Mathematics and Computation | 2010

Reexamination of an information geometric construction of entropic indicators of complexity

Carlo Cafaro; Adom Giffin; S. A. Ali; D.-H. Kim

Information geometry and inductive inference methods can be used to model dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this article, we present a formal conceptual reexamination of the information geometric construction of entropic indicators of complexity for statistical models. Specifically, we present conceptual advances in the interpretation of the information geometric entropy (IGE), a statistical indicator of temporal complexity (chaoticity) defined on curved statistical manifolds underlying the probabilistic dynamics of physical systems.
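
For context, the curved statistical manifolds in question are Riemannian manifolds of probability distributions equipped with the standard Fisher-Rao information metric (a textbook definition, not specific to this paper):

g_{ij}(\theta) = \int dx\; p(x \mid \theta)\, \frac{\partial \log p(x \mid \theta)}{\partial \theta^{i}}\, \frac{\partial \log p(x \mid \theta)}{\partial \theta^{j}} .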


arXiv: Classical Physics | 2006

An Application of Reversible Entropic Dynamics on Curved Statistical Manifolds

Carlo Cafaro; Saleem A. Ali; Adom Giffin

Entropic Dynamics (ED) is a theoretical framework developed to investigate the possibility that laws of physics reflect laws of inference rather than laws of nature. In this work, a Reversible Entropic Dynamics (RED) model is considered. The geometric structure underlying the curved statistical manifold M_s is studied. The trajectories of this particular model are hyperbolic curves (geodesics) on M_s. Moreover, the stability of these geodesics on M_s is analyzed.
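
The geodesics mentioned here are, in the usual sense, solutions of the geodesic equation of the information metric on M_s (standard differential geometry, stated only for orientation):

\frac{d^{2}\theta^{k}}{d\tau^{2}} + \Gamma^{k}_{\;ij}\, \frac{d\theta^{i}}{d\tau}\, \frac{d\theta^{j}}{d\tau} = 0,

with \Gamma^{k}_{\;ij} the Christoffel symbols of the metric.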


Open Systems & Information Dynamics | 2012

Softening the complexity of entropic motion on curved statistical manifolds

Carlo Cafaro; Adom Giffin; Cosmo Lupo; Stefano Mancini

We study the information geometry and the entropic dynamics of a three-dimensional Gaussian statistical model. We then compare our analysis to that of a two-dimensional Gaussian statistical model obtained from the higher-dimensional model via introduction of an additional information constraint that resembles the quantum mechanical canonical minimum uncertainty relation. We show that the chaoticity (temporal complexity) of the two-dimensional Gaussian statistical model, quantified by means of the information geometric entropy (IGE) and the Jacobi vector field intensity, is softened with respect to the chaoticity of the three-dimensional Gaussian statistical model.
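
For reference, the Fisher-Rao line element of a single one-dimensional Gaussian p(x | \mu, \sigma), the basic building block of such Gaussian statistical models, is the textbook expression

ds^{2} = \frac{d\mu^{2}}{\sigma^{2}} + \frac{2\, d\sigma^{2}}{\sigma^{2}},

a metric of constant negative curvature; the higher-dimensional models discussed above are, under our reading, built from products of such factors subject to the additional constraints described in the abstract.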


Entropy | 2014

Simultaneous State and Parameter Estimation Using Maximum Relative Entropy with Nonhomogenous Differential Equation Constraints

Adom Giffin; Renaldas Urniezius

In this paper, we continue our efforts to show how maximum relative entropy (MrE) can be used as a universal updating algorithm. Here, our purpose is to tackle a joint state and parameter estimation problem where our system is nonlinear and in a non-equilibrium state, i.e., perturbed by varying external forces. Traditional parameter estimation can be performed by using filters, such as the extended Kalman filter (EKF). However, as shown with a toy example of a system with first-order non-homogeneous ordinary differential equations, assumptions made by the EKF algorithm (such as the Markov assumption) may not be valid. The problem can be solved with exponential smoothing, e.g., the exponentially weighted moving average (EWMA). Although this has been shown to produce acceptable filtering results in real exponential systems, it still cannot simultaneously estimate both the state and its parameters and has its own assumptions that are not always valid, for example when jump discontinuities exist. We show that by applying MrE as a filter, we can not only develop the closed-form solutions, but we can also infer the parameters of the differential equation simultaneously with the means. This is useful in real, physical systems, where we want to not only filter the noise from our measurements, but also simultaneously infer the parameters of the dynamics of a nonlinear and non-equilibrium system. Although there were many assumptions made throughout the paper to illustrate that EKF and exponential smoothing are special cases of MrE, we are not “constrained” by these assumptions. In other words, MrE is completely general and can be used in broader ways.
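
The exponential smoothing mentioned above has the familiar recursive form; a minimal sketch (signal and smoothing constant purely hypothetical) is:

import numpy as np

def ewma(measurements, alpha):
    """Exponentially weighted moving average: the simple exponential
    smoothing filter of the kind the paper compares against MrE."""
    estimate = measurements[0]
    smoothed = [estimate]
    for z in measurements[1:]:
        estimate = alpha * z + (1.0 - alpha) * estimate
        smoothed.append(estimate)
    return np.array(smoothed)

# Hypothetical usage: denoise samples of a decaying exponential signal.
t = np.linspace(0.0, 5.0, 200)
noisy = np.exp(-t) + 0.05 * np.random.randn(t.size)
filtered = ewma(noisy, alpha=0.1)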


Entropy | 2014

The Kalman Filter Revisited Using Maximum Relative Entropy

Adom Giffin; Renaldas Urniezius

In 1960, Rudolf E. Kalman created what is known as the Kalman filter, which is a way to estimate unknown variables from noisy measurements. The algorithm follows the logic that if the previous state of the system is known, it can be used as the best guess for the current state. This information is first applied, prior to any measurement, through the underlying dynamics of the system. Second, measurements of the unknown variables are taken. These two pieces of information are combined to determine the current state of the system. Bayesian inference is specifically designed to accommodate the problem of updating what we believe about the world based on partial or uncertain information. In this paper, we present a derivation of the general Bayesian filter, then adapt it for Markov systems. A simple example is shown for pedagogical purposes. We also show that by using the Kalman assumptions or “constraints”, we can arrive at the Kalman filter using the method of maximum (relative) entropy (MrE), which goes beyond Bayesian methods. Finally, we derive a generalized, nonlinear filter using MrE, of which the original Kalman filter is a special case. We further show that the variable relationship can be any function, and thus approximations such as the extended Kalman filter, the unscented Kalman filter and other Kalman variants are special cases as well.
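
For orientation, the linear, Gaussian special case that the paper recovers, the textbook Kalman filter, can be sketched in one dimension as follows (dynamics and noise levels below are hypothetical placeholders):

import numpy as np

def kalman_1d(zs, F=1.0, H=1.0, Q=1e-4, R=0.04, x0=0.0, P0=1.0):
    """Textbook scalar Kalman filter: predict with the dynamics,
    then correct with each noisy measurement z."""
    x, P, estimates = x0, P0, []
    for z in zs:
        # Predict: propagate state and uncertainty through the dynamics.
        x = F * x
        P = F * P * F + Q
        # Update: blend prediction and measurement via the Kalman gain.
        K = P * H / (H * P * H + R)
        x = x + K * (z - H * x)
        P = (1.0 - K * H) * P
        estimates.append(x)
    return np.array(estimates)

# Hypothetical usage: track a constant signal observed with noise.
zs = 1.0 + 0.2 * np.random.randn(100)
xs = kalman_1d(zs)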


arXiv: Mathematical Physics | 2012

On a differential geometric viewpoint of Jaynes' MaxEnt method and its quantum extension

S. A. Ali; Carlo Cafaro; Adom Giffin; Cosmo Lupo; Stefano Mancini

We present a differential geometric viewpoint of the quantum MaxEnt estimate of a density operator when only incomplete knowledge encoded in the expectation values of a set of quantum observables is available. Finally, the additional possibility of considering some prior bias towards a certain density operator (the prior) is taken into account and the unsolved issues with its quantum relative entropic inference criterion are pointed out.
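
In standard form (the usual von Neumann/Jaynes result, quoted here only as background), the quantum MaxEnt estimate compatible with expectation values \langle \hat{A}_{j} \rangle is

\hat{\rho}_{\mathrm{ME}} = \frac{1}{Z} \exp\!\Big(-\sum_{j} \lambda_{j} \hat{A}_{j}\Big),
\qquad
Z = \mathrm{Tr}\, \exp\!\Big(-\sum_{j} \lambda_{j} \hat{A}_{j}\Big),

with the Lagrange multipliers \lambda_{j} fixed by \mathrm{Tr}(\hat{\rho}_{\mathrm{ME}} \hat{A}_{j}) = \langle \hat{A}_{j} \rangle.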


Entropy | 2013

Local Softening of Information Geometric Indicators of Chaos in Statistical Modeling in the Presence of Quantum-Like Considerations

Adom Giffin; S. A. Ali; Carlo Cafaro

In a previous paper (C. Cafaro et al., 2012), we compared an uncorrelated 3D Gaussian statistical model to an uncorrelated 2D Gaussian statistical model obtained from the former model by introducing a constraint that resembles the quantum mechanical canonical minimum uncertainty relation. Analysis was completed by way of the information geometry and the entropic dynamics of each system. This analysis revealed that the chaoticity of the 2D Gaussian statistical model, quantified by means of the Information Geometric Entropy (IGE), is softened or weakened with respect to the chaoticity of the 3D Gaussian statistical model, due to the accessibility of more information. In this companion work, we further constrain the system in the context of a correlation constraint among the system’s micro-variables and show that the chaoticity is further weakened, but only locally. Finally, the physicality of the constraints is briefly discussed, particularly in the context of quantum entanglement.


Physica Scripta | 2012

Complexity characterization in a probabilistic approach to dynamical systems through information geometry and inductive inference

S. A. Ali; Carlo Cafaro; Adom Giffin; D. H. Kim

Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this article, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by use of statistical inductive inference and information geometry. We review the Maximum Relative Entropy (MrE) formalism and the theoretical structure of the information geometrodynamical approach to chaos (IGAC) on statistical manifolds. Special focus is devoted to the description of the roles played by the sectional curvature, the Jacobi field intensity and the information geometrodynamical entropy (IGE). These quantities serve as powerful information geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on the statistical manifold. Finally, the application of such information geometric techniques to several theoretical models is presented.
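
The Jacobi field intensity referred to above measures the spread of nearby geodesics; it is governed by the standard geodesic deviation (Jacobi) equation, quoted here only for background (sign conventions vary by author):

\frac{D^{2} J^{a}}{d\tau^{2}} + R^{a}{}_{bcd}\, \frac{d\theta^{b}}{d\tau}\, J^{c}\, \frac{d\theta^{d}}{d\tau} = 0,

where R^{a}{}_{bcd} is the Riemann curvature tensor of the statistical manifold.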

Collaboration


Dive into Adom Giffin's collaborations.

Top Co-Authors

Cosmo Lupo, University of Camerino
Philip Goyal, Perimeter Institute for Theoretical Physics
D. H. Kim, Ewha Womans University