Publications


Featured research published by Mokshay M. Madiman.


IEEE Transactions on Information Theory | 2007

Generalized Entropy Power Inequalities and Monotonicity Properties of Information

Mokshay M. Madiman; Andrew R. Barron

New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both in the setting of independent and identically distributed (i.i.d.) summands and in the more general setting of independent summands with variance-standardized sums.
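
As a quick numerical illustration of the monotonicity statement above (this sketch is not from the paper; the Uniform(0,1) summands, the grid resolution, and the tolerance are arbitrary choices), one can discretize the density of S_n = X_1 + ... + X_n by repeated convolution and check that the entropy of the standardized sum, h(S_n/√n), is non-decreasing in n:

import numpy as np

def entropy_from_density(f, dx):
    # differential entropy -∫ f log f, approximated by a Riemann sum over positive bins
    mask = f > 0
    return -np.sum(f[mask] * np.log(f[mask])) * dx

dx = 1e-3
f1 = np.ones(int(1 / dx))                  # density of Uniform(0,1) sampled on a grid
f = f1.copy()
prev = None
for n in range(1, 5):
    if n > 1:
        f = np.convolve(f, f1) * dx        # density of S_n = X_1 + ... + X_n
    # h(S_n / sqrt(n)) = h(S_n) - (1/2) log n, since h(aX) = h(X) + log a
    h_std = entropy_from_density(f, dx) - 0.5 * np.log(n)
    print(f"n={n}: h(S_n/sqrt(n)) ~ {h_std:.4f}")
    if prev is not None:
        assert h_std >= prev - 1e-3        # non-decreasing, up to discretization error
    prev = h_std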


IEEE Transactions on Information Theory | 2010

Information Inequalities for Joint Distributions, With Interpretations and Applications

Mokshay M. Madiman; Prasad Tetali

Upper and lower bounds are obtained for the joint entropy of a collection of random variables in terms of an arbitrary collection of subset joint entropies. These inequalities generalize Shannon's chain rule for entropy as well as inequalities of Han, Fujishige, and Shearer. A duality between the upper and lower bounds for joint entropy is developed. All of these results are shown to be special cases of general, new results for submodular functions; thus, the inequalities presented constitute a richly structured class of Shannon-type inequalities. The new inequalities are applied to obtain new results in combinatorics, such as bounds on the number of independent sets in an arbitrary graph and the number of zero-error source-channel codes, as well as determinantal inequalities in matrix theory. A general inequality for relative entropies is also developed. Finally, connections of the results to the literature in economics, computer science, and physics are explored.
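
As a toy check of one special case generalized here (the snippet is not from the paper; the joint pmf is a random example), Shearer's inequality with the cover consisting of all pairs of indices says 2 H(X1,X2,X3) ≤ H(X1,X2) + H(X1,X3) + H(X2,X3):

import numpy as np

rng = np.random.default_rng(0)
p = rng.random((2, 3, 4))                   # an arbitrary joint pmf for (X1, X2, X3)
p /= p.sum()

def H(pmf):
    q = pmf[pmf > 0]
    return -np.sum(q * np.log2(q))          # Shannon entropy in bits

H123 = H(p)
H12, H13, H23 = H(p.sum(axis=2)), H(p.sum(axis=1)), H(p.sum(axis=0))
print(2 * H123, "<=", H12 + H13 + H23)
assert 2 * H123 <= H12 + H13 + H23 + 1e-9   # Shearer's inequality for the cover of all pairs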


Information Theory Workshop | 2008

On the entropy of sums

Mokshay M. Madiman

It is shown that the entropy of a sum of independent random vectors is a submodular set function, and upper bounds on the entropy of sums are obtained as a result in both discrete and continuous settings. These inequalities complement the lower bounds provided by the entropy power inequalities of Madiman and Barron (2007). As applications, new inequalities for the determinants of sums of positive-definite matrices are presented.
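
A minimal numerical check of the discrete submodularity statement (this sketch is not from the paper; the pmfs are arbitrary choices): for independent X, Y, Z one has H(X+Y+Z) + H(Y) ≤ H(X+Y) + H(Y+Z), which can be seen from the data processing inequality applied to the chain X → X+Y → X+Y+Z.

import numpy as np

def H(p):
    q = p[p > 0]
    return -np.sum(q * np.log2(q))          # Shannon entropy in bits

pX = np.array([0.5, 0.3, 0.2])              # arbitrary pmf of X on {0, 1, 2}
pY = np.array([0.1, 0.6, 0.3])              # arbitrary pmf of Y on {0, 1, 2}
pZ = np.array([0.7, 0.2, 0.1])              # arbitrary pmf of Z on {0, 1, 2}

pXY = np.convolve(pX, pY)                   # independence: pmf of X+Y is the convolution
pYZ = np.convolve(pY, pZ)
pXYZ = np.convolve(pXY, pZ)

print(H(pXYZ) + H(pY), "<=", H(pXY) + H(pYZ))
assert H(pXYZ) + H(pY) <= H(pXY) + H(pYZ) + 1e-9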


Journal of Functional Analysis | 2012

Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures

Sergey G. Bobkov; Mokshay M. Madiman

We develop a reverse entropy power inequality for convex measures, which may be seen as an affine-geometric inverse of the entropy power inequality of Shannon and Stam. The specialization of this inequality to log-concave measures may be seen as a version of Milman's reverse Brunn-Minkowski inequality. The proof relies on a demonstration of new relationships between the entropy of high dimensional random vectors and the volume of convex bodies, and on a study of effective supports of convex measures, both of which are of independent interest, as well as on Milman's deep technology of M-ellipsoids and on certain information-theoretic inequalities. As a by-product, we also give a […]
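
Schematically, and stated here from memory only for the log-concave case (so the precise hypotheses should be checked against the paper): there is an absolute constant C such that for any independent log-concave random vectors X and Y in R^n one can find volume-preserving linear maps u1 and u2 with N(u1(X) + u2(Y)) ≤ C (N(X) + N(Y)), where N(X) = exp(2h(X)/n) denotes the entropy power. This mirrors Milman's reverse Brunn-Minkowski inequality, in which volumes of convex bodies play the role that entropy powers play here.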


IEEE Transactions on Information Theory | 2011

The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions

Sergey G. Bobkov; Mokshay M. Madiman

The entropy per coordinate in a log-concave random vector of any dimension with given density at the mode is shown to have a range of just 1. Uniform distributions on convex bodies are at the lower end of this range, the distribution with i.i.d. exponentially distributed coordinates is at the upper end, and the normal is exactly in the middle. Thus, in terms of the amount of randomness as measured by entropy per coordinate, any log-concave random vector of any dimension contains randomness that differs from that in the normal random variable with the same maximal density value by at most 1/2. As applications, we obtain an information-theoretic formulation of the famous hyperplane conjecture in convex geometry, entropy bounds for certain infinitely divisible distributions, and quantitative estimates for the behavior of the density at the mode on convolution. More generally, one may consider so-called convex or hyperbolic probability measures on Euclidean spaces; we give new constraints on entropy per coordinate for this class of measures, which generalize our results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.
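
The three extremal cases described above can be checked directly from closed-form entropies (the snippet below is illustrative, not from the paper): for a density f on R^n with maximum value ||f||_max, the gap h(X)/n + (1/n) log ||f||_max is 0 for a uniform distribution on a convex body, 1/2 for the standard normal, and 1 for i.i.d. exponential coordinates (all in nats).

import numpy as np

examples = {
    # name: (entropy per coordinate in nats, maximum density per coordinate)
    "Uniform(0,1)^n":        (0.0,                            1.0),
    "standard normal":       (0.5 * np.log(2 * np.pi * np.e), 1.0 / np.sqrt(2 * np.pi)),
    "i.i.d. Exponential(1)": (1.0,                            1.0),
}

for name, (h_per_coord, fmax_per_coord) in examples.items():
    gap = h_per_coord + np.log(fmax_per_coord)   # h(X)/n + (1/n) log ||f||_max
    print(f"{name:>22}: gap = {gap:.3f}")        # expected values: 0.0, 0.5, 1.0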


IEEE Transactions on Information Theory | 2014

Beyond the Entropy Power Inequality, via Rearrangements

Liyao Wang; Mokshay M. Madiman

A lower bound on the Rényi differential entropy of a sum of independent random vectors is demonstrated in terms of rearrangements. For the special case of Boltzmann-Shannon entropy, this lower bound is better than that given by the entropy power inequality. Several applications are discussed, including a new proof of the classical entropy power inequality and an entropy inequality involving symmetrization of Lévy processes.
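
Paraphrasing the main bound from memory (so the exact conditions should be checked against the paper): if X1*, ..., Xn* are independent random vectors whose densities are the spherically symmetric decreasing rearrangements of the densities of X1, ..., Xn, then h(X1 + ... + Xn) ≥ h(X1* + ... + Xn*), with an analogous statement for Rényi entropies of other orders.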


Annals of Probability | 2011

Concentration of the information in data with log-concave distributions

Sergey G. Bobkov; Mokshay M. Madiman

[…] on which the distribution of X is supported. In this case, h̃(X) is essentially the number of bits needed to represent X by a coding scheme that minimizes average code length ([Sha48]). In the continuous case (with reference measure dx), one may still call h̃(X) the information content even though the coding interpretation no longer holds. In statistics, one may think of the information content as the log-likelihood function. The average value of the information content of X is known more commonly as the entropy. Indeed, the entropy of X is defined by h(X) = −∫ f(x) log f(x) dx = −E log f(X).
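
As an illustration of the concentration described in the title (a Monte Carlo sketch, not from the paper; the Gaussian example and the sample sizes are arbitrary choices): for a standard normal vector X in R^n, which is log-concave, the per-coordinate information content −log f(X) / n clusters ever more tightly around the entropy per coordinate h(X)/n as n grows.

import numpy as np

rng = np.random.default_rng(0)
h_per_coord = 0.5 * np.log(2 * np.pi * np.e)             # h(X)/n for N(0, I_n), in nats
for n in (10, 100, 1000):
    X = rng.standard_normal((5000, n))                   # 5000 samples from N(0, I_n)
    info = 0.5 * n * np.log(2 * np.pi) + 0.5 * (X ** 2).sum(axis=1)   # -log f(X)
    per_coord = info / n
    print(f"n={n:4d}: mean={per_coord.mean():.4f} (target {h_per_coord:.4f}), "
          f"std={per_coord.std():.4f}")                  # std decays like 1/sqrt(2n)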


IEEE Transactions on Information Theory | 2014

Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information

Ioannis Kontoyiannis; Mokshay M. Madiman

The sumset and inverse sumset theories of Freiman, Plünnecke, and Ruzsa give bounds connecting the cardinality of the sumset A + B = {a + b; a ∈ A, b ∈ B} of two discrete sets A, B to the cardinalities (or the finer structure) of the original sets A, B. For example, the sum-difference bound of Ruzsa states that |A + B| |A| |B| ≤ |A - B|³, where the difference set A - B = {a - b; a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) of a continuous random variable X as (the logarithm of) the size of the effective support of X, the main contribution of this paper is a series of natural information-theoretic analogs for these results. For example, the Ruzsa sum-difference bound becomes the new inequality, h(X+Y)+h(X)+h(Y) ≤ 3h(X-Y), for any pair of independent continuous random variables X and Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. In addition, we give a differential entropy version of a Freiman-type inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, in the continuous case many of the corresponding discrete proofs fail, in many cases requiring substantially new proof strategies. We find that the basic property that naturally replaces the discrete functional submodularity is the data processing property of mutual information.
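
A toy verification of the quoted entropy sum-difference inequality (not from the paper; the Gaussian variances are arbitrary choices), using closed-form differential entropies of independent zero-mean Gaussians, for which X - Y has the same law as X + Y:

import numpy as np

def h_gauss(var):
    # differential entropy (nats) of a one-dimensional N(0, var)
    return 0.5 * np.log(2 * np.pi * np.e * var)

for a, b in [(1.0, 1.0), (0.1, 5.0), (3.0, 0.25)]:
    lhs = h_gauss(a + b) + h_gauss(a) + h_gauss(b)   # h(X+Y) + h(X) + h(Y)
    rhs = 3 * h_gauss(a + b)                         # 3 h(X-Y); here X-Y is N(0, a+b)
    print(f"a={a}, b={b}: {lhs:.3f} <= {rhs:.3f}")
    assert lhs <= rhs + 1e-12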


Eurasip Journal on Wireless Communications and Networking | 2008

Cores of cooperative games in information theory

Mokshay M. Madiman

Cores of cooperative games are ubiquitous in information theory and arise most frequently in the characterization of fundamental limits in various scenarios involving multiple users. Examples include classical settings in network information theory such as Slepian-Wolf source coding and multiple access channels, classical settings in statistics such as robust hypothesis testing, and new settings at the intersection of networking and statistics such as distributed estimation problems for sensor networks. Cooperative game theory allows one to understand aspects of all these problems from a fresh and unifying perspective that treats users as players in a game, sometimes leading to new insights. At the heart of these analyses are fundamental dualities that have been long studied in the context of cooperative games; for information theoretic purposes, these are dualities between information inequalities on the one hand and properties of rate, capacity, or other resource allocation regions on the other.
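
To make the duality concrete in one classical setting mentioned above (the sketch below is not from the paper; the joint pmf and the particular rate allocation are illustrative choices), one can build the Slepian-Wolf constraints, sum of R_i over i ∈ S ≥ H(X_S | X_{S^c}) for every nonempty S, for three sources and check that a chain-rule rate allocation satisfies every constraint, i.e. it is an admissible, core-type point of the region:

from itertools import combinations
import numpy as np

rng = np.random.default_rng(1)
p = rng.random((2, 2, 3))                   # an arbitrary joint pmf for (X1, X2, X3)
p /= p.sum()
n = 3

def H_marg(axes_keep):
    # entropy (bits) of the marginal of p on the given coordinate axes
    drop = tuple(i for i in range(n) if i not in axes_keep)
    m = p.sum(axis=drop) if drop else p
    q = m[m > 0]
    return -np.sum(q * np.log2(q))

H_all = H_marg(tuple(range(n)))

def H_cond(S):
    # H(X_S | X_{S^c}) = H(X_1,...,X_n) - H(X_{S^c})
    Sc = tuple(i for i in range(n) if i not in S)
    return H_all - (H_marg(Sc) if Sc else 0.0)

# chain-rule ("corner point") allocation: R_i = H(X_i | X_{i+1}, ..., X_n)
R = [H_marg(tuple(range(i, n))) - (H_marg(tuple(range(i + 1, n))) if i + 1 < n else 0.0)
     for i in range(n)]

# every Slepian-Wolf constraint holds, so R lies in the rate region
for k in range(1, n + 1):
    for S in combinations(range(n), k):
        assert sum(R[i] for i in S) >= H_cond(S) - 1e-9
print("rates:", [round(r, 3) for r in R], " total:", round(sum(R), 3), "=", round(H_all, 3))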


arXiv: Information Theory | 2017

Forward and Reverse Entropy Power Inequalities in Convex Geometry

Mokshay M. Madiman; James Melbourne; Peng Xu

The entropy power inequality, which plays a fundamental role in information theory and probability, may be seen as an analogue of the Brunn-Minkowski inequality. Motivated by this connection to convex geometry, we survey various recent developments on forward and reverse entropy power inequalities, not just for the Shannon-Boltzmann entropy but also more generally for Rényi entropy. In the process, we discuss connections between the so-called functional (or integral) and probabilistic (or entropic) analogues of some classical inequalities in geometric functional analysis.

Collaboration


Dive into Mokshay M. Madiman's collaborations.

Top Co-Authors


Ioannis Kontoyiannis

Athens University of Economics and Business


Peng Xu

University of Delaware


Jiange Li

University of Delaware


Prasad Tetali

Georgia Institute of Technology
