James Melbourne
University of Minnesota
Publications
Featured research published by James Melbourne.
arXiv: Information Theory | 2017
Mokshay M. Madiman; James Melbourne; Peng Xu
The entropy power inequality, which plays a fundamental role in information theory and probability, may be seen as an analogue of the Brunn-Minkowski inequality. Motivated by this connection to Convex Geometry, we survey various recent developments on forward and reverse entropy power inequalities not just for the Shannon-Boltzmann entropy but also more generally for Rényi entropy. In the process, we discuss connections between the so-called functional (or integral) and probabilistic (or entropic) analogues of some classical inequalities in geometric functional analysis.
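For orientation, the two classical inequalities referred to above are, in their standard forms (stated here for background rather than quoted from the paper): the entropy power inequality
N(X + Y) \ge N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2 h(X)/n},
for independent random vectors X, Y in \mathbb{R}^n with differential entropy h, and the Brunn-Minkowski inequality
|A + B|^{1/n} \ge |A|^{1/n} + |B|^{1/n}
for nonempty compact sets A, B \subset \mathbb{R}^n, where |\cdot| denotes Lebesgue measure and A + B the Minkowski sum.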
international symposium on information theory | 2017
Peng Xu; James Melbourne; Mokshay M. Madiman
We develop a general notion of rearrangement for certain metric groups, and prove a Hardy-Littlewood type inequality. Combining this with a characterization of the extreme points of the set of probability measures with bounded densities with respect to a reference measure, we establish a general min-entropy inequality for convolutions. Special attention is paid to the integers where a min-entropy power inequality is conjectured and a partial result proved.
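As background (standard definitions rather than statements from the paper): for a random variable X with density f with respect to the reference measure, the min-entropy is H_\infty(X) = -\log \|f\|_\infty, and for X taking values in \mathbb{Z} it is H_\infty(X) = -\log \max_k \mathbb{P}(X = k). The classical Hardy-Littlewood rearrangement inequality on \mathbb{R}^n, of which the paper proves a group analogue, reads
\int f(x)\, g(x)\, dx \le \int f^*(x)\, g^*(x)\, dx
for nonnegative f, g, where f^* and g^* are their symmetric decreasing rearrangements.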
Probability Surveys | 2016
Sergey G. Bobkov; James Melbourne
Localization and dilation procedures are discussed for infinite-dimensional s-concave measures on abstract locally convex spaces (following Borell's hierarchy of hyperbolic measures).
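For context, the finite-dimensional case of Borell's hierarchy (a standard definition, not quoted from the paper) calls a measure \mu on \mathbb{R}^n s-concave, for a convexity parameter s, if
\mu(tA + (1-t)B) \ge \big( t\,\mu(A)^s + (1-t)\,\mu(B)^s \big)^{1/s}
for all t \in (0,1) and all Borel sets A, B with \mu(A)\mu(B) > 0; the limiting cases s = 0 and s = -\infty correspond to log-concave and to convex (hyperbolic) measures, respectively.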
international symposium on information theory | 2017
Peng Xu; James Melbourne; Mokshay M. Madiman
An optimal ∞-Rényi entropy power inequality is derived for d-dimensional random vectors. In fact, the authors establish a matrix ∞-EPI analogous to the generalization of the classical EPI established by Zamir and Feder. The result is achieved by demonstrating uniform distributions as extremizers of a certain class of ∞-Rényi entropy inequalities, and then putting forth a new rearrangement inequality for the ∞-Rényi entropy. Quantitative results are then derived as consequences of a new geometric inequality for uniform distributions on Euclidean balls.
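For reference, the ∞-Rényi entropy and its entropy power are standardly defined (up to normalization conventions that vary across the literature) for a random vector X in \mathbb{R}^d with bounded density f by
h_\infty(X) = -\log \|f\|_\infty, \qquad N_\infty(X) = e^{2 h_\infty(X)/d} = \|f\|_\infty^{-2/d},
so an ∞-Rényi entropy power inequality bounds N_\infty(X + Y) from below in terms of N_\infty(X) and N_\infty(Y) for independent X and Y.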
Entropy | 2018
Saurav Talukdar; Shreyas Bhaban; James Melbourne; Murti V. Salapaka
This article analyzes the effect of imperfections in physically realizable memory. Motivated by the realization of a bit as a Brownian particle within a double well potential, we investigate the energetics of an erasure protocol under a Gaussian mixture model. We obtain sharp quantitative entropy bounds that not only give rigorous justification for heuristics utilized in prior works, but also provide a guide toward the minimal scale at which an erasure protocol can be performed. We also compare the results obtained with the mean escape times from double wells to ensure reliability of the memory. The article quantifies the effect of the overlap of the two Gaussians on the loss of interpretability of the state of a one-bit memory, the heat that must be dissipated in partially successful erasures, and the reliability of information stored in a memory bit.
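For context, the classical benchmark for erasure energetics is Landauer's principle (a standard statement, not a result of this article): erasing one bit at temperature T dissipates at least
Q \ge k_B T \ln 2
of heat on average. The double well memory above is modelled by a two-component Gaussian mixture; an illustrative symmetric form (the article's exact parametrization may differ) is the density \tfrac{1}{2}\varphi_\sigma(x - a) + \tfrac{1}{2}\varphi_\sigma(x + a), where \varphi_\sigma is the centered Gaussian density with standard deviation \sigma and the overlap, governed by a/\sigma, measures how distinguishable the two memory states are.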
arXiv: Probability | 2017
Mokshay M. Madiman; James Melbourne; Peng Xu
international symposium on information theory | 2016
Peng Xu; James Melbourne; Mokshay M. Madiman
arXiv: Statistical Mechanics | 2018
Saurav Talukdar; Shreyas Bhaban; James Melbourne; Murti V. Salapaka
Doklady Mathematics | 2015
Sergey G. Bobkov; James Melbourne
IEEE Transactions on Information Theory | 2018
Arnaud Marsiglietti; James Melbourne