Publication


Featured research published by Sergey G. Bobkov.


Geometric and Functional Analysis | 2000

From Brunn-Minkowski to Brascamp-Lieb and to logarithmic Sobolev inequalities

Sergey G. Bobkov; Michel Ledoux

We develop several applications of the Brunn-Minkowski inequality in the Prékopa-Leindler form. In particular, we show that an argument of B. Maurey may be adapted to deduce from the Prékopa-Leindler theorem the Brascamp-Lieb inequality for strictly convex potentials. We deduce similarly the logarithmic Sobolev inequality for uniformly convex potentials, for which we deal more generally with arbitrary norms and obtain some new results in this context. Applications to transportation cost and to concentration on uniformly convex bodies complete the exposition.
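
For orientation, here is the Prékopa-Leindler inequality in its standard textbook form (a sketch for the reader, not a restatement of the paper's results): for λ in (0,1) and non-negative measurable functions f, g, h on R^n satisfying h(λx + (1−λ)y) ≥ f(x)^λ g(y)^{1−λ} for all x, y,

\[
\int_{\mathbb{R}^n} h \;\ge\; \Big(\int_{\mathbb{R}^n} f\Big)^{\lambda} \Big(\int_{\mathbb{R}^n} g\Big)^{1-\lambda}.
\]

It is from this functional form of the Brunn-Minkowski inequality that the Brascamp-Lieb and logarithmic Sobolev inequalities are deduced in the paper.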


Annals of Probability | 2009

WEIGHTED POINCARÉ-TYPE INEQUALITIES FOR CAUCHY AND OTHER CONVEX MEASURES

Sergey G. Bobkov; Michel Ledoux

Brascamp-Lieb-type, weighted Poincaré-type and related analytic inequalities are studied for multidimensional Cauchy distributions and more general κ-concave probability measures (in the hierarchy of convex measures). In analogy with the limiting (infinite-dimensional, log-concave) Gaussian model, the weighted inequalities fully describe the measure concentration and large deviation properties of this family of measures. Cheeger-type isoperimetric inequalities are investigated similarly, giving rise to a common weight in the class of concave probability measures under consideration.
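
As a schematic illustration only (the admissible parameters and constants are those of the paper), the weighted Poincaré-type inequalities in question have the shape

\[
\operatorname{Var}_{\mu}(f) \;\le\; C \int_{\mathbb{R}^n} (1+|x|^2)\, |\nabla f(x)|^2 \, d\mu(x),
\]

where μ is a multidimensional Cauchy (or, more generally, κ-concave) measure: the constant weight of the Gaussian Poincaré inequality is replaced by a weight growing with |x|.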


Journal of Functional Analysis | 2012

Reverse Brunn-Minkowski and reverse entropy power inequalities for convex measures

Sergey G. Bobkov; Mokshay M. Madiman

We develop a reverse entropy power inequality for convex measures, which may be seen as an affine-geometric inverse of the entropy power inequality of Shannon and Stam. The specialization of this inequality to log-concave measures may be seen as a version of Milman's reverse Brunn-Minkowski inequality. The proof relies on a demonstration of new relationships between the entropy of high-dimensional random vectors and the volume of convex bodies, and on a study of effective supports of convex measures, both of which are of independent interest, as well as on Milman's deep technology of M-ellipsoids and on certain information-theoretic inequalities. As a by-product, we also give a …
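
For reference, the entropy power inequality of Shannon and Stam being reversed here states that for independent random vectors X and Y in R^n with finite differential entropies,

\[
N(X+Y) \;\ge\; N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2h(X)/n},
\]

where h denotes differential entropy. Roughly speaking, the reverse inequality of the paper bounds N(X+Y) from above by a universal constant times N(X)+N(Y) after the summands are put in a suitable volume-preserving position; the precise formulation is the paper's.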


IEEE Transactions on Information Theory | 2011

The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions

Sergey G. Bobkov; Mokshay M. Madiman

The entropy per coordinate in a log-concave random vector of any dimension with given density at the mode is shown to have a range of just 1. Uniform distributions on convex bodies are at the lower end of this range, the distribution with i.i.d. exponentially distributed coordinates is at the upper end, and the normal is exactly in the middle. Thus, in terms of the amount of randomness as measured by entropy per coordinate, any log-concave random vector of any dimension contains randomness that differs from that in the normal random variable with the same maximal density value by at most 1/2. As applications, we obtain an information-theoretic formulation of the famous hyperplane conjecture in convex geometry, entropy bounds for certain infinitely divisible distributions, and quantitative estimates for the behavior of the density at the mode on convolution. More generally, one may consider so-called convex or hyperbolic probability measures on Euclidean spaces; we give new constraints on entropy per coordinate for this class of measures, which generalize our results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.
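
In symbols, and up to the normalization chosen in the paper (this is a sketch of the statement summarized above, not a quotation), the result says that for a log-concave random vector X in R^n with density f,

\[
0 \;\le\; \frac{h(X)}{n} \;-\; \log \|f\|_{\infty}^{-1/n} \;\le\; 1,
\]

with the left end attained by uniform distributions on convex bodies, the right end by i.i.d. exponential coordinates, and the value 1/2 by the normal distribution.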


Archive | 2003

On the Central Limit Property of Convex Bodies

Sergey G. Bobkov; Alexander Koldobsky

For isotropic convex bodies K in R^n with isotropic constant L_K, we study the rate of convergence, as n goes to infinity, of the average volume of sections of K to the Gaussian density on the line with variance L_K^2.
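
Concretely, in a schematic rendering (the exact averaging and error estimates are those of the paper), the quantity studied is how close

\[
\int_{S^{n-1}} \operatorname{vol}_{n-1}\big( K \cap \{ x : \langle x, \theta \rangle = t \} \big)\, d\sigma(\theta)
\qquad \text{is to} \qquad
\frac{1}{\sqrt{2\pi}\, L_K}\, e^{-t^2 / (2 L_K^2)}
\]

uniformly in t, as the dimension n tends to infinity.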


Annals of Probability | 2013

Rate of convergence and Edgeworth-type expansion in the entropic central limit theorem

Sergey G. Bobkov; G. P. Chistyakov; F. Götze

An Edgeworth-type expansion is established for the entropy distance to the class of normal distributions of sums of i.i.d. random variables or vectors, satisfying minimal moment conditions.
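
For context, a sketch of the quantities involved (standard definitions, not a quotation from the paper): writing Z_n = (X_1 + ... + X_n)/√n for normalized sums of i.i.d. summands with finite variance, the entropy distance to normality is

\[
D(Z_n) \;=\; h(Z) - h(Z_n),
\]

where Z is a normal random variable with the same mean and variance as Z_n and h denotes differential entropy; the paper expands D(Z_n) in powers of 1/n, in the spirit of classical Edgeworth expansions, under minimal moment conditions.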


Bernoulli | 2001

On Gaussian and Bernoulli covariance representations

Sergey G. Bobkov; F. Götze; Christian Houdré

We discuss several applications, to large deviations for smooth functions of Gaussian random vectors, of a covariance representation in Gauss space. The existence of this type of representation characterizes Gaussian measures. New representations for Bernoulli measures are also derived, recovering some known inequalities.
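
A standard instance of the kind of covariance representation alluded to (the classical Gaussian interpolation formula; the paper's version and its Bernoulli analogues are more general): for independent standard Gaussian vectors X and Y and smooth functions f, g with suitable integrability,

\[
\operatorname{Cov}\big( f(X), g(X) \big) \;=\; \int_0^1 \mathbb{E}\,\Big\langle \nabla f(X),\, \nabla g\big( t X + \sqrt{1-t^2}\, Y \big) \Big\rangle \, dt .
\]

Large deviation bounds for smooth functions of Gaussian vectors are obtained in the paper from representations of this type.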


Annals of Probability | 2011

Concentration of the information in data with log-concave distributions

Sergey G. Bobkov; Mokshay M. Madiman

… on which the distribution of X is supported. In this case, h̃(X) is essentially the number of bits needed to represent X by a coding scheme that minimizes average code length ([Sha48]). In the continuous case (with reference measure dx), one may still call h̃(X) the information content even though the coding interpretation no longer holds. In statistics, one may think of the information content as the log-likelihood function. The average value of the information content of X is known more commonly as the entropy. Indeed, the entropy of X is defined by h(X) = −∫ f(x) log f(x) dx = −E log f(X).
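
In this notation, the "concentration of the information" in the title refers to the fact that, for log-concave X in high dimension, the information content −log f(X) stays close to its mean h(X). Schematically (a sketch of the type of bound involved; the precise statement and constants are the paper's),

\[
\operatorname{Var}\big( -\log f(X) \big) \;\le\; C\, n
\]

for a dimension-free constant C, so that −log f(X) = h(X) + O(√n) with high probability.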


IEEE Transactions on Information Theory | 2015

Entropy Power Inequality for the Rényi Entropy

Sergey G. Bobkov; G. P. Chistyakov

The classical entropy power inequality is extended to the Rényi entropy. We also discuss the question of the existence of the entropy for sums of independent random variables.
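
For completeness, the Rényi entropy of order r > 0, r ≠ 1, of a random vector X in R^n with density f is (standard definition)

\[
h_r(X) \;=\; \frac{1}{1-r}\, \log \int_{\mathbb{R}^n} f(x)^r \, dx,
\]

which recovers the Shannon differential entropy h(X) in the limit r → 1; the paper extends the classical entropy power inequality to these functionals.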


Archive | 2003

Large Deviations of Typical Linear Functionals on a Convex Body with Unconditional Basis

Sergey G. Bobkov; Fedor Nazarov

We study large deviations of linear functionals on an isotropic convex set with unconditional basis. It is shown that suitably normalized l_1-balls play the role of extremal bodies.
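
Schematically (normalizations as in the paper), for the uniform probability measure μ_K on an isotropic convex body K with unconditional basis and a direction θ on the unit sphere, the object of study is the large-deviation behavior of the tail

\[
\mu_K\big\{ x \in K : \langle x, \theta \rangle \ge t \big\}, \qquad t \to \infty,
\]

for "typical" directions θ, with suitably normalized l_1-balls exhibiting the extremal behavior.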

Collaboration


Dive into Sergey G. Bobkov's collaborations.

Top Co-Authors

Prasad Tetali

Georgia Institute of Technology

Michel Ledoux

Institut Universitaire de France

Christian Houdré

Georgia Institute of Technology

Arnaud Marsiglietti

California Institute of Technology
