
Publication


Featured research published by Sangwoo Park.


IEEE Signal Processing Magazine | 2013

Gaussian Assumption: The Least Favorable but the Most Useful [Lecture Notes]

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe

The Gaussian is the most well-known and widely used distribution in fields such as engineering, statistics, and physics. One major reason the Gaussian distribution has become so prominent is the central limit theorem (CLT), together with the fact that the noise in numerous engineering systems is well captured by a Gaussian distribution. Moreover, features such as analytical tractability and the ease of generating other distributions from the Gaussian contributed further to its popularity. In particular, when no information about the distribution of the observations is available, the Gaussian assumption is the most conservative choice. This follows from the fact that the Gaussian distribution minimizes the Fisher information, whose inverse is the Cramér-Rao lower bound (CRLB); equivalently stated, the Gaussian distribution maximizes the CRLB. Therefore, any optimization based on the CRLB under the Gaussian assumption can be considered min-max optimal in the sense of minimizing the largest CRLB (see [1] and the references cited therein).
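The min-max role of the Gaussian can be checked numerically. The sketch below is illustrative and not from the paper; the distributions, integration grid, and bounds are chosen for the example. It compares the Fisher information of a Gaussian and a variance-matched Laplace distribution: the Gaussian attains the smaller value, hence the larger CRLB.

```python
import numpy as np

def fisher_information(pdf, dpdf, lo=-10.0, hi=10.0, n=200_001):
    """Numerically approximate J = integral of f'(x)^2 / f(x) dx."""
    x = np.linspace(lo, hi, n)
    integrand = dpdf(x) ** 2 / pdf(x)
    return np.sum(integrand) * (x[1] - x[0])

sigma = 1.0  # common standard deviation for the comparison

# Gaussian N(0, sigma^2): closed form J = 1 / sigma^2.
g_pdf = lambda x: np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
g_dpdf = lambda x: -(x / sigma**2) * g_pdf(x)

# Laplace with the same variance (scale b = sigma / sqrt(2)):
# closed form J = 1 / b^2 = 2 / sigma^2, twice the Gaussian value.
b = sigma / np.sqrt(2)
l_pdf = lambda x: np.exp(-np.abs(x) / b) / (2 * b)
l_dpdf = lambda x: -np.sign(x) / b * l_pdf(x)

J_gauss = fisher_information(g_pdf, g_dpdf)
J_laplace = fisher_information(l_pdf, l_dpdf)
print(J_gauss, J_laplace)  # Gaussian's Fisher information is the smaller one
```

Matching the variances is essential: the Gaussian minimizes Fisher information only within the class of distributions of a given second moment.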


IEEE Transactions on Information Theory | 2012

On the Equivalence Between Stein and De Bruijn Identities

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe

This paper focuses on illustrating 1) the equivalence between Stein's identity and de Bruijn's identity, and 2) two extensions of de Bruijn's identity. First, it is shown that Stein's identity is equivalent to de Bruijn's identity under additive noise channels with specific conditions. Second, for arbitrary but fixed input and noise distributions under additive noise channels, the first derivative of the differential entropy is expressed as a function of the posterior mean, and the second derivative of the differential entropy is expressed in terms of a function of the Fisher information. Several applications across a number of fields, such as signal processing and information theory, are presented to support the usefulness of the results developed in this paper.
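For reference, the two identities in question can be stated in their standard scalar forms (textbook versions, not quoted from the paper): Stein's identity for a Gaussian variable, and de Bruijn's identity for a Gaussian perturbation of an arbitrary input X.

```latex
% Stein's identity (Gaussian case): for Y ~ N(\mu, \sigma^2) and
% sufficiently smooth g,
\mathbb{E}\left[(Y - \mu)\, g(Y)\right] = \sigma^{2}\, \mathbb{E}\left[g'(Y)\right]

% de Bruijn's identity: for Y_t = X + \sqrt{t}\, Z with Z ~ N(0,1)
% independent of X, h the differential entropy, J the Fisher information,
\frac{\partial}{\partial t}\, h(Y_t) = \frac{1}{2}\, J(Y_t)
```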


IEEE Transactions on Information Theory | 2013

A Unifying Variational Perspective on Some Fundamental Information Theoretic Inequalities

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe

This paper proposes a unifying variational approach for proving and extending some fundamental information theoretic inequalities. Fundamental information theory results such as maximization of differential entropy, minimization of Fisher information (Cramér-Rao inequality), worst additive noise lemma, entropy power inequality, and extremal entropy inequality are interpreted as functional problems and proved within the framework of calculus of variations. Several applications and possible extensions of the proposed results are briefly mentioned.
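For context, the scalar forms of three of the results listed above are standard (textbook statements, not quoted from the paper); the first and third hold with equality when the variables are Gaussian.

```latex
% Maximum differential entropy under a variance constraint:
h(X) \le \tfrac{1}{2} \ln\left(2 \pi e\, \sigma^{2}\right),
\quad \operatorname{Var}(X) = \sigma^{2}

% Cramér–Rao inequality for an unbiased estimator \hat{\theta}:
\operatorname{Var}(\hat{\theta}) \ge J(\theta)^{-1}

% Entropy power inequality for independent X and Y:
e^{2 h(X + Y)} \ge e^{2 h(X)} + e^{2 h(Y)}
```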


international workshop on signal processing advances in wireless communications | 2012

New perspectives, extensions and applications of de Bruijn identity

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe

This paper focuses on the de Bruijn identity, a fundamental result that relates two important concepts: Fisher information and differential entropy. A novel relationship with Stein's identity and extensions of the de Bruijn identity are first presented. Then several applications of the de Bruijn identity in deriving the Bayesian Cramér-Rao lower bound (BCRLB), the Cramér-Rao lower bound (CRLB), and a new lower bound tighter than the BCRLB are presented. The paper concludes with an application of the de Bruijn identity to designing min-max optimal training sequences for channel estimation and synchronization in the presence of an unknown noise distribution.
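For a Gaussian input the de Bruijn identity can be verified in closed form, since X + √t·Z remains Gaussian. The sketch below is illustrative and not from the paper; it compares a finite-difference derivative of the differential entropy with half the Fisher information.

```python
import numpy as np

# For X ~ N(0, s2) and standard Gaussian Z, Y_t = X + sqrt(t)*Z is
# Gaussian with variance s2 + t, so both sides of de Bruijn's identity
#   d/dt h(Y_t) = (1/2) J(Y_t)
# have closed forms: h(Y_t) = 0.5*ln(2*pi*e*(s2+t)) and J(Y_t) = 1/(s2+t).
s2, t, eps = 1.0, 0.5, 1e-6

h = lambda v: 0.5 * np.log(2 * np.pi * np.e * v)       # entropy of N(0, v)
lhs = (h(s2 + t + eps) - h(s2 + t - eps)) / (2 * eps)  # finite difference
rhs = 0.5 / (s2 + t)                                   # (1/2) * Fisher info
print(lhs, rhs)  # the two sides agree
```

The Gaussian case is special only in that both sides are available analytically; the identity itself holds for arbitrary inputs X with finite differential entropy.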


international symposium on information theory | 2013

A variational perspective over an extremal entropy inequality

Sangwoo Park; Erchin Serpedin; Marwa Qaraqe

This paper proposes a novel variational approach for proving the extremal entropy inequality (EEI) [1]. Unlike previous proofs [1], [2], the proposed variational approach is simpler, and it requires neither the classical entropy power inequality (EPI) [1], [2] nor the channel enhancement technique [1]. The proposed approach is versatile and can be easily adapted to numerous other applications, such as proving or extending other fundamental information theoretic inequalities, including the EPI, the worst additive noise lemma, and the Cramér-Rao inequality.


international symposium on information theory | 2012

On the equivalence between Stein identity and de Bruijn identity

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe

This paper illustrates the equivalence between two fundamental results: Stein's identity, originally proposed in the statistical estimation realm, and de Bruijn's identity, considered for the first time in the information theory field. Two distinctive extensions of de Bruijn's identity are presented as well. For arbitrary but fixed input and noise distributions, the first-order derivative of the differential entropy is expressed by means of a function of the posterior mean, while the second-order derivative of the differential entropy is expressed in terms of a function of the Fisher information. Several applications exemplify the utility of the proposed results.


international symposium on information theory | 2012

An information theoretic perspective over an extremal entropy inequality

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe

This paper focuses on developing an alternative proof for an extremal entropy inequality, originally presented in [1]. The proposed alternative proof is based simply on the classical entropy power inequality and the data processing inequality. Compared with the proofs in [1], it is simpler, more direct, and information theoretic, and it presents the advantage of providing the structure of the optimal solution covariance matrix. The proposed proof might also serve as a novel method for addressing applications such as computing the vector Gaussian broadcast channel capacity, establishing a lower bound on the achievable rate of distributed source coding with a single quadratic distortion constraint, and characterizing the secrecy capacity of the Gaussian wire-tap channel.


arXiv: Information Theory | 2012

An Alternative Proof of an Extremal Inequality

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe


arXiv: Information Theory | 2012

An Alternative Proof of an Extremal Entropy Inequality

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe


IEEE Transactions on Information Theory | 2016

Correction to “A Unifying Variational Perspective on Some Fundamental Information Theoretic Inequalities”

Sangwoo Park; Erchin Serpedin; Khalid A. Qaraqe
