
Publication


Featured research published by Yaming Yu.


Journal of Computational and Graphical Statistics | 2011

To Center or Not to Center: That Is Not the Question—An Ancillarity–Sufficiency Interweaving Strategy (ASIS) for Boosting MCMC Efficiency

Yaming Yu; Xiao-Li Meng

For a broad class of multilevel models, there exist two well-known competing parameterizations, the centered parameterization (CP) and the non-centered parameterization (NCP), for effective MCMC implementation. Much literature has been devoted to the questions of when to use which and how to compromise between them via partial CP/NCP. This article introduces an alternative strategy for boosting MCMC efficiency via simply interweaving—but not alternating—the two parameterizations. This strategy has the surprising property that failure of both the CP and NCP chains to converge geometrically does not prevent the interweaving algorithm from doing so. It achieves this seemingly magical property by taking advantage of the discordance of the two parameterizations, namely, the sufficiency of CP and the ancillarity of NCP, to substantially reduce the Markovian dependence, especially when the original CP and NCP form a “beauty and beast” pair (i.e., when one chain mixes far more rapidly than the other). The ancillarity–sufficiency reformulation of the CP–NCP dichotomy allows us to borrow insight from the well-known Basu’s theorem on the independence of (complete) sufficient and ancillary statistics, albeit a Bayesian version of Basu’s theorem is currently lacking. To demonstrate the competitiveness and versatility of this ancillarity–sufficiency interweaving strategy (ASIS) for real-world problems, we apply it to fit (1) a Cox process model for detecting changes in source intensity of photon counts observed by the Chandra X-ray telescope from a (candidate) neutron/quark star, which was the problem that motivated the ASIS strategy as it defeated other methods we initially tried; (2) a probit model for predicting latent membranous lupus nephritis; and (3) an interval-censored normal model for studying the lifetime of fluorescent lights. 
A bevy of open questions are presented, from the mysterious but exceedingly suggestive connections between ASIS and fiducial/structural inferences to nested ASIS for further boosting MCMC efficiency. This article has supplementary material online.
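The interweaving recipe can be sketched on a toy Gaussian hierarchical model (a hypothetical illustration, not one of the paper's three applications): y_j ~ N(theta_j, 1), theta_j ~ N(mu, V) with V known and a flat prior on mu. Each sweep draws theta in the centered parameterization, draws mu given theta (the sufficiency side), reparameterizes to the ancillary residuals theta_tilde = theta - mu, and redraws mu given theta_tilde and the data (the ancillarity side).

```python
import numpy as np

def asis_gibbs(y, V=1.0, n_iter=5000, seed=0):
    """ASIS-style sampler for the toy model y_j ~ N(theta_j, 1),
    theta_j ~ N(mu, V) with V known and a flat prior on mu.
    One sweep interweaves the centered (CP) and non-centered (NCP)
    parameterizations instead of alternating between them."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    J = len(y)
    mu = y.mean()
    mus = np.empty(n_iter)
    for t in range(n_iter):
        # CP: draw theta_j | y, mu (posterior precision 1 + 1/V)
        prec = 1.0 + 1.0 / V
        theta = rng.normal((y + mu / V) / prec, np.sqrt(1.0 / prec))
        # CP (sufficiency side): draw mu | theta
        mu_cp = rng.normal(theta.mean(), np.sqrt(V / J))
        # reparameterize to the ancillary residuals
        theta_tilde = theta - mu_cp
        # NCP (ancillarity side): redraw mu | theta_tilde, y
        mu = rng.normal((y - theta_tilde).mean(), np.sqrt(1.0 / J))
        mus[t] = mu
    return mus
```

With a flat prior the marginal posterior of mu is N(mean(y), (1 + V)/J), so the chain's long-run average should track the sample mean of y, which gives a quick correctness check.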


Annals of Statistics | 2010

Monotonic convergence of a general algorithm for computing optimal designs

Yaming Yu

Monotonic convergence is established for a general class of multiplicative algorithms introduced by Silvey, Titterington and Torsney [Comm. Statist. Theory Methods 14 (1978) 1379--1389] for computing optimal designs. A conjecture of Titterington [Appl. Stat. 27 (1978) 227--234] is confirmed as a consequence. Optimal designs for logistic regression are used as an illustration.
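A minimal sketch of a multiplicative update of this type for the D-criterion (variable names are illustrative): each design weight is rescaled by its normalized variance function d_i(w)/m, which leaves the weights summing to one because the weighted average of d_i(w) always equals m.

```python
import numpy as np

def multiplicative_d_optimal(X, n_iter=500):
    """Multiplicative algorithm for an approximate D-optimal design on the
    rows of X: w_i <- w_i * d_i(w) / m, where d_i(w) = x_i' M(w)^{-1} x_i
    is the variance function and m the number of parameters. Since
    sum_i w_i * d_i(w) = m, the update keeps the weights normalized."""
    n, m = X.shape
    w = np.full(n, 1.0 / n)                 # start from the uniform design
    for _ in range(n_iter):
        M = X.T @ (w[:, None] * X)          # information matrix M(w)
        d = np.einsum('ij,jk,ik->i', X, np.linalg.inv(M), X)
        w = w * d / m                       # multiplicative update
    return w
```

For quadratic regression x = (1, t, t^2) on a grid over [-1, 1], the weights concentrate on t = -1, 0, 1, and the general equivalence theorem (max_i d_i(w) approaching m) provides a convergence check.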


Statistics and Computing | 2011

D-optimal designs via a cocktail algorithm

Yaming Yu

A fast new algorithm is proposed for numerical computation of (approximate) D-optimal designs. This cocktail algorithm extends the well-known vertex direction method (VDM; Fedorov in Theory of Optimal Experiments, 1972) and the multiplicative algorithm (Silvey et al. in Commun. Stat. Theory Methods 14:1379–1389, 1978), and shares their simplicity and monotonic convergence properties. Numerical examples show that the cocktail algorithm can lead to dramatically improved speed, sometimes by orders of magnitude, relative to either the multiplicative algorithm or the vertex exchange method (a variant of VDM). Key to the improved speed is a new nearest neighbor exchange strategy, which acts locally and complements the global effect of the multiplicative algorithm. Possible extensions to related problems such as nonparametric maximum likelihood estimation are mentioned.
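The exchange ingredient can be illustrated with a generic vertex-exchange-style step (a simplified sketch: this is not the paper's nearest-neighbor rule, and the closed-form step size is replaced by a crude grid search, which keeps each step monotonic by construction).

```python
import numpy as np

def vertex_exchange_step(X, w):
    """One vertex-exchange-style step for the D-criterion: shift weight from
    the least useful support point (smallest d_k) to the most promising
    candidate (largest d_j), choosing the transfer by a grid search on
    log det M(w). Including delta = 0 guarantees monotonicity."""
    n, m = X.shape
    M = X.T @ (w[:, None] * X)
    d = np.einsum('ij,jk,ik->i', X, np.linalg.inv(M), X)
    j = int(np.argmax(d))                        # candidate to receive weight
    support = np.where(w > 1e-12)[0]
    k = int(support[np.argmin(d[support])])      # support point to give weight
    best_w = w
    best_val = np.linalg.slogdet(M)[1]
    for delta in np.linspace(0.0, w[k], 21):     # includes delta = 0 (no move)
        w_new = w.copy()
        w_new[k] -= delta
        w_new[j] += delta
        val = np.linalg.slogdet(X.T @ (w_new[:, None] * X))[1]
        if val > best_val:
            best_w, best_val = w_new, val
    return best_w
```

In the cocktail algorithm such exchanges act locally on the current support, complementing the global reweighting performed by the multiplicative update.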


IEEE Transactions on Information Theory | 2010

Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality

Oliver Johnson; Yaming Yu

We consider the entropy of sums of independent discrete random variables, in analogy with Shannon's entropy power inequality, where equality holds for normals. In our case, infinite divisibility suggests that equality should hold for Poisson variables. We show that some natural analogues of the EPI do not in fact hold, but propose an alternative formulation which does always hold. The key to many proofs of Shannon's EPI is the behavior of entropy on scaling of continuous random variables. We believe that Rényi's operation of thinning discrete random variables plays a similar role to scaling, and give a sharp bound on how the entropy of ultra log-concave random variables behaves on thinning. In the spirit of the monotonicity results established by Artstein, Ball, Barthe, and Naor, we prove a stronger version of concavity of entropy, which implies a strengthened form of our discrete EPI.


IEEE Transactions on Information Theory | 2010

Sharp Bounds on the Entropy of the Poisson Law and Related Quantities

José A. Adell; Alberto Lekuona; Yaming Yu

One of the difficulties in calculating the capacity of certain Poisson channels is that H(λ), the entropy of the Poisson distribution with mean λ, is not available in a simple form. In this paper, we derive upper and lower bounds for H(λ) that are asymptotically tight and easy to compute. The derivation of such bounds involves only simple probabilistic and analytic tools. This complements the asymptotic expansions of Knessl (1998), Jacquet and Szpankowski (1999), and Flajolet (1999). The same method yields tight bounds on the relative entropy D(n, p) between a binomial and a Poisson, thus refining the work of Harremoës and Ruzankin (2004). Bounds on the entropy of the binomial also follow easily.
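To get a numerical sense of the quantity being bounded, H(λ) can be summed directly from the pmf and compared with the familiar Gaussian-type leading term (1/2) log(2πeλ) (used here only as a generic baseline; the paper's sharper bounds are not reproduced).

```python
import math

def poisson_entropy(lam, tol=1e-15):
    """Entropy (in nats) of the Poisson law with mean lam, summed from the
    pmf via the recurrence P(k) = P(k - 1) * lam / k."""
    h, p, k = 0.0, math.exp(-lam), 0
    while p > tol or k < lam:
        if p > 0.0:
            h -= p * math.log(p)
        k += 1
        p *= lam / k
    return h

# For lam = 20 the direct sum is already close to the leading Gaussian-type
# term 0.5 * log(2 * pi * e * lam); the gap is of order 1 / (12 * lam).
lam = 20.0
exact = poisson_entropy(lam)
leading = 0.5 * math.log(2 * math.pi * math.e * lam)
```

The slow decay of the correction terms in λ is what makes simple, asymptotically tight closed-form bounds useful in capacity calculations.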


IEEE Transactions on Information Theory | 2009

On the Entropy of Compound Distributions on Nonnegative Integers

Yaming Yu

Some entropy comparison results are presented concerning compound distributions on nonnegative integers. The main result shows that, under a log-concavity assumption, two compound distributions are ordered in terms of Shannon entropy if both the “numbers of claims” and the “claim sizes” are ordered accordingly in the convex order. Several maximum/minimum entropy theorems follow as a consequence. Most importantly, two recent results of Johnson (2008) on maximum entropy characterizations of compound Poisson and compound binomial distributions are proved under fewer assumptions and with simpler arguments.


International Symposium on Information Theory | 2009

Concavity of entropy under thinning

Yaming Yu; Oliver Johnson

Building on the recent work of Johnson (2007) and Yu (2008), we prove that entropy is a concave function with respect to the thinning operation T_α. That is, if X and Y are independent random variables on Z_+ with ultra-log-concave probability mass functions, then H(T_α X + T_{1−α} Y) ≥ αH(X) + (1 − α)H(Y), 0 ≤ α ≤ 1, where H denotes the discrete entropy. This is a discrete analogue of the inequality (h denotes the differential entropy) h(√α X + √(1 − α) Y) ≥ αh(X) + (1 − α)h(Y), 0 ≤ α ≤ 1, which holds for continuous X and Y with finite variances and is equivalent to Shannon's entropy power inequality. As a consequence we establish a special case of a conjecture of Shepp and Olkin (1981). Possible extensions are also discussed.
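The inequality can be checked numerically in the Poisson family, the expected equality case: thinning Poisson(λ) by α yields Poisson(αλ), and independent Poisson variables sum to a Poisson, so the statement reduces to concavity of λ ↦ H(Poisson(λ)). This is a sanity check on the claim, not its proof.

```python
import math

def poisson_entropy(lam, tol=1e-15):
    """Entropy (in nats) of Poisson(lam), summed directly from the pmf."""
    h, p, k = 0.0, math.exp(-lam), 0
    while p > tol or k < lam:
        if p > 0.0:
            h -= p * math.log(p)
        k += 1
        p *= lam / k
    return h

# Thinning Poisson(lam) by a gives Poisson(a * lam), and independent Poisson
# variables sum to a Poisson, so the concavity inequality
# H(T_a X + T_{1-a} Y) >= a H(X) + (1 - a) H(Y) can be evaluated exactly.
a, lam_x, lam_y = 0.4, 3.0, 10.0
lhs = poisson_entropy(a * lam_x + (1 - a) * lam_y)
rhs = a * poisson_entropy(lam_x) + (1 - a) * poisson_entropy(lam_y)
assert lhs >= rhs
```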


Bernoulli | 2011

Some stochastic inequalities for weighted sums

Yaming Yu

We compare weighted sums of i.i.d. positive random variables according to the usual stochastic order. The main inequalities are derived using majorization techniques under certain log-concavity assumptions. Specifically, let Y_i be i.i.d. random variables on …


IEEE Transactions on Information Theory | 2010

Squeezing the Arimoto–Blahut Algorithm for Faster Convergence

Yaming Yu


Bernoulli | 2010

Relative log-concavity and a pair of triangle inequalities

Yaming Yu

Collaboration


Dive into Yaming Yu's collaboration.

Top Co-Authors

Xiao-Li Meng

University of California

Syed Ali Jafar

University of California

Erwan Hillion

University of Luxembourg