Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Surya T. Tokdar is active.

Publication


Featured research published by Surya T. Tokdar.


Biometrika | 2013

Adaptive Bayesian multivariate density estimation with Dirichlet mixtures

Weining Shen; Surya T. Tokdar; Subhashis Ghosal

We show that rate-adaptive multivariate density estimation can be performed using Bayesian methods based on Dirichlet mixtures of normal kernels with a prior distribution on the kernels' covariance matrix parameter. We derive sufficient conditions on the prior specification that guarantee convergence to a true density at a rate that is minimax optimal for the smoothness class to which the true density belongs. No prior knowledge of smoothness is assumed. The sufficient conditions are shown to hold for the Dirichlet location mixture-of-normals prior with a Gaussian base measure and an inverse Wishart prior on the covariance matrix parameter. Locally Hölder smoothness classes and their anisotropic extensions are considered. Our study involves several technical novelties, including sharp approximation of finitely differentiable multivariate densities by normal mixtures and a new sieve on the space of such densities.
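The prior studied above can be illustrated with a finite stand-in: mixture weights drawn from a Dirichlet distribution, normal kernels at fixed locations. A minimal sketch, assuming a one-dimensional setting; all constants, names, and the three kernel locations are ours, not from the paper:

```python
import math
import random

def dirichlet_weights(alpha, rng):
    # Draw mixture weights from a Dirichlet(alpha) via normalised Gammas.
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

def mixture_density(x, weights, locations, sd=1.0):
    # Finite mixture of normal kernels at fixed locations.
    return sum(wt * math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))
               for wt, mu in zip(weights, locations))

rng = random.Random(0)
locs = [-2.0, 0.0, 2.0]
wts = dirichlet_weights([1.0, 1.0, 1.0], rng)

# Whatever weights are drawn, the resulting random density integrates to one.
dx = 0.01
mass = sum(mixture_density(-8.0 + i * dx, wts, locs) * dx for i in range(1601))
print(round(mass, 4))  # ~ 1.0
```

Each Dirichlet draw yields a different random density; the infinite-dimensional prior in the paper replaces the fixed locations with a Gaussian base measure and puts an inverse Wishart prior on the kernel covariance.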


Bayesian Analysis | 2010

Bayesian density regression with logistic Gaussian process and subspace projection

Surya T. Tokdar; Yu Zhu; Jayanta K. Ghosh

We develop a novel Bayesian density regression model based on logistic Gaussian processes and subspace projection. Logistic Gaussian processes provide an attractive alternative to the popular stick-breaking processes for modeling a family of conditional densities that vary smoothly in the conditioning variable. Subspace projection offers dimension reduction of predictors through multiple linear combinations, offering an alternative to the zeroing out theme of variable selection. We illustrate that logistic Gaussian process and subspace projection combine well to produce a computationally tractable and theoretically sound density regression procedure that offers good out of sample prediction, accurate estimation of subspace projection and satisfactory estimation of subspace dimensionality. We also demonstrate that subspace projection may lead to better prediction than variable selection when predictors are well chosen and possibly dependent on each other, each having a moderate influence on the response.
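The logistic transformation at the heart of the logistic Gaussian process can be sketched in a few lines: a function g on a grid is exponentiated and normalised into a density, so a Gaussian process draw for g induces a random density. A minimal grid-based sketch; the grid and the particular choice of g are ours, for illustration only:

```python
import math

def logistic_transform(g_values, dx):
    # Map grid values of g to exp(g) normalised so that sum(f) * dx == 1,
    # turning an arbitrary function into a discretised probability density.
    m = max(g_values)                     # subtract the max for numerical stability
    expg = [math.exp(g - m) for g in g_values]
    total = sum(expg) * dx
    return [e / total for e in expg]

dx = 0.01
grid = [i * dx for i in range(-300, 301)]   # grid on [-3, 3]
g = [-0.5 * x * x for x in grid]            # stand-in for one Gaussian-process draw
f = logistic_transform(g, dx)
print(round(sum(fi * dx for fi in f), 6))   # 1.0: a valid discretised density
```

The normalisation makes any draw of g a legitimate density, which is what lets a smooth Gaussian process index a family of conditional densities.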


arXiv: Statistics Theory | 2008

A comparison of the Benjamini-Hochberg procedure with some Bayesian rules for multiple testing

Małgorzata Bogdan; Jayanta K. Ghosh; Surya T. Tokdar

In the spirit of modeling inference for microarrays as multiple testing for sparse mixtures, we present a similar approach to a simplified version of quantitative trait loci (QTL) mapping. Unlike in the case of microarrays, where the number of tests usually reaches tens of thousands, the number of tests performed in scans for QTL usually does not exceed several hundred. However, in typical cases, the sparsity of significant alternatives for QTL mapping is in the same range as for microarrays. For methodological interest, as well as some related applications, we also consider non-sparse mixtures. Using simulations as well as theoretical observations we study false discovery rate (FDR), power and misclassification probability for the Benjamini-Hochberg (BH) procedure and its modifications, as well as for various parametric and nonparametric Bayes and parametric empirical Bayes procedures. Our results confirm the observation of Genovese and Wasserman (2002) that for small p the misclassification error of BH is close to optimal in the sense of attaining the Bayes oracle. This property is shared by some of the considered Bayes testing rules, which in general perform better than BH for large or moderate p's.
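For readers unfamiliar with the BH procedure compared in this paper, a minimal self-contained implementation of the step-up rule; the function name and example p-values are ours:

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Indices of hypotheses rejected by the BH step-up procedure at FDR level alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # positions sorted by p-value
    k_max = 0
    for rank, idx in enumerate(order, start=1):
        # Step-up rule: find the largest rank k with p_(k) <= k * alpha / m.
        if p_values[idx] <= rank * alpha / m:
            k_max = rank
    return sorted(order[:k_max])  # reject the k_max smallest p-values

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
print(benjamini_hochberg(pvals))  # rejects the two smallest p-values: [0, 1]
```

Note the step-up character: a p-value can be rejected even when it exceeds its own threshold, provided some larger-ranked p-value falls below its threshold.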


Bayesian Analysis | 2012

Simultaneous Linear Quantile Regression: A Semiparametric Bayesian Approach

Surya T. Tokdar; Joseph B. Kadane

We introduce a semi-parametric Bayesian framework for a simultaneous analysis of linear quantile regression models. A simultaneous analysis is essential to attain the true potential of the quantile regression framework, but is computationally challenging due to the associated monotonicity constraint on the quantile curves. For a univariate covariate, we present a simpler equivalent characterization of the monotonicity constraint through an interpolation of two monotone curves. The resulting formulation leads to a tractable likelihood function and is embedded within a Bayesian framework where the two monotone curves are modeled via logistic transformations of a smooth Gaussian process. A multivariate extension is suggested by combining the full-support univariate model with a linear projection of the predictors. The resulting single-index model remains easy to fit and provides substantial and measurable improvement over the first-order linear heteroscedastic model. Two illustrative applications of the proposed method are provided.
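Quantile regression rests on the check (pinball) loss, whose minimiser over a constant is the tau-th sample quantile. A minimal numerical illustration of that building block; names and data are ours, not from the paper:

```python
def check_loss(u, tau):
    # Koenker's check (pinball) loss: tau * u for u >= 0, (tau - 1) * u for u < 0.
    return u * (tau - (1.0 if u < 0 else 0.0))

def sample_quantile(y, tau):
    # The tau-th sample quantile minimises the empirical check loss;
    # a minimiser can always be found among the data points themselves.
    return min(y, key=lambda q: sum(check_loss(yi - q, tau) for yi in y))

y = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0]
print(sample_quantile(y, 0.50))  # prints 5.0, the median
print(sample_quantile(y, 0.25))  # prints 3.0, the lower quartile
```

Fitting one such minimisation per tau gives quantile curves that may cross; the simultaneous analysis above is what enforces monotonicity across all quantile levels at once.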


Journal of Multivariate Analysis | 2013

Posterior consistency in conditional distribution estimation

Debdeep Pati; David B. Dunson; Surya T. Tokdar

A wide variety of priors have been proposed for nonparametric Bayesian estimation of conditional distributions, and there is a clear need for theorems providing conditions on the prior for large support, as well as posterior consistency. Estimation of an uncountable collection of conditional distributions across different regions of the predictor space is a challenging problem, which differs in some important ways from density and mean regression estimation problems. Defining various topologies on the space of conditional distributions, we provide sufficient conditions for posterior consistency focusing on a broad class of priors formulated as predictor-dependent mixtures of Gaussian kernels. This theory is illustrated by showing that the conditions are satisfied for a class of generalized stick-breaking process mixtures in which the stick-breaking lengths are monotone, differentiable functions of a continuous stochastic process. We also provide a set of sufficient conditions for the case where stick-breaking lengths are predictor independent, such as those arising from a fixed Dirichlet process prior.


Annals of Statistics | 2009

Consistency of a Recursive Estimate of Mixing Distributions

Surya T. Tokdar; Ryan Martin; Jayanta K. Ghosh

Mixture models have received considerable attention recently and Newton [Sankhyā Ser. A 64 (2002) 306-322] proposed a fast recursive algorithm for estimating a mixing distribution. We prove almost sure consistency of this recursive estimate in the weak topology under mild conditions on the family of densities being mixed. This recursive estimate depends on the data ordering and a permutation-invariant modification is proposed, which is an average of the original over permutations of the data sequence. A Rao-Blackwell argument is used to prove consistency in probability of this alternative estimate. Several simulations are presented, comparing the finite-sample performance of the recursive estimate and a Monte Carlo approximation to the permutation-invariant alternative along with that of the nonparametric maximum likelihood estimate and a nonparametric Bayes estimate.
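Newton's recursive algorithm can be sketched on a fixed grid of support points: each observation tilts the current mixing weights toward the kernels that explain it well. A minimal sketch; the grid, kernel bandwidth, step-size schedule, and simulated data are illustrative choices of ours:

```python
import math
import random

def normal_kernel(x, theta, sd=1.0):
    # Normal density with mean theta, used as the mixing kernel.
    return math.exp(-0.5 * ((x - theta) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

def predictive_recursion(data, grid):
    # Start from a uniform guess for the mixing distribution on the grid.
    weights = [1.0 / len(grid)] * len(grid)
    for i, x in enumerate(data, start=1):
        gamma = 1.0 / (i + 1)  # decaying step size gamma_i
        kern = [normal_kernel(x, t) for t in grid]
        marginal = sum(wt * k for wt, k in zip(weights, kern))
        # Newton's update: reweight each support point by how well it explains x.
        weights = [(1.0 - gamma) * wt + gamma * wt * k / marginal
                   for wt, k in zip(weights, kern)]
    return weights

random.seed(1)
data = [random.gauss(2.0, 1.0) for _ in range(500)]  # true mixing law: point mass at 2
grid = [-4.0 + 0.5 * j for j in range(17)]           # support points on [-4, 4]
w = predictive_recursion(data, grid)
peak = grid[max(range(len(w)), key=w.__getitem__)]
print(peak)  # the estimated mixing mass should pile up near 2
```

The update preserves total mass exactly, and, as the paper notes, the estimate depends on the order of the data; averaging over permutations gives the permutation-invariant modification studied above.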


Journal of Computational Neuroscience | 2010

Detection of bursts in extracellular spike trains using hidden semi-Markov point process models

Surya T. Tokdar; Peiyi Xi; Ryan C. Kelly; Robert E. Kass



Journal of Computational and Graphical Statistics | 2007

Towards a Faster Implementation of Density Estimation With Logistic Gaussian Process Priors

Surya T. Tokdar



Electronic Journal of Statistics | 2009

Asymptotic properties of predictive recursion: Robustness and rate of convergence

Ryan Martin; Surya T. Tokdar



Biostatistics | 2012

A nonparametric empirical Bayes framework for large-scale multiple testing

Ryan Martin; Surya T. Tokdar


Collaboration


Dive into Surya T. Tokdar's collaborations.

Top Co-Authors

Ryan Martin, North Carolina State University

Joseph B. Kadane, Carnegie Mellon University

Chris Glynn, University of New Hampshire

Robert E. Kass, Carnegie Mellon University