Abhishek Bhattacharya
Indian Statistical Institute
Publications
Featured research published by Abhishek Bhattacharya.
arXiv: Geometric Topology | 2008
Abhishek Bhattacharya
This article presents recent methodologies and some new results for the statistical analysis of probability distributions on manifolds. An important example considered in some detail here is the 2-D shape space of k-ads, comprising all configurations of k planar landmarks (k > 2), modulo translation, scaling, and rotation.
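For concreteness, here is a minimal sketch, not taken from the article, of how a planar k-ad is reduced to its shape: the landmarks are viewed as a complex k-vector, translation and scale are removed to obtain the preshape, and rotation is quotiented out when comparing two configurations via the full Procrustes distance. The function names and the example triangle are illustrative assumptions.

```python
# Sketch of the planar shape space of k-ads (illustrative, not the article's code).
import numpy as np

def preshape(landmarks):
    """Map a (k, 2) array of planar landmarks to its preshape:
    center the configuration and scale it to unit norm."""
    z = landmarks[:, 0] + 1j * landmarks[:, 1]   # view the k-ad as a complex k-vector
    z = z - z.mean()                              # remove translation
    return z / np.linalg.norm(z)                  # remove scale

def procrustes_distance(x, y):
    """Full Procrustes distance between two k-ads: minimizing over rotation
    amounts to aligning the phase of the Hermitian inner product of preshapes."""
    u, v = preshape(x), preshape(y)
    rho = abs(np.vdot(u, v))                      # |<u, v>| in [0, 1]
    return np.sqrt(max(0.0, 1.0 - rho**2))

# Example: a triangle (k = 3) and a rotated, rescaled, shifted copy have distance ~ 0.
tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
tri2 = 2.5 * tri @ R.T + np.array([3.0, -1.0])
print(procrustes_distance(tri, tri2))             # approximately 0
```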
Proceedings of the American Mathematical Society | 2008
Abhishek Bhattacharya; Rabi Bhattacharya
In this article a nonsingular asymptotic distribution is derived for a broad class of underlying distributions on a Riemannian manifold in relation to its curvature. Also, the asymptotic dispersion is explicitly related to curvature. These results are applied and further strengthened for the planar shape space of k-ads.
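As a schematic illustration of the kind of result described, not the paper's exact statement, the block below records the Fréchet mean on a Riemannian manifold and the generic form of a central limit theorem for the sample Fréchet mean in normal coordinates; curvature enters through the Hessian term, which governs the asymptotic dispersion.

```latex
% Schematic statement (an assumption for illustration, not the paper's theorem).
% For a probability measure Q on a Riemannian manifold M with distance \rho,
% the Fr\'echet mean set minimizes the expected squared distance:
\[
  \mathcal{C}_Q \;=\; \arg\min_{p \in M} \int_M \rho^2(p, x)\, Q(dx).
\]
% When the minimizer \mu is unique, the sample Fr\'echet mean \mu_n of an i.i.d.
% sample X_1, \dots, X_n from Q is, under regularity conditions, asymptotically
% Gaussian in normal coordinates \phi centered at \mu:
\[
  \sqrt{n}\, \phi(\mu_n) \;\xrightarrow{d}\; N\!\left(0,\; \Lambda^{-1} \Sigma\, \Lambda^{-1}\right),
\]
% where \Sigma is the covariance of the gradient of p \mapsto \rho^2(p, X_1) at \mu
% and \Lambda is its expected Hessian; the curvature of M affects \Lambda and hence
% the asymptotic dispersion.
```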
Journal of Multivariate Analysis | 2012
Abhishek Bhattacharya; David B. Dunson
Our first focus is prediction of a categorical response variable using features that lie on a general manifold. For example, the manifold may correspond to the surface of a hypersphere. We propose a general kernel mixture model for the joint distribution of the response and predictors, with the kernel expressed in product form and dependence induced through the unknown mixing measure. We provide simple sufficient conditions for large support and weak and strong posterior consistency in estimating both the joint distribution of the response and predictors and the conditional distribution of the response. When a Dirichlet process prior is placed on the mixing measure, these conditions hold using von Mises-Fisher kernels when the manifold is the unit hypersphere. In this case, Bayesian methods are developed for efficient posterior computation using slice sampling. Next we develop Bayesian nonparametric methods for testing whether there is a difference in distributions between groups of observations on the manifold having unknown densities. We prove consistency of the Bayes factor and develop efficient computational methods for its calculation. The proposed classification and testing methods are evaluated using simulation examples and applications to spherical data.
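The classification side of this construction can be sketched in a few lines: under a discrete mixing measure (as produced, for example, by a truncated Dirichlet process), the joint model is a mixture whose components pair a von Mises-Fisher kernel for the spherical predictor with a probability vector for the categorical response, and P(y | x) follows by Bayes' rule over components. The code below is a hedged illustration with fixed component parameters; in the actual methodology these would be drawn from the posterior (e.g., by slice sampling).

```python
# Sketch of the joint kernel mixture for a categorical response y and a predictor x
# on the unit hypersphere (illustrative assumptions, not the authors' implementation).
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel function

def log_vmf_density(x, mu, kappa):
    """Log von Mises-Fisher density on the unit sphere S^{d-1}."""
    d = len(mu)
    # log normalizing constant: kappa^{d/2-1} / ((2*pi)^{d/2} I_{d/2-1}(kappa))
    log_c = ((d / 2 - 1) * np.log(kappa)
             - (d / 2) * np.log(2 * np.pi)
             - (np.log(ive(d / 2 - 1, kappa)) + kappa))
    return log_c + kappa * np.dot(mu, x)

def classify(x, weights, mus, kappas, resp_probs):
    """P(y | x) under the mixture sum_j w_j vMF(x; mu_j, kappa_j) p_j[y]."""
    log_k = np.array([np.log(w) + log_vmf_density(x, m, k)
                      for w, m, k in zip(weights, mus, kappas)])
    comp = np.exp(log_k - log_k.max())
    comp /= comp.sum()                      # posterior weight of each component given x
    return comp @ np.asarray(resp_probs)    # mix the component response probabilities

# Toy example on S^2 with two components and a binary response.
mus = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
weights, kappas = [0.5, 0.5], [10.0, 10.0]
resp_probs = [[0.9, 0.1], [0.2, 0.8]]       # P(y | component)
x = np.array([0.9, 0.1, 0.0]); x /= np.linalg.norm(x)
print(classify(x, weights, mus, kappas, resp_probs))
```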
Archive | 2009
Rabi Bhattacharya; Abhishek Bhattacharya
This article provides an exposition of recent developments on the analysis of landmark based shapes in which a k-ad, i.e., a set of k points or landmarks on an object or a scene, is observed in 2D or 3D, for purposes of identification, discrimination, or diagnostics. Depending on the way the data are collected or recorded, the appropriate shape of an object is the maximal invariant specified by the space of orbits under a group G of transformations. All these spaces are manifolds, often with natural Riemannian structures. The statistical analysis based on Riemannian structures is said to be intrinsic. In other cases, proper distances are sought via an equivariant embedding of the manifold M in a vector space E, and the corresponding statistical analysis is called extrinsic.
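To make the intrinsic/extrinsic distinction concrete, the sketch below, an illustration under simplifying assumptions rather than the authors' procedure, contrasts the two notions of mean on the unit sphere S^2 embedded in R^3: the extrinsic mean projects the Euclidean mean of the embedded sample back onto the manifold, while the intrinsic (Fréchet) mean minimizes the sum of squared geodesic distances and is approximated here by a simple gradient scheme.

```python
# Extrinsic vs. intrinsic sample means on the unit sphere (illustrative sketch).
import numpy as np

def extrinsic_mean(points):
    """Project the Euclidean mean of the embedded sample onto S^2."""
    m = points.mean(axis=0)
    return m / np.linalg.norm(m)        # undefined if the Euclidean mean is 0

def intrinsic_mean(points, steps=100, lr=0.5):
    """Approximate the intrinsic (Frechet) mean by gradient descent on the sphere."""
    mu = extrinsic_mean(points)         # warm start from the extrinsic mean
    for _ in range(steps):
        grad = np.zeros(3)
        for x in points:
            c = np.clip(np.dot(mu, x), -1.0, 1.0)
            theta = np.arccos(c)        # geodesic distance from mu to x
            if theta > 1e-12:
                v = x - c * mu          # tangential component of x at mu
                grad += -theta * v / np.linalg.norm(v)   # minus the log map of x at mu
        mu = mu - lr * grad / len(points)
        mu /= np.linalg.norm(mu)        # retract back onto the sphere
    return mu

rng = np.random.default_rng(0)
sample = rng.normal(size=(50, 3)) + np.array([3.0, 0.0, 0.0])
sample /= np.linalg.norm(sample, axis=1, keepdims=True)
print(extrinsic_mean(sample), intrinsic_mean(sample))
```

For data concentrated on a small region of the sphere the two means nearly coincide, but in general they differ, which is why the exposition treats the two analyses separately.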
Journal of the American Statistical Association | 2013
Garritt L. Page; Abhishek Bhattacharya; David B. Dunson
It has become common for datasets to contain large numbers of variables in studies conducted in areas such as genetics, machine vision, image analysis, and many others. When analyzing such data, parametric models are often too inflexible, while nonparametric procedures tend to be nonrobust because of insufficient data on these high-dimensional spaces. This is particularly true when interest lies in building efficient classifiers in the presence of many predictor variables. In such data, most of the variability often lies along a few directions, or more generally along a much lower-dimensional submanifold of the data space. In this article, we propose a class of models that flexibly learn about this submanifold while simultaneously performing dimension reduction in classification. This methodology allows the cell probabilities to vary nonparametrically based on a few coordinates expressed as linear combinations of the predictors. Also, as opposed to many black-box methods for dimensionality reduction, the proposed model is appealing in having clearly interpretable and identifiable parameters that provide insight into which predictors are important in determining accurate classification boundaries. Gibbs sampling methods are developed for posterior computation, and the methods are illustrated using simulated and real data applications.
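The core idea, class probabilities that depend on many predictors only through a few linear combinations, can be caricatured as follows. This is a deliberately simplified stand-in, not the article's Bayesian model: the projection matrix is taken as known and the conditional class probabilities are estimated by Gaussian-kernel weighting in the reduced coordinates, whereas in the article both are learned jointly via Gibbs sampling.

```python
# Classification through a low-dimensional projection z = W x (illustrative sketch).
import numpy as np

def predict_proba(x_new, X, y, W, bandwidth=1.0):
    """Kernel-weighted class probabilities in the reduced coordinates z = W x."""
    n_classes = int(y.max()) + 1
    z_new, Z = W @ x_new, X @ W.T             # project onto the low-dimensional subspace
    d2 = np.sum((Z - z_new) ** 2, axis=1)     # squared distances in reduced space
    w = np.exp(-0.5 * d2 / bandwidth**2)      # Gaussian kernel weights
    probs = np.array([w[y == c].sum() for c in range(n_classes)])
    return probs / probs.sum()

# Toy example: 20-dimensional predictors whose class signal lives in 2 directions.
rng = np.random.default_rng(1)
W_true = rng.normal(size=(2, 20))             # unknown in practice; learned in the model
X = rng.normal(size=(200, 20))
scores = X @ W_true.T
y = (scores[:, 0] + scores[:, 1] > 0).astype(int)
x_new = rng.normal(size=20)
print(predict_proba(x_new, X, y, W_true))
```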
Biometrika | 2010
Abhishek Bhattacharya; David B. Dunson
Archive | 2012
Abhishek Bhattacharya; Rabi Bhattacharya
Annals of the Institute of Statistical Mathematics | 2012
Abhishek Bhattacharya; David B. Dunson
Archive | 2010
David B. Dunson; Abhishek Bhattacharya
arXiv: Methodology | 2011
Abhishek Bhattacharya; Garritt L. Page; David B. Dunson