
Publication


Featured research published by Ayanendranath Basu.


Annals of the Institute of Statistical Mathematics | 1994

Minimum disparity estimation for continuous models: Efficiency, distributions and robustness

Ayanendranath Basu; Bruce G. Lindsay

A general class of minimum distance estimators for continuous models, called minimum disparity estimators, is introduced. The conventional technique is to minimize a distance between a kernel density estimator and the model density. A new approach is introduced here in which the model and the data are smoothed with the same kernel. This makes the methods consistent and asymptotically normal independently of the value of the smoothing parameter; convergence properties of the kernel density estimate are no longer necessary. All the minimum distance estimators considered are shown to be first-order efficient provided the kernel is chosen appropriately. Different minimum disparity estimators are compared based on their characterizing residual adjustment function (RAF); this function shows that the robustness features of the estimators can be explained by the shrinkage of certain residuals towards zero. The value of the second derivative of the RAF at zero, A2, provides the trade-off between efficiency and robustness. The above properties are demonstrated both by theorems and by simulations.
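
The framework can be summarized compactly. The following is a schematic rendering of the standard definitions from Lindsay (1994) combined with the smoothed-model construction of this paper; the notation g* and f*_theta for the kernel-smoothed data and model densities is ours.

```latex
% Pearson residual between the kernel-smoothed data density g^* and the
% same-kernel-smoothed model density f_theta^*:
\[
  \delta(x) = \frac{g^*(x)}{f^*_\theta(x)} - 1 ,
  \qquad
  \rho_C\bigl(g^*, f^*_\theta\bigr)
    = \int C\bigl(\delta(x)\bigr)\, f^*_\theta(x)\, dx ,
\]
% where C is convex with C(0) = 0.  Setting the theta-derivative to zero
% gives the estimating equation, whose kernel is the residual adjustment
% function (RAF):
\[
  \int A\bigl(\delta(x)\bigr)\, \nabla_\theta f^*_\theta(x)\, dx = 0 ,
  \qquad
  A(\delta) = (1+\delta)\, C'(\delta) - C(\delta) .
\]
% After standardizing so that A(0) = 0 and A'(0) = 1, the curvature
% A_2 = A''(0) is the quantity referred to in the abstract; maximum
% likelihood corresponds to A(delta) = delta, hence A_2 = 0.
```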


Journal of the American Statistical Association | 1998

Weighted likelihood equations with bootstrap root search

Marianthi Markatou; Ayanendranath Basu; Bruce G. Lindsay

We discuss a method of weighting likelihood equations with the aim of obtaining fully efficient and robust estimators. We discuss the case of continuous probability models using unimodal weighting functions. These weighting functions downweight observations that are inconsistent with the assumed model. At the true model, therefore, the proposed estimating equations behave like the ordinary likelihood equations. We investigate the number of solutions of the estimating equations via a bootstrap root search; the estimators obtained are consistent and asymptotically normal and have desirable robustness properties. An extensive simulation study and real data examples illustrate the operating characteristics of the proposed methodology.
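
Schematically, the estimating equations have the form shown below; the particular weight function displayed is one construction used in this strand of the literature, given here as a hedged sketch rather than necessarily the paper's exact choice.

```latex
% Weighted likelihood estimating equations: each observation's score
% contribution u_theta(x) = d/d(theta) log f_theta(x) is damped by a
% data-dependent weight in [0, 1]:
\[
  \sum_{i=1}^{n} w(x_i)\, u_\theta(x_i) = 0 ,
  \qquad
  w(x) = \min\!\left\{ 1,\;
    \frac{A\bigl(\delta(x)\bigr) + 1}{\delta(x) + 1} \right\} ,
\]
% where delta(x) is a Pearson residual of the data against the model and
% A is a residual adjustment function.  At the true model delta ~ 0, so
% w ~ 1 and the equations reduce to the ordinary likelihood equations.
% The bootstrap root search probes for multiple roots by restarting the
% solver from fits to many small bootstrap subsamples.
```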


Handbook of Statistics | 1997

Minimum distance estimation: The approach using density-based distances

Ayanendranath Basu; Ian R. Harris; Srabashi Basu

This chapter discusses the concept of minimum distance estimation using density-based distances. Density-based minimum distance methods have proven to be valuable additions to the theory of statistics, as demonstrated by the rich literature of the past two decades. In parametric models, the estimators often possess full asymptotic efficiency simultaneously with attractive robustness properties. The chapter discusses minimum Hellinger distance estimation, including the Hellinger deviance test and penalized Hellinger distance estimation. General disparities, residual adjustment functions, and related inference are introduced, and the negative exponential disparity and weighted likelihood estimators (including linear regression models) are described. A generalized divergence measure and the resulting estimators are also discussed.


Journal of Statistical Planning and Inference | 1997

Minimum negative exponential disparity estimation in parametric models

Ayanendranath Basu; Sahadeb Sarkar; A.N. Vidyashankar

Works of Lindsay (1994) and Basu and Sarkar (1994a) provide heuristic arguments and some empirical evidence that the minimum negative exponential disparity estimator (MNEDE), like the minimum Hellinger distance estimator (MHDE) (Beran, 1977), is a robust alternative to the usual maximum likelihood estimator when data contain outliers. In this paper we establish the robustness properties of the MNEDE and prove that it is asymptotically fully efficient under a specified regular parametric family of densities. Our simulation results also show that, unlike the MHDE, the MNEDE is robust not only against outliers but also against inliers, defined as values carrying fewer observations than the model predicts.
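
In the disparity notation above, the negative exponential disparity takes the following form. This is a sketch from the standard definitions; the paper's exact standardization may differ.

```latex
% Negative exponential disparity between a data-based density g and the
% model density f_theta, with delta the Pearson residual g/f_theta - 1:
\[
  \mathrm{NED}(g, f_\theta)
    = \int \bigl( e^{-\delta(x)} - 1 + \delta(x) \bigr)\,
      f_\theta(x)\, dx ,
\]
% whose residual adjustment function, already standardized so that
% A(0) = 0 and A'(0) = 1, is
\[
  A(\delta) = 2 - (2 + \delta)\, e^{-\delta} .
\]
% A is bounded above (A -> 2 as delta -> infinity), which damps
% outliers, and it also shrinks the effect of large negative residuals
% (delta -> -1), the source of the inlier robustness noted in the
% abstract; maximum likelihood has A(delta) = delta, unbounded in both
% directions.
```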


Journal of Statistical Computation and Simulation | 1994

The trade-off between robustness and efficiency and the effect of model smoothing in minimum disparity inference

Ayanendranath Basu; Sahadeb Sarkar

Through an empirical study at the normal model, it is shown that the curvature parameter of the residual adjustment function (Lindsay, 1994) is not always an adequate global measure of the trade-off between robustness and efficiency of the minimum disparity estimators. Our study shows that the estimator obtained by minimizing the negative exponential disparity is an attractive robust estimator with good efficiency properties. Smoothing the model with the same kernel used to determine the nonparametric density estimator results in higher efficiency for the minimum disparity estimators, especially for the estimator of the scale parameter; a sketch of this construction appears below. In addition, the disparity tests (including the negative exponential disparity test) are shown to be good robust alternatives to the likelihood ratio test at the normal model.
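
To make the model-smoothing idea concrete, here is a minimal, self-contained sketch at the normal location model. It is an illustration under our own assumptions (Gaussian kernel, known scale, squared Hellinger distance as the disparity, quadrature on a grid), not the authors' code, and the helper names are ours.

```python
# Minimum disparity estimation with Basu-Lindsay model smoothing:
# the data and the model are smoothed with the SAME Gaussian kernel,
# then the (squared) Hellinger distance between them is minimized.
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
sigma, h = 1.0, 0.5                       # known scale; kernel bandwidth
x = rng.normal(5.0, sigma, size=200)
x[:10] = 20.0                             # plant a few gross outliers

t = np.linspace(x.min() - 4.0, x.max() + 4.0, 2000)         # grid
g_star = norm.pdf(t[:, None], loc=x, scale=h).mean(axis=1)  # smoothed data

def smoothed_hellinger(mu):
    # Convolving N(mu, sigma^2) with the N(0, h^2) kernel is again
    # normal, so the smoothed model density has a closed form.
    f_star = norm.pdf(t, loc=mu, scale=np.hypot(sigma, h))
    return 2.0 * trapezoid((np.sqrt(g_star) - np.sqrt(f_star)) ** 2, t)

mhde = minimize_scalar(smoothed_hellinger,
                       bounds=(x.min(), x.max()), method="bounded").x
print(f"sample mean = {x.mean():.3f}  (pulled toward the outliers)")
print(f"smoothed-model MHDE = {mhde:.3f}")
```

Because both densities are smoothed with the same kernel, the estimator targets the true parameter for any fixed bandwidth h; no bandwidth-shrinkage asymptotics for the kernel density estimate are needed.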


Communications in Statistics - Simulation and Computation | 1994

Hellinger distance as a penalized log likelihood

Ian R. Harris; Ayanendranath Basu

The present paper studies the minimum Hellinger distance estimator by recasting it as the maximum likelihood estimator in a data-driven modification of the model density. In the process, the Hellinger distance itself is expressed as a penalized log likelihood function. The penalty is the sum of the model probabilities over the non-observed values of the sample space. A comparison of the modified model density with the original data provides insights into the robustness of the minimum Hellinger distance estimator. Adjusting the amount of penalty leads to a class of minimum penalized Hellinger distance estimators, some members of which perform substantially better than the minimum Hellinger distance estimator at the model for small samples, without compromising the robustness properties of the latter.
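
The decomposition behind the penalized family follows from elementary algebra for discrete models. The parametrization below (penalty weight h) is our rendering of the abstract's description; the paper's exact notation may differ.

```latex
% For a discrete model f_theta and empirical proportions d(x), the
% (twice-squared) Hellinger distance splits over observed and
% non-observed cells, because (sqrt(d) - sqrt(f))^2 = f when d(x) = 0:
\[
  \mathrm{HD}(d, f_\theta)
  = 2 \sum_{x}\bigl(\sqrt{d(x)} - \sqrt{f_\theta(x)}\bigr)^2
  = 2\!\!\sum_{x:\,d(x)>0}\!\!\bigl(\sqrt{d(x)} - \sqrt{f_\theta(x)}\bigr)^2
    \;+\; 2\!\!\sum_{x:\,d(x)=0}\!\! f_\theta(x) .
\]
% Freeing the weight on the second term (the total model probability of
% the non-observed values) gives the penalized family
\[
  \mathrm{PHD}_h(d, f_\theta)
  = 2\!\!\sum_{x:\,d(x)>0}\!\!\bigl(\sqrt{d(x)} - \sqrt{f_\theta(x)}\bigr)^2
    \;+\; h\!\!\sum_{x:\,d(x)=0}\!\! f_\theta(x) ,
\]
% with h = 2 recovering the ordinary Hellinger distance.
```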


Bernoulli | 2017

A generalized divergence for statistical inference

Abhik Ghosh; Ian R. Harris; Avijit Maji; Ayanendranath Basu; Leandro Pardo

The power divergence (PD) and the density power divergence (DPD) families have proven to be useful tools in the area of robust inference. In this paper, we consider a superfamily of divergences which contains both of these families as special cases. The role of this superfamily is studied in several statistical applications, and desirable properties are identified and discussed. In many cases, it is observed that the most preferred minimum divergence estimator within the above collection lies outside the class of minimum PD or minimum DPD estimators, indicating that this superfamily has real utility, rather than just being a routine generalization. The limitation of the usual first order influence function as an effective descriptor of the robustness of the estimator is also demonstrated in this connection.
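
For reference, the two endpoint families are defined below in their standard forms; we have not reproduced the superfamily's exact two-parameter expression here, only the endpoints it interpolates.

```latex
% Power divergence (Cressie--Read) family, index lambda:
\[
  \mathrm{PD}_\lambda(g, f)
  = \frac{1}{\lambda(\lambda + 1)}
    \int g(x)\left[\left(\frac{g(x)}{f(x)}\right)^{\lambda} - 1\right] dx .
\]
% Density power divergence family, index alpha > 0:
\[
  \mathrm{DPD}_\alpha(g, f)
  = \int \left[ f^{1+\alpha}
    - \Bigl(1 + \tfrac{1}{\alpha}\Bigr) f^{\alpha} g
    + \tfrac{1}{\alpha}\, g^{1+\alpha} \right] dx .
\]
% Both converge to the Kullback--Leibler divergence as their index tends
% to zero; the superfamily of this paper carries both indices at once
% and contains each family as a one-parameter slice.
```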


Statistics | 2016

Generalized Wald-type tests based on minimum density power divergence estimators

Ayanendranath Basu; Abhijit Mandal; Nirian Martín; Leandro Pardo

In hypothesis testing, the robustness of the tests is an important concern. Maximum likelihood-based tests are generally the most efficient under standard regularity conditions, but they are highly non-robust even under small deviations from the assumed conditions. In this paper, we propose generalized Wald-type tests based on minimum density power divergence estimators for parametric hypotheses. This method avoids the use of nonparametric density estimation and the attendant bandwidth selection. The trade-off between efficiency and robustness is controlled by a tuning parameter β. The asymptotic distributions of the test statistics are chi-square with appropriate degrees of freedom. The performance of the proposed tests is explored through simulations and real data analysis.
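
The following sketch illustrates the ingredients in the simplest setting, a simple null about a normal mean with known unit scale. It is our own illustration under those assumptions (the closed-form model integral and the asymptotic variance used below are specific to this model), not the paper's implementation, which covers general composite hypotheses.

```python
# Minimum density power divergence estimation (MDPDE) and a Wald-type
# test of H0: mu = 0 for the N(mu, 1) model.  Note that no kernel
# density estimate or bandwidth is needed.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import chi2, norm

beta = 0.5                                # tuning parameter
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=100)
x[:5] = 8.0                               # contaminate with outliers

def dpd_objective(mu):
    # For N(mu, 1), the integral of f^(1+beta) has a closed form; the
    # term of the divergence involving only the data density is dropped,
    # as it does not depend on mu.
    model_term = (2.0 * np.pi) ** (-beta / 2) / np.sqrt(1.0 + beta)
    data_term = (1.0 + 1.0 / beta) * np.mean(norm.pdf(x, mu, 1.0) ** beta)
    return model_term - data_term

mu_hat = minimize_scalar(dpd_objective, bounds=(x.min(), x.max()),
                         method="bounded").x

# Asymptotic variance of the MDPDE of a normal mean with unit scale
# (derived for this location model; larger beta trades efficiency
# for robustness):
v_beta = (1.0 + beta) ** 3 / (1.0 + 2.0 * beta) ** 1.5
W = len(x) * (mu_hat - 0.0) ** 2 / v_beta   # Wald-type statistic
print(f"mu_hat = {mu_hat:.3f}, W = {W:.2f}, p = {chi2.sf(W, df=1):.3f}")
```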


Journal of Statistical Planning and Inference | 2015

On the robustness of a divergence based test of simple statistical hypotheses

Abhik Ghosh; Ayanendranath Basu; Leandro Pardo

The most popular hypothesis testing procedure, the likelihood ratio test, is known to be highly non-robust in many real situations. Basu et al. (2013a) provided an alternative robust procedure of hypothesis testing based on the density power divergence; however, while the robustness of that test was argued intuitively and supported by extensive empirical evidence, no theoretical robustness properties were presented in that work. In the present paper we consider a more general class of tests which forms a superfamily of the procedures described by Basu et al. (2013a). This superfamily derives from the recently proposed class of S-divergences. In this context we theoretically prove several robustness results for the new class of tests and illustrate them in the normal model. All the theoretical robustness properties of the Basu et al. (2013a) proposal follow as special cases of our results.
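
Schematically, tests in this class reject a simple null H0: θ = θ0 for large values of a scaled divergence between the fitted and null model densities. The display below is our sketch of the generic construction (the symbols T, γ, and β are illustrative), not the paper's exact statement.

```latex
% Generic divergence-difference test of H0: theta = theta_0:
\[
  T_{\gamma}\bigl(\widehat{\theta}_{\beta}, \theta_0\bigr)
  = 2\, n \; d_{\gamma}\!\bigl( f_{\widehat{\theta}_{\beta}},\,
      f_{\theta_0} \bigr) ,
\]
% where d_gamma is the chosen divergence (the density power divergence
% in Basu et al. (2013a); an S-divergence here) and theta-hat_beta is a
% minimum divergence estimator.  Under the null, statistics of this type
% are typically asymptotically distributed as a linear combination of
% independent chi-square(1) variables.
```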


Journal of Statistical Computation and Simulation | 2003

The generalized Kullback-Leibler divergence and robust inference

Chanseok Park; Ayanendranath Basu

This paper examines robust techniques for estimation and tests of hypotheses using the family of generalized Kullback-Leibler (GKL) divergences. The GKL family is a new group of density-based divergences which forms a subclass of the disparities defined by Lindsay (1994). We show that the corresponding minimum divergence estimators have a breakdown point of 50% under the model. The performance of the proposed estimators and tests is investigated through an extensive numerical study involving real-data examples and simulation results. The results show that the proposed methods are attractive choices when both high efficiency and robustness are required.
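
For context, the 50% figure refers to the usual asymptotic contamination breakdown point. The display below states the standard definition for a location-type functional T at a model distribution G; the notation is our addition, not the paper's.

```latex
% Asymptotic contamination breakdown point of a functional T at G:
\[
  \varepsilon^*(T, G)
  = \sup\Bigl\{ \varepsilon \in [0, 1] :
      \sup_{H}\,
      \bigl\| T\bigl((1-\varepsilon)\,G + \varepsilon\,H\bigr) \bigr\|
      < \infty \Bigr\} ,
\]
% where H ranges over all contaminating distributions.  A breakdown
% point of 50% means the estimate stays bounded so long as fewer than
% half the observations come from arbitrary contamination; this is the
% best attainable value for equivariant location estimators.
```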

Collaboration


Dive into Ayanendranath Basu's collaborations.

Top Co-Authors

Abhik Ghosh (Indian Statistical Institute)
Leandro Pardo (Complutense University of Madrid)
Abhijit Mandal (Indian Statistical Institute)
Srabashi Basu (Indian Statistical Institute)
Ian R. Harris (Southern Methodist University)
Subir Kumar Bhandari (Indian Statistical Institute)
Nirian Martín (Complutense University of Madrid)
Avijit Maji (Indian Statistical Institute)