Publication


Featured research published by Rohana J. Karunamuni.


Journal of the American Statistical Association | 1999

An Improved Estimator of the Density Function at the Boundary

Shunpu Zhang; Rohana J. Karunamuni; M. C. Jones

We propose a new method of boundary correction for kernel density estimation. The technique is a kind of generalized reflection method involving reflecting a transformation of the data. The transformation depends on a pilot estimate of the logarithmic derivative of the density at the boundary. In simulations, the new method is seen to clearly outperform an earlier generalized reflection idea. It also has overall advantages over boundary kernel methods and a nonnegative adaptation thereof, although the latter are competitive in some situations. We also present the theory underlying the new methodology.
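
The basic (non-generalized) reflection idea that the paper improves on can be sketched in a few lines. The exponential sample, Gaussian kernel, and bandwidth below are illustrative choices, not the paper's setup:

```python
import numpy as np

def gauss(u):
    """Standard Gaussian kernel."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def reflection_kde(x, data, h):
    """Reflection-method KDE on [0, inf): each X_i contributes a kernel
    at X_i and at its reflection -X_i, so the mass that a boundary kernel
    would place on the negative half-line is folded back."""
    return (gauss((x - data) / h) + gauss((x + data) / h)).sum() / (len(data) * h)

rng = np.random.default_rng(0)
sample = rng.exponential(1.0, 5000)   # true density f(x) = exp(-x), so f(0) = 1
h = 0.2

# Naive kernel estimate at the boundary point x = 0: kernel mass leaks past
# the boundary, so f(0) is underestimated by roughly a factor of two.
naive0 = gauss(sample / h).sum() / (len(sample) * h)

# Reflection estimate at x = 0 recovers the leaked mass (some bias remains
# here because the exponential density has no shoulder: f'(0) != 0).
refl0 = reflection_kde(0.0, sample, h)
```

For this sample, naive0 comes out near 0.43 and refl0 near 0.86 against the true value f(0) = 1, illustrating both the boundary effect and why simple reflection is only fully bias-correcting when the density has a shoulder at the endpoint.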


Journal of Statistical Planning and Inference | 1998

On kernel density estimation near endpoints

Shunpu Zhang; Rohana J. Karunamuni

In this paper, we consider the estimation of f(0), the value of a density f at the left endpoint 0. Nonparametric estimation of f(0) is rather formidable due to the boundary effects that occur in nonparametric curve estimation. It is well known that the usual kernel density estimates require modifications when estimating the density near endpoints of the support. Here we investigate the local polynomial smoothing technique as a possible alternative method for the problem. It is observed that our density estimator also possesses desirable properties such as automatic adaptability to boundary effects near endpoints. We also obtain an 'optimal kernel' for estimating the density at endpoints as the solution of a variational problem. Two bandwidth variation schemes are discussed and investigated in a Monte Carlo study.


Journal of Nonparametric Statistics | 2000

On nonparametric density estimation at the boundary

Shunpu Zhang; Rohana J. Karunamuni

Boundary effects are well known to occur in nonparametric density estimation when the support of the density has a finite endpoint. The usual kernel density estimators require modifications when estimating the density near endpoints of the support. In this paper, we propose a new and intuitive method of removing boundary effects in density estimation. Our idea, which replaces the unwanted terms in the bias expansion by their estimators, offers new ways of constructing boundary kernels and optimal endpoint kernels. We also discuss the choice of bandwidth variation functions at the boundary region. The performance of our estimators is analyzed numerically in a Monte Carlo study.


Annals of the Institute of Statistical Mathematics | 2000

Boundary Bias Correction for Nonparametric Deconvolution

Shunpu Zhang; Rohana J. Karunamuni

In this paper we consider the deconvolution problem in nonparametric density estimation: one wishes to estimate the unknown density fX of a random variable X based on observations of Y, where Y = X + ε with ε denoting the error. Previous results on this problem have considered the estimation of fX at interior points. Here we study the deconvolution problem for boundary points. A kernel-type estimator is proposed, and its mean squared error properties, including the rates of convergence, are investigated for supersmooth and ordinary smooth error distributions. Results of a simulation study are also presented.
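
A hedged sketch of a kernel-type deconvolution estimator of the general kind discussed here (at an interior point, not the paper's boundary construction): Fourier inversion with a truncated frequency band, i.e. a sinc-kernel estimator, under a supersmooth Gaussian error. The error scale, truncation point T, and grid sizes are illustrative assumptions:

```python
import numpy as np

def deconv_density(x_pts, y, sigma_eps, T, m=801):
    """Deconvolution density estimate by Fourier inversion: divide the
    empirical characteristic function of the contaminated data Y by the
    known characteristic function of the N(0, sigma_eps^2) error, then
    invert over the truncated band [-T, T] (T acts as an inverse bandwidth)."""
    t = np.linspace(-T, T, m)
    dt = t[1] - t[0]
    ecf = np.exp(1j * np.outer(y, t)).mean(axis=0)    # empirical char. fn of Y
    phi_eps = np.exp(-0.5 * (sigma_eps * t) ** 2)     # char. fn of the error
    ratio = ecf / phi_eps
    est = [np.real(np.exp(-1j * t * xi) * ratio).sum() * dt / (2.0 * np.pi)
           for xi in np.atleast_1d(x_pts)]
    return np.array(est)

rng = np.random.default_rng(1)
n = 2000
x = rng.normal(0.0, 1.0, n)              # latent variable, f_X = N(0, 1)
y = x + rng.normal(0.0, 0.5, n)          # observed, contaminated by Gaussian error
fhat0 = deconv_density(0.0, y, sigma_eps=0.5, T=3.0)[0]
# true f_X(0) = 1/sqrt(2*pi), about 0.399
```

Because the Gaussian error is supersmooth, 1/phi_eps blows up exponentially in t, which is why T must grow only very slowly with n; this is the source of the slow convergence rates the abstract refers to for supersmooth errors.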


Journal of Multivariate Analysis | 2012

Efficient Hellinger distance estimates for semiparametric models

Jingjing Wu; Rohana J. Karunamuni

Minimum distance techniques have become increasingly important tools for solving statistical estimation and inference problems. In particular, the successful application of the Hellinger distance approach to fully parametric models is well known. The corresponding optimal estimators, known as minimum Hellinger distance estimators, achieve efficiency at the model density and simultaneously possess excellent robustness properties. For statistical models that are semiparametric, in that they have a potentially infinite dimensional unknown nuisance parameter, minimum distance methods have not been fully studied. In this paper, we extend the Hellinger distance approach to general semiparametric models and study minimum Hellinger distance estimators for semiparametric models. Asymptotic properties such as consistency, asymptotic normality, efficiency and adaptivity of the proposed estimators are investigated. Small sample and robustness properties of the proposed estimators are also examined using a Monte Carlo study. Two real data examples are analyzed as well.
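
The fully parametric minimum Hellinger distance estimator (Beran, 1977) that this work extends can be sketched for a normal location model. The bandwidth rule, grid, and contamination scenario below are illustrative assumptions, not the paper's semiparametric construction:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def mhd_location(data):
    """Minimum Hellinger distance estimate of theta in the N(theta, 1)
    model: minimise the squared Hellinger distance between a Gaussian
    kernel density estimate and the model density, with all integrals
    approximated on a grid."""
    n = len(data)
    h = 1.06 * data.std() / n ** 0.2                  # rule-of-thumb bandwidth
    grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 400)
    dx = grid[1] - grid[0]
    fhat = norm.pdf(grid[:, None], loc=data[None, :], scale=h).mean(axis=1)

    def hellinger_sq(theta):                          # 2 - 2 * affinity
        return 2.0 - 2.0 * np.sum(np.sqrt(fhat * norm.pdf(grid, loc=theta))) * dx

    return minimize_scalar(hellinger_sq, bounds=(grid[0], grid[-1]),
                           method="bounded").x

rng = np.random.default_rng(2)
clean = rng.normal(1.0, 1.0, 500)
theta_clean = mhd_location(clean)                     # efficient: close to 1

# Robustness: about 5% gross outliers barely move the MHD estimate,
# while the sample mean is dragged toward the outliers.
dirty = np.concatenate([clean, np.full(25, 10.0)])
theta_dirty = mhd_location(dirty)
```

The contaminated-sample comparison mirrors the efficiency-plus-robustness trade-off described in the abstract: the Hellinger affinity gives almost no weight to density mass far from the fitted model, so gross outliers have little leverage.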


Sequential Analysis | 1992

Empirical Bayes sequential estimation of the mean

Rohana J. Karunamuni

We consider the empirical Bayes problem where the component problem is the sequential estimation of the mean of a distribution with squared error decision loss plus a sampling cost. An empirical Bayes sequential estimation procedure is exhibited which is asymptotically optimal. Asymptotic efficiency of the empirical Bayes stopping time sequence is also established. The performance of the proposed empirical Bayes procedure is studied with the help of a Monte Carlo study.


Computational Statistics & Data Analysis | 2011

One-step minimum Hellinger distance estimation

Rohana J. Karunamuni; Jingjing Wu

It is now well known that the minimum Hellinger distance estimation approach introduced by Beran (Beran, R., 1977. Minimum Hellinger distance estimators for parametric models. Ann. Statist. 5, 445-463) produces estimators that achieve efficiency at the model density and simultaneously have excellent robustness properties. However, computational difficulties and algorithmic convergence problems associated with this method have hampered its application in practice, particularly when it is applied to models with high-dimensional parameter spaces. A one-step minimum Hellinger distance (MHD) procedure is investigated in this paper to overcome the computational drawbacks of the fully iterative MHD method. The idea is to start with an initial estimator and then perform a single Newton-Raphson iteration on the Hellinger distance objective. The resulting estimator can be considered a one-step MHD estimator. We show that the proposed one-step MHD estimator has the same asymptotic behavior as the fully iterative MHD estimator, provided the initial estimator is reasonably good. Furthermore, our theoretical and numerical studies demonstrate that the one-step MHD estimator retains the excellent robustness properties of MHD estimators. A real data example is analyzed as well.
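
A minimal sketch of the one-step idea for a normal location model, assuming the sample median as the initial estimator and finite-difference derivatives in place of analytic ones (the model, bandwidth rule, and grid are illustrative choices, not the paper's):

```python
import numpy as np
from scipy.stats import norm

def one_step_mhd(data, theta0, eps=1e-3):
    """One-step MHD sketch for the N(theta, 1) model: evaluate the squared
    Hellinger distance between a kernel density estimate and the model
    density, then take a single Newton-Raphson step from theta0, with
    derivatives approximated by central finite differences."""
    n = len(data)
    h = 1.06 * data.std() / n ** 0.2
    grid = np.linspace(data.min() - 3.0, data.max() + 3.0, 400)
    dx = grid[1] - grid[0]
    fhat = norm.pdf(grid[:, None], loc=data[None, :], scale=h).mean(axis=1)

    def obj(theta):
        return 2.0 - 2.0 * np.sum(np.sqrt(fhat * norm.pdf(grid, loc=theta))) * dx

    d1 = (obj(theta0 + eps) - obj(theta0 - eps)) / (2.0 * eps)
    d2 = (obj(theta0 + eps) - 2.0 * obj(theta0) + obj(theta0 - eps)) / eps ** 2
    return theta0 - d1 / d2               # single Newton-Raphson update

rng = np.random.default_rng(3)
data = rng.normal(1.0, 1.0, 500)
theta1 = one_step_mhd(data, theta0=np.median(data))   # one step from the median
```

The point of the construction is computational: no iteration-until-convergence is needed, yet one Newton step from a root-n-consistent start already lands essentially at the fully iterated minimizer.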


Journal of Nonparametric Statistics | 2010

Boundary performance of the beta kernel estimators

Shunpu Zhang; Rohana J. Karunamuni

The beta kernel estimators are shown in Chen [S.X. Chen, Beta kernel estimators for density functions, Comput. Statist. Data Anal. 31 (1999), pp. 131–145] to be non-negative and to have less severe boundary problems than the conventional kernel estimator. Numerical results in the same paper further show that beta kernel estimators have better finite-sample performance than some widely used boundary-corrected estimators. However, our study finds that Chen's numerical comparisons are confounded with the choice of bandwidths and the quantities being compared. In this paper, we show that the performance of the beta kernel estimators is very similar to that of the reflection estimator, which is free of boundary problems only for densities exhibiting a shoulder at the endpoints of the support. For densities not exhibiting a shoulder, we show that the beta kernel estimators have a serious boundary problem and that their boundary performance is inferior to that of the well-known boundary kernel estimator.
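
Chen's beta kernel estimator is straightforward to implement with scipy. The uniform sample below (a density with a shoulder at both endpoints, the favorable case per the abstract) and the smoothing parameter b are illustrative choices:

```python
import numpy as np
from scipy.stats import beta

def beta_kernel_kde(x_pts, data, b):
    """Chen's (1999) beta kernel density estimator on [0, 1]: at each
    evaluation point x the kernel is the Beta(x/b + 1, (1-x)/b + 1)
    density, whose support [0, 1] matches the support of the data, so
    no kernel mass is placed outside the boundary."""
    return np.array([beta.pdf(data, xi / b + 1.0, (1.0 - xi) / b + 1.0).mean()
                     for xi in np.atleast_1d(x_pts)])

rng = np.random.default_rng(4)
data = rng.uniform(0.0, 1.0, 2000)       # true density is 1 on [0, 1]
est = beta_kernel_kde([0.0, 0.5, 1.0], data, b=0.05)
# all three estimates, including the two endpoint ones, stay near 1
```

Note the kernel shape varies with the evaluation point: near x = 0 it degenerates toward a decreasing Beta(1, 1/b + 1) density, which is what avoids leaking mass past the boundary without any explicit correction.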


Statistics & Probability Letters | 2003

A semiparametric method of boundary correction for kernel density estimation

Tom Alberts; Rohana J. Karunamuni

We propose a new estimator for boundary correction in kernel density estimation. Our method is based on the local Bayes techniques of Hjort (Bayesian Statist. 5 (1996) 223). The resulting estimator is a semiparametric-type estimator: a weighted average of an initial guess and the ordinary reflection-method estimator. The proposed estimator is seen to perform quite well compared to other existing well-known estimators for densities satisfying the shoulder condition at the endpoints.


Statistics & Probability Letters | 1990

Improvements on strong uniform consistency of some known kernel estimates of a density and its derivatives

Rohana J. Karunamuni; K.L. Mehra

Some known kernel-type estimates of a density f and its derivatives f^(p) are considered. Strong uniform consistency properties over the whole real line are studied. Improved rates of convergence are established under substantially weaker smoothness assumptions on f^(p), p ≥ 0. A new bias-reduction technique is presented, based on Bernstein polynomials and on notions and relations from the calculus of finite differences.
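
The paper's bias-reduction device is built on Bernstein polynomials. As a hedged illustration of the underlying idea (not the authors' estimator, which works on the whole real line), here is the classical Bernstein polynomial density estimator on [0, 1]:

```python
import numpy as np
from scipy.stats import beta

def bernstein_density(x_pts, data, m):
    """Bernstein polynomial density estimate on [0, 1]: smooth the
    empirical distribution F_n with the Bernstein basis.  The estimate
    is a mixture of Beta(k+1, m-k) densities, k = 0, ..., m-1, weighted
    by the empirical mass of the cell (k/m, (k+1)/m]."""
    n = len(data)
    edges = np.arange(m + 1) / m
    weights = np.histogram(data, bins=edges)[0] / n   # F_n((k+1)/m) - F_n(k/m)
    x_pts = np.atleast_1d(x_pts)
    out = np.zeros_like(x_pts, dtype=float)
    for k, w in enumerate(weights):
        if w > 0:
            out += w * beta.pdf(x_pts, k + 1, m - k)
    return out

rng = np.random.default_rng(5)
data = rng.uniform(0.0, 1.0, 2000)
est = bernstein_density(0.5, data, m=20)[0]   # true density is 1
```

The degree m plays the role of an inverse bandwidth: the Bernstein basis functions narrow as m grows, trading bias for variance much like a shrinking kernel bandwidth.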

Collaboration


Rohana J. Karunamuni's top co-authors and their affiliations.

Top Co-Authors

Shunpu Zhang (University of Nebraska–Lincoln)
Qingguo Tang (Nanjing University of Science and Technology)
Tom Alberts (California Institute of Technology)
JunJie Wu (University of Alberta)
Pengfei Li (University of Waterloo)