Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Divakar Sharma is active.

Publication


Featured research published by Divakar Sharma.


Communications in Statistics - Theory and Methods | 1988

Simultaneous estimation of ordered parameters

Somesh Kumar; Divakar Sharma

The problem of estimating ordered parameters is encountered in biological, agricultural, reliability and various other experiments. Consider two populations with densities f1(x1 − ω1) and f2(x2 − ω2), where ω1 ≤ ω2. The estimation of (ω1, ω2) under the loss function given by the sum of squared errors is studied. When fi is the N(ωi, σi²) density with σi known, i = 1, 2, we obtain a class of minimax estimators. When ω1 ≤ ω2, we show that some of these estimators are improved by the maximum likelihood estimator. For a general fi, we give sufficient conditions for the minimaxity of the analogue of the Pitman estimator.
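The effect of the order restriction can be made concrete with a small numerical sketch. This is a hypothetical illustration, not the paper's construction: for two normal means with known variances, the order-restricted MLE keeps the sample means when they already satisfy ω1 ≤ ω2, and otherwise pools them into a precision-weighted average.

```python
import numpy as np

def restricted_mle(x1, x2, s1=1.0, s2=1.0):
    """MLE of ordered normal means (w1 <= w2) with known std devs s1, s2.

    Illustrative sketch only: if the raw sample means respect the order
    they are the restricted MLE; otherwise both components collapse to
    the precision-weighted pooled mean.
    """
    m1, m2 = np.mean(x1), np.mean(x2)
    if m1 <= m2:
        return m1, m2
    w1 = len(x1) / s1**2          # precision of the first sample mean
    w2 = len(x2) / s2**2          # precision of the second sample mean
    pooled = (w1 * m1 + w2 * m2) / (w1 + w2)
    return pooled, pooled

# Order violated (means 1.0 and 0.0): both components are pooled.
e1, e2 = restricted_mle([0.0, 2.0], [-1.0, 1.0])
assert e1 <= e2   # the estimate always respects the assumed ordering
```

The pooling step is what distinguishes the restricted MLE from the unrestricted sample means and is the source of the risk improvement discussed in the abstract.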


Communications in Statistics - Theory and Methods | 1988

Estimation of the mean of the selected gamma population

P. Vellaisamy; Divakar Sharma

Let π1 and π2 denote two independent gamma populations G(α1, p) and G(α2, p) respectively. Assume the αi (i = 1, 2) are unknown and the common shape parameter p is a known positive integer. Let Ȳi denote the sample mean based on a random sample of size n from the i-th population. For selecting the population with the larger mean, we consider the natural rule according to which the population corresponding to the larger Ȳi is selected. We consider, in this paper, the estimation of M, the mean of the selected population. It is shown that the natural estimator is positively biased. We obtain the uniformly minimum variance unbiased estimator (UMVUE) of M. We also consider certain subclasses of estimators of the form c1X(1) + c2X(2) and derive admissible estimators in these classes. The minimaxity of certain estimators of interest is investigated. It is shown that p(p+1)⁻¹X(1) is minimax and dominates the UMVUE; the UMVUE itself is not minimax.
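The positive bias of the natural estimator is easy to see in a Monte Carlo sketch. All constants below (shape, sample size, scales) are arbitrary illustrative choices, not the paper's setting: after selecting the population with the larger sample mean, that same sample mean systematically overshoots the selected population's true mean.

```python
import numpy as np

# Monte Carlo sketch of the selection bias: the natural rule picks the
# population with the larger sample mean, and then reuses that sample
# mean as the estimate of the selected population's mean M.
rng = np.random.default_rng(1)
p, n, reps = 2, 5, 100_000          # known shape p, sample size n
scales = np.array([1.0, 1.0])       # equal scales: equal true means
true_means = p * scales             # mean of G(alpha, p) is p * alpha

y1 = rng.gamma(p, scales[0], size=(reps, n)).mean(axis=1)
y2 = rng.gamma(p, scales[1], size=(reps, n)).mean(axis=1)
natural_est = np.maximum(y1, y2)                          # natural estimator
true_selected = np.where(y1 >= y2, true_means[0], true_means[1])
bias = float((natural_est - true_selected).mean())
print(f"estimated bias of the natural estimator: {bias:.3f}")  # positive
```

With equal population means the selection always "wins by luck", which is exactly the situation in which the overestimation is most pronounced.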


Calcutta Statistical Association Bulletin | 1983

Orthogonal Equivariant Minimax Estimators of Bivariate Normal Covariance Matrix and Precision Matrix

Divakar Sharma; K. Krishnamoorthy

The scale and orthogonal equivariant minimax estimators are obtained for the bivariate normal covariance matrix and precision matrix under Selliah's (1964) and Stein's (1961) loss functions. These new estimators are better than Selliah's and Stein's minimax estimators. An unbiased estimator of the risk of the new estimator is obtained under Selliah's loss function using Haff's (1979) identity for the Wishart distribution. Simulation results seem to indicate that the new estimators dominate the corresponding Haff (1979, 1980) estimators. We also prove that, for p = 2, Haff's estimators are not minimax.


Communications in Statistics - Theory and Methods | 1989

On the Pitman estimator of ordered normal means

Somesh Kumar; Divakar Sharma

Let X1, ..., Xk be independent normal random variables with means θ1, ..., θk and common variance unity. It is assumed that θ1 ≤ θ2 ≤ ... ≤ θk. We show that the Pitman estimator, the generalized Bayes estimator of (θ1, ..., θk) with respect to the uniform prior on the ordered parameter space, is minimax. When k = 2, the components of the Pitman estimator for estimating θ1 and θ2 are minimax (Cohen and Sackrowitz (1970)). However, for k = 3, we prove that a similar result does not hold for θ1 and θ3. Admissibility of the Pitman estimator in certain classes of estimators is also discussed.
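For k = 2 the generalized Bayes estimator under the uniform prior on {θ1 ≤ θ2} can be approximated numerically. The following is a brute-force sketch (grid integration of the truncated posterior), offered only to make the definition concrete; it is not the paper's closed-form analysis.

```python
import numpy as np

def pitman_ordered(x1, x2, half_width=8.0, n=400):
    """Generalized Bayes estimate of (theta1, theta2) under the uniform
    prior on {theta1 <= theta2}, for unit-variance normal observations.

    Numerical sketch: tabulate the posterior on a grid, zero it outside
    the order-restricted support, and take posterior means.
    """
    t = np.linspace(min(x1, x2) - half_width, max(x1, x2) + half_width, n)
    t1, t2 = np.meshgrid(t, t, indexing="ij")
    post = np.exp(-0.5 * ((x1 - t1) ** 2 + (x2 - t2) ** 2))
    post[t1 > t2] = 0.0            # prior support: theta1 <= theta2
    post /= post.sum()
    return float((t1 * post).sum()), float((t2 * post).sum())

d1, d2 = pitman_ordered(1.0, 0.0)   # observations violate the order
assert d1 <= d2                     # the estimate never does
```

Even when the observations violate the ordering, the posterior-mean components are pulled toward each other and always respect θ1 ≤ θ2.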


Calcutta Statistical Association Bulletin | 1980

An Estimator of Normal Covariance Matrix

Divakar Sharma

Selliah (1964), by considering the group of lower triangular matrices, suggested an estimator of the normal covariance matrix Σ when the mean vector is known and the loss function is tr(Σ⁻¹Σ̂ − I)². His estimator, besides being minimax, is better than the MLE of Σ. In this paper, we improve upon his estimator and calculate the reduction in the risk when Σ is 2×2.


Statistics & Probability Letters | 1996

A note on estimating quantiles of exponential populations

Somesh Kumar; Divakar Sharma

Independent random samples from k exponential populations with the same location parameter θ but different scale parameters σ1, ..., σk are available. We estimate the quantile η1 = θ + bσ1 of the first population with respect to squared error loss. Sharma and Kumar (1994) derived the UMVUE of η1 and then obtained further improvements over it for b > n⁻¹. For 0 ≤ b
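The quantity being estimated can be illustrated with a plug-in sketch. Everything here is an assumption for illustration: the standard two-parameter exponential parameterization, the overall minimum as the estimate of the common location, and the sample-1 mean excess as the scale estimate. This naive plug-in is not the UMVUE or the improved estimators from the paper.

```python
import numpy as np

def plugin_quantile(samples, b):
    """Plug-in estimate of eta1 = theta + b*sigma1 from k exponential
    samples sharing a common location theta.

    Sketch only: theta is estimated by the overall sample minimum (the
    MLE of a common exponential location), sigma1 by the mean excess of
    the first sample over that minimum.
    """
    theta_hat = min(float(np.min(x)) for x in samples)   # common location
    sigma1_hat = float(np.mean(samples[0] - theta_hat))  # scale of pop. 1
    return theta_hat + b * sigma1_hat

rng = np.random.default_rng(2)
theta, sigmas = 1.0, (2.0, 0.5)      # hypothetical parameter values
samples = [theta + rng.exponential(s, size=30) for s in sigmas]
print(plugin_quantile(samples, b=0.5))
```

The point of the paper's comparison is that such plug-in and unbiased estimators can themselves be dominated, depending on where b falls relative to 1/n.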


Calcutta Statistical Association Bulletin | 1985

Improved Minimax Estimators of Normal Covariance and Precision Matrices from Incomplete Samples

Divakar Sharma; K. Krishnamoorthy

Let X be an N_p(0, Σ) random vector. Suppose, besides n observations on X, m observations on the first q (q < p) coordinates are available. Eaton (1970), for this set-up, has given a minimax estimator of Σ which is better than the MLE. We, in this paper, obtain a class of constant-risk minimax estimators (Eaton's estimator is a member), and hence estimators better than any member of this class. Similar results are derived for the estimation of Σ⁻¹. The loss functions considered are those of Selliah (1964) and James and Stein (1961) for the estimation of Σ, and an analogue of Stein's loss function for the estimation of Σ⁻¹.


Statistics | 1993

Unbiased inestimability of the larger of two parameters

Somesh Kumar; Divakar Sharma

Blumenthal and Cohen (1968c) and Dhariyal, Sharma and Krishnamoorthy (1985) considered the question of the existence of unbiased estimators of the larger (smaller) of two parameters. In this paper, we first show the non-existence of such an unbiased estimator in the case of two double exponential populations with unknown locations. We also give a general inadmissibility result and apply it to the uniform distribution.


Statistics | 1995

Estimating the Common Location

Somesh Kumar; Divakar Sharma

Suppose we have independent random samples of sizes m and n respectively from two populations characterized by a common location parameter θ and unknown scale parameters. Let ψx and ψy be odd location estimators of θ based on the individual samples. Cohen (1976) suggested a combined estimator δa which, under mild conditions on the density, is unbiased and improves ψx for all a such that 0 < a ≤ a*(m, n), where a*(m, n) is a constant and m, n denote the two sample sizes. Bhattacharya (1981) enlarged this class by finding a larger upper bound A(m, n) for a. In this paper, we further improve A(m, n) and establish dominance of δa over both ψx and ψy in cases where Bhattacharya's bound does not help. We also consider an unbiased estimator different from δa and determine conditions on c to improve both ψx and ψy. Our results also take into account cases not covered by Akai (1982).
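The shape of such a combined estimator can be sketched as follows, under two assumptions of ours that the abstract does not state verbatim: that δa is the convex combination (1 − a)ψx + aψy, and that the odd location estimators are sample medians.

```python
import numpy as np

def combined_location(x, y, a):
    """Combine two odd location estimators of a common location theta.

    Sketch only: assumes delta_a = (1 - a) * psi_x + a * psi_y, with
    sample medians (which are odd estimators) in the role of psi_x and
    psi_y.  The admissible range 0 < a <= a*(m, n) depends on the
    densities and sample sizes and is not computed here.
    """
    psi_x = np.median(x)
    psi_y = np.median(y)
    return (1.0 - a) * psi_x + a * psi_y

rng = np.random.default_rng(3)
x = rng.standard_normal(15) * 2.0   # scale 2.0, location 0 (hypothetical)
y = rng.standard_normal(25) * 0.5   # scale 0.5, same location
print(combined_location(x, y, a=0.5))
```

Borrowing strength from the second sample in this way is what allows δa to beat ψx alone; the cited papers characterize how large a may be before the improvement is lost.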


Communications in Statistics - Theory and Methods | 1983

Sufficiency of a sufficient statistic for equivariant estimation

Divakar Sharma

Suppose an estimation problem is invariant under a group of transformations and one is interested in finding an optimal equivariant estimator. The usual practice is to confine attention to non-randomized equivariant estimators based on a minimal sufficient statistic. A justification of this restriction to a smaller class of estimators is given in this paper under certain conditions.

Collaboration


Dive into Divakar Sharma's collaborations.

Top Co-Authors

Somesh Kumar

Indian Institute of Technology Kharagpur

P. Vellaisamy

Indian Institute of Technology Bombay

K. Krishnamoorthy

University of Louisiana at Lafayette
