Yoshihiko Maesono
Kyushu University
Publications
Featured research published by Yoshihiko Maesono.
Journal of Statistical Planning and Inference | 1997
Yoshihiko Maesono
Abstract We obtain the H-decomposition of a jackknife estimator of the variance of a U-statistic and derive an Edgeworth expansion with remainder term o(n^{-1/2}). We also obtain an approximation of a studentized U-statistic, substituting the jackknife estimator of the variance. An Edgeworth expansion with remainder term o(n^{-1}) is established for the studentized U-statistic with a kernel of arbitrary degree.
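A minimal numerical sketch of the quantities in this abstract, not the paper's derivation: it computes a degree-2 U-statistic, its delete-1 jackknife variance estimate, and the resulting studentized value. The variance kernel, sample size, and true parameter value below are illustrative assumptions.

```python
import numpy as np

def u_statistic(x, kernel):
    """Degree-2 U-statistic: the average of kernel(x_i, x_j) over all pairs i < j."""
    n = len(x)
    vals = [kernel(x[i], x[j]) for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(vals))

def jackknife_variance(x, kernel):
    """Delete-1 jackknife estimate of Var(U_n): (n-1)/n times the sum of squared
    deviations of the leave-one-out U-statistics from their mean."""
    n = len(x)
    loo = np.array([u_statistic(np.delete(x, i), kernel) for i in range(n)])
    return (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

rng = np.random.default_rng(0)
x = rng.normal(size=40)                       # illustrative sample
kernel = lambda a, b: 0.5 * (a - b) ** 2      # kernel of the unbiased sample variance
theta = 1.0                                   # true variance of the N(0, 1) sample
u = u_statistic(x, kernel)
v = jackknife_variance(x, kernel)
print(f"U_n = {u:.4f}, studentized value = {(u - theta) / np.sqrt(v):.4f}")
```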
Annals of the Institute of Statistical Mathematics | 1987
Yoshihiko Maesono
Summary Distribution-free statistics are proposed for the one-sample location test and compared with the Wilcoxon signed rank test. It is shown that one of the proposed statistics is superior to the Wilcoxon test in terms of approximate Bahadur efficiency. We also compare that statistic with the Wilcoxon test from the viewpoint of the asymptotic expansion of the power function under contiguous alternatives.
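The proposed distribution-free statistics themselves are not available in standard libraries; as an illustration of the comparator only, the Python snippet below runs the Wilcoxon signed-rank test for the one-sample location problem on simulated data (the shift and sample size are illustrative assumptions).

```python
import numpy as np
from scipy import stats

# One-sample location test of H0: the distribution of X is symmetric about 0,
# using the Wilcoxon signed-rank statistic (the benchmark in the abstract).
rng = np.random.default_rng(1)
x = rng.normal(loc=0.3, scale=1.0, size=30)   # illustrative data under a shift alternative
stat, p_value = stats.wilcoxon(x)
print(f"Wilcoxon signed-rank statistic = {stat:.1f}, p-value = {p_value:.4f}")
```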
Journal of Statistical Planning and Inference | 2000
Yumi Fujioka; Yoshihiko Maesono
Abstract This paper proposes a normalizing transformation which removes bias, skewness and kurtosis simultaneously. Its convergence rate to the standard normal distribution is o(n^{-1}). The transformation is polynomial and monotone. We consider a class of asymptotic U-statistics, which includes most statistics of interest.
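For orientation only, the kind of polynomial transformation involved can be sketched as follows; the coefficients and the monotonicity correction are specific to the paper and are not reproduced here, so this is a generic schematic rather than the authors' formula.

```latex
% Generic schematic of a polynomial normalizing transformation (not the paper's
% exact construction); T_n denotes a standardized asymptotic U-statistic.
\[
  g(T_n) \;=\; T_n \;+\; n^{-1/2}\bigl(a_1 + a_2 T_n^{2}\bigr)
          \;+\; n^{-1}\bigl(b_1 T_n + b_2 T_n^{3}\bigr),
\]
\[
  P\bigl(g(T_n) \le x\bigr) \;=\; \Phi(x) + o(n^{-1}) \quad \text{uniformly in } x,
\]
% with a_1, a_2, b_1, b_2 determined by (estimated) cumulants of T_n so that the
% n^{-1/2} skewness term and the n^{-1} bias/kurtosis terms cancel.
```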
Annals of the Institute of Statistical Mathematics | 1998
Yoshihiko Maesono
In this paper we obtain asymptotic representations of several variance estimators of U-statistics and study their effects on studentization via Edgeworth expansions. The jackknife, unbiased and Sen's variance estimators are investigated up to the order o_p(n^{-1}). Substituting these estimators into studentized U-statistics, Edgeworth expansions with remainder term o(n^{-1}) are established, and by inverting the expansions the effects on confidence intervals are discussed theoretically. We also show that Hinkley's corrected jackknife variance estimator is asymptotically equivalent to the unbiased variance estimator up to the order o_p(n^{-1}).
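To make the comparison concrete, here is a small Python sketch that computes the delete-1 jackknife variance estimate of a degree-2 U-statistic alongside a Sen-type estimate built from the conditional-mean terms; the normalizing constants of the latter are a simple choice and may differ from the paper's exact definitions at order n^{-2}. Data and kernel are illustrative.

```python
import numpy as np

def degree2_u(x, kernel):
    """Degree-2 U-statistic and the conditional-mean terms q_i = mean over j != i of kernel(x_i, x_j)."""
    n = len(x)
    h = np.array([[kernel(a, b) for b in x] for a in x])
    q = (h.sum(axis=1) - np.diag(h)) / (n - 1)
    u = (h.sum() - np.trace(h)) / (n * (n - 1))
    return u, q

def jackknife_var(x, kernel):
    """Delete-1 jackknife estimate of Var(U_n)."""
    n = len(x)
    loo = np.array([degree2_u(np.delete(x, i), kernel)[0] for i in range(n)])
    return (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

def sen_type_var(x, kernel):
    """Sen-type estimate 4 * sigma1_hat^2 / n built from the q_i; the exact
    lower-order normalizing constants may differ from the paper's definition."""
    n = len(x)
    u, q = degree2_u(x, kernel)
    return 4.0 * np.sum((q - u) ** 2) / ((n - 1) * n)

rng = np.random.default_rng(2)
x = rng.exponential(size=50)
kern = lambda a, b: 0.5 * (a - b) ** 2        # kernel of the sample variance
print("jackknife:", jackknife_var(x, kern), " Sen-type:", sen_type_var(x, kern))
```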
Communications in Statistics-theory and Methods | 1986
Hajime Yamato; Yoshihiko Maesono
For a class of distributions which are invariant under a group of transformations, we propose an estimator of an estimable parameter. The estimator, which we call the invariant U-statistic, is the uniformly minimum variance unbiased estimator of the corresponding estimable parameter for the class of all continuous distributions which are invariant under the group of transformations.
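A hedged sketch of the idea for one concrete case: assuming the group is sign changes (so the model is the class of distributions symmetric about 0) and the kernel is that of the Gini mean difference, the version below averages the kernel over the group acting on its arguments. The general construction in the paper covers arbitrary groups and kernels.

```python
import numpy as np
from itertools import product

def invariant_u(x, kernel):
    """Degree-2 U-statistic whose kernel is averaged over sign changes of its
    arguments -- a sketch of an invariant U-statistic when the invariance group
    is the group of sign flips (distributions symmetric about 0)."""
    n = len(x)
    def sym_kernel(a, b):
        return np.mean([kernel(s * a, t * b) for s, t in product((-1.0, 1.0), repeat=2)])
    vals = [sym_kernel(x[i], x[j]) for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(vals))

rng = np.random.default_rng(3)
x = rng.standard_t(df=5, size=60)              # symmetric about 0
gini_kernel = lambda a, b: abs(a - b)          # Gini mean difference kernel
print("invariant U-statistic:", invariant_u(x, gini_kernel))
```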
Journal of The Royal Statistical Society Series B-statistical Methodology | 2000
Peter Hall; Yoshihiko Maesono
The operation of resampling from a bootstrap resample, encountered in applications of the double bootstrap, may be viewed as resampling directly from the sample but using probability weights that are proportional to the numbers of times that sample values appear in the resample. This suggests an approximate approach to double-bootstrap Monte Carlo simulation, where weighted bootstrap methods are used to circumvent much of the labour involved in compounded Monte Carlo approximation. In the case of distribution estimation or, equivalently, confidence interval calibration, the new method may be used to reduce the computational labour. Moreover, the method produces the same order of magnitude of coverage error for confidence intervals, or level error for hypothesis tests, as a full application of the double bootstrap.
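The equivalence the abstract starts from can be seen in a few lines of Python: resampling from an outer bootstrap resample is the same as resampling from the original data with probabilities proportional to the multiplicities. The sketch below records only that step (one inner draw per outer resample, an illustrative mean functional); it is not the paper's calibration algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=50)
n = len(x)
B = 200                                        # illustrative number of outer resamples

inner_deviations = np.empty(B)
for b in range(B):
    # Outer resample: keep only the multiplicity of each original observation.
    counts = rng.multinomial(n, np.full(n, 1.0 / n))
    weights = counts / n
    theta_star = np.dot(weights, x)            # mean of the outer resample
    # Weighted-bootstrap shortcut: instead of drawing an inner resample from the
    # outer resample, resample the original data with probabilities `weights`.
    idx = rng.choice(n, size=n, p=weights)
    inner_deviations[b] = x[idx].mean() - theta_star

print("spread of inner-level deviations:", inner_deviations.std())
```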
Communications in Statistics-theory and Methods | 1998
Yoshihiko Maesono
Some statistics in common use take the form of a ratio of two statistics. In this paper, we discuss asymptotic properties of such ratio statistics. We obtain an asymptotic representation of the ratio with remainder term o_p(n^{-1}) and an Edgeworth expansion with remainder term o(n^{-1/2}). As an example, the asymptotic representation and the Edgeworth expansion of the jackknife skewness estimator for U-statistics are established, and we discuss the bias of the skewness estimator theoretically. We also apply the results to an estimator of Pearson's coefficient of variation and to the sample correlation coefficient.
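As a concrete instance of a ratio statistic, the following Python sketch computes Pearson's coefficient of variation (a ratio of two statistics) together with its delete-1 jackknife bias estimate; the data and the choice of statistic are illustrative, and the paper's asymptotic representation and Edgeworth expansion are not implemented.

```python
import numpy as np

def coef_of_variation(x):
    """Pearson's coefficient of variation: a ratio of two statistics, sd / mean."""
    return x.std(ddof=1) / x.mean()

def jackknife_bias(x, statistic):
    """Delete-1 jackknife bias estimate: (n-1) * (mean of leave-one-out values - full value)."""
    n = len(x)
    full = statistic(x)
    loo = np.array([statistic(np.delete(x, i)) for i in range(n)])
    return (n - 1) * (loo.mean() - full)

rng = np.random.default_rng(5)
x = rng.lognormal(mean=1.0, sigma=0.4, size=40)
cv = coef_of_variation(x)
print(f"CV = {cv:.4f}, jackknife bias estimate = {jackknife_bias(x, coef_of_variation):.4f}")
```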
Journal of Statistical Planning and Inference | 1991
Yoshihiko Maesono
Abstract The rates of convergence to the normal distribution are investigated for U-statistics of degree two. We derive an inequality which gives a lower bound for the uniform distance between the distribution of a U-statistic and the standard normal distribution, and which is useful in the case where the distribution of the U-statistic is symmetric around the origin. The proof is based on Stein's method.
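A rough Monte Carlo illustration (not the paper's inequality): for the degree-two U-statistic given by the unbiased sample variance, the Kolmogorov distance between its standardized simulated distribution and N(0,1) shrinks roughly at the n^{-1/2} rate. The underlying distribution and the number of replications are illustrative, and Monte Carlo noise of order reps^{-1/2} blurs the picture for large n.

```python
import numpy as np
from scipy import stats

def u_var(x):
    """Degree-2 U-statistic with kernel 0.5*(x_i - x_j)^2, i.e. the unbiased sample variance."""
    return np.var(x, ddof=1)

rng = np.random.default_rng(6)
reps = 2000
for n in (10, 40, 160):
    sims = np.array([u_var(rng.exponential(size=n)) for _ in range(reps)])
    z = (sims - sims.mean()) / sims.std()      # empirical standardization
    d = stats.kstest(z, "norm").statistic      # sup-distance to the N(0, 1) cdf
    print(f"n = {n:4d}  Kolmogorov distance ~ {d:.3f}  (sqrt(n)*d ~ {np.sqrt(n) * d:.2f})")
```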
Communications in Statistics-theory and Methods | 2010
Yoshihiko Maesono
Some statistics in common use take the form of a ratio of two statistics, such as the sample correlation coefficient, Pearson's coefficient of variation, cumulant estimators, and so on. In this article, using an asymptotic representation of the ratio statistic, we obtain an Edgeworth expansion and a normalizing transformation with remainder term o(n^{-1/2}). The Edgeworth expansion is based on a studentized ratio statistic, studentized by a consistent variance estimator. Applying these results to the sample correlation coefficient, we obtain a normalizing transformation and an asymptotic confidence interval for the correlation coefficient without assuming a specific underlying distribution. This normalizing transformation is an extension of Fisher's z-transformation.
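The classical baseline that this transformation extends can be computed directly; the sketch below gives the usual Fisher z confidence interval for the correlation coefficient, whose n - 3 variance approximation relies on bivariate normality. The paper's distribution-free extension is not implemented here, and the simulated data are illustrative.

```python
import numpy as np
from scipy import stats

def fisher_z_interval(x, y, level=0.95):
    """Classical Fisher z confidence interval for the correlation coefficient.
    The 1/(n - 3) variance approximation assumes a bivariate normal population."""
    n = len(x)
    r = np.corrcoef(x, y)[0, 1]
    z = np.arctanh(r)                          # Fisher's z-transformation
    half = stats.norm.ppf(0.5 + level / 2) / np.sqrt(n - 3)
    return np.tanh(z - half), np.tanh(z + half)

rng = np.random.default_rng(7)
x = rng.normal(size=60)
y = 0.6 * x + 0.8 * rng.normal(size=60)
print("95% CI for rho:", fisher_z_interval(x, y))
```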
Journal of Nonparametric Statistics | 1996
Yoshihiko Maesono
In this paper we discuss jackknife estimators of the variance and their corrections in detail. Shao-Wu [10] studied a jackknife variance estimator based on deleting d observations from the original sample, and proved that this delete-d jackknife variance estimator is consistent even if the original estimator is not smooth. Their result is especially useful for jackknife estimation of the variance of the sample quantile. For smooth original estimators, however, which include U-statistics, the delete-d jackknife variance estimator is at least as large as the delete-1 estimator, the traditional jackknife variance estimator, and consequently has larger bias than the delete-1 estimator.
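A small Python sketch of one common form of the delete-d jackknife variance estimator, applied to the sample median (the non-smooth case where Shao-Wu's result matters); the scaling constant (n-d)/(d*N) is the usual choice, and the sample size and d values are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

def delete_d_jackknife_var(x, statistic, d):
    """Delete-d jackknife variance estimate: (n-d)/(d*N) times the sum over all
    N = C(n, d) size-d deletions of (theta_hat_{-S} - average)^2.  All subsets
    are enumerated, which is fine for the small n used here."""
    n = len(x)
    subsets = list(combinations(range(n), d))
    vals = np.array([statistic(np.delete(x, list(s))) for s in subsets])
    return (n - d) / (d * len(subsets)) * np.sum((vals - vals.mean()) ** 2)

rng = np.random.default_rng(8)
x = rng.normal(size=15)
# The sample median is the non-smooth case discussed above: the delete-1 jackknife
# is inconsistent for quantiles, whereas the delete-d version (d growing with n) is not.
print("delete-1:", delete_d_jackknife_var(x, np.median, d=1))
print("delete-4:", delete_d_jackknife_var(x, np.median, d=4))
```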