
Publication


Featured research published by Hiroshi Kurata.


Archive | 2004

Generalized Least Squares

Takeaki Kariya; Hiroshi Kurata

Preface.
1 Preliminaries: 1.1 Overview; 1.2 Multivariate Normal and Wishart Distributions; 1.3 Elliptically Symmetric Distributions; 1.4 Group Invariance; 1.5 Problems.
2 Generalized Least Squares Estimators: 2.1 Overview; 2.2 General Linear Regression Model; 2.3 Generalized Least Squares Estimators; 2.4 Finiteness of Moments and Typical GLSEs; 2.5 Empirical Example: CO2 Emission Data; 2.6 Empirical Example: Bond Price Data; 2.7 Problems.
3 Nonlinear Versions of the Gauss-Markov Theorem: 3.1 Overview; 3.2 Generalized Least Squares Predictors; 3.3 A Nonlinear Version of the Gauss-Markov Theorem in Prediction; 3.4 A Nonlinear Version of the Gauss-Markov Theorem in Estimation; 3.5 An Application to GLSEs with Iterated Residuals; 3.6 Problems.
4 SUR and Heteroscedastic Models: 4.1 Overview; 4.2 GLSEs with a Simple Covariance Structure; 4.3 Upper Bound for the Covariance Matrix of a GLSE; 4.4 Upper Bound Problem for the UZE in an SUR Model; 4.5 Upper Bound Problems for a GLSE in a Heteroscedastic Model; 4.6 Empirical Example: CO2 Emission Data; 4.7 Problems.
5 Serial Correlation Model: 5.1 Overview; 5.2 Upper Bound for the Risk Matrix of a GLSE; 5.3 Upper Bound Problem for a GLSE in the Anderson Model; 5.4 Upper Bound Problem for a GLSE in a Two-equation Heteroscedastic Model; 5.5 Empirical Example: Automobile Data; 5.6 Problems.
6 Normal Approximation: 6.1 Overview; 6.2 Uniform Bounds for Normal Approximations to the Probability Density Functions; 6.3 Uniform Bounds for Normal Approximations to the Cumulative Distribution Functions; 6.4 Problems.
7 Extension of Gauss-Markov Theorem: 7.1 Overview; 7.2 An Equivalence Relation on S(n); 7.3 A Maximal Extension of the Gauss-Markov Theorem; 7.4 Nonlinear Versions of the Gauss-Markov Theorem; 7.5 Problems.
8 Some Further Extensions: 8.1 Overview; 8.2 Concentration Inequalities for the Gauss-Markov Estimator; 8.3 Efficiency of GLSEs under Elliptical Symmetry; 8.4 Degeneracy of the Distributions of GLSEs; 8.5 Problems.
9 Growth Curve Model and GLSEs: 9.1 Overview; 9.2 Condition for the Identical Equality between the GME and the OLSE; 9.3 GLSEs and Nonlinear Version of the Gauss-Markov Theorem; 9.4 Analysis Based on a Canonical Form; 9.5 Efficiency of GLSEs; 9.6 Problems.
A Appendix: A.1 Asymptotic Equivalence of the Estimators of theta in the AR(1) Error Model and Anderson Model.
Bibliography. Index.
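The generalized least squares estimator studied throughout the book has the closed form beta_hat = (X' Sigma^{-1} X)^{-1} X' Sigma^{-1} y for the model y = X beta + e with Cov(e) proportional to Sigma. A minimal numerical sketch (the function name and interface are illustrative, not from the book):

```python
import numpy as np

def gls_estimator(X, y, Sigma):
    """GLSE beta_hat = (X' Sigma^{-1} X)^{-1} X' Sigma^{-1} y
    for y = X beta + e with Cov(e) proportional to Sigma."""
    Si = np.linalg.inv(Sigma)
    # Solve the normal equations (X' Si X) beta = X' Si y.
    return np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)
```

When Sigma is the identity matrix, the GLSE reduces to the ordinary least squares estimator.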


Journal of Multivariate Analysis | 2011

Principal points of a multivariate mixture distribution

Shun Matsuura; Hiroshi Kurata

A set of n-principal points of a distribution is defined as a set of n points that optimally represent the distribution in terms of mean squared distance. It provides an optimal n-point-approximation of the distribution. However, it is in general difficult to find a set of principal points of a multivariate distribution. Tarpey et al. [T. Tarpey, L. Li, B. Flury, Principal points and self-consistent points of elliptical distributions, Ann. Statist. 23 (1995) 103-112] established a theorem which states that any set of n-principal points of an elliptically symmetric distribution is in the linear subspace spanned by some principal eigenvectors of the covariance matrix. This theorem, called a principal subspace theorem, is a strong tool for the calculation of principal points. In practice, we often come across distributions consisting of several subgroups. Hence it is of interest to know whether the principal subspace theorem remains valid even under such complex distributions. In this paper, we define a multivariate location mixture model. A theorem is established that clarifies a linear subspace in which n-principal points exist.
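Since n-principal points are the population analogue of n-means cluster centers, they can be approximated for a sampled distribution by running Lloyd's algorithm on a large Monte Carlo sample. A minimal sketch (the function name and parameters are illustrative, not from the paper):

```python
import numpy as np

def approx_principal_points(sample, n, iters=100, seed=0):
    """Approximate n-principal points of the sampled distribution
    by Lloyd's (k-means) algorithm, minimizing mean squared distance."""
    rng = np.random.default_rng(seed)
    # Initialize centers with n distinct sample points (fancy indexing copies).
    centers = sample[rng.choice(len(sample), n, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center.
        d2 = ((sample[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(d2, axis=1)
        # Move each center to the mean of its assigned points.
        for j in range(n):
            pts = sample[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers
```

For an elliptically symmetric sample, the principal subspace theorem predicts that the returned centers lie close to the span of the leading eigenvectors of the covariance matrix.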


Communications in Statistics-theory and Methods | 2013

Definition and Properties of m-Dimensional n-Principal Points

Shun Matsuura; Hiroshi Kurata

In this article, we introduce the notion of “m-dimensional n-principal points,” which is a generalization of the notion of n-principal points. A set of m-dimensional n-principal points of a distribution is defined as a set of n points that optimally represents the distribution in terms of mean squared distance subject to the condition that the dimension of the linear subspace spanned by the n points is at most m. Its properties and connections to principal components are investigated for elliptically symmetric distributions and a location mixture of spherically symmetric distributions.


Communications in Statistics-theory and Methods | 2011

Linear Subspace Spanned by Principal Points of a Mixture of Spherically Symmetric Distributions

Hiroshi Kurata; Dingxi Qiu

For each positive integer k, a set of k-principal points of a distribution is the set of k points that optimally represents the distribution in terms of mean squared distance. However, an explicit form of k-principal points is often difficult to obtain. Hence a theorem established by Tarpey et al. (1995), which states that when the distribution is elliptically symmetric, any set of k-principal points is in the linear subspace spanned by some principal eigenvectors of the covariance matrix, has been influential in the literature. This theorem is called a "principal subspace theorem". Recently, Yamamoto and Shinozaki (2000b) derived a principal subspace theorem for 2-principal points of a location mixture of spherically symmetric distributions. In their article, the mixture ratio was assumed to be equal. This article derives a further result by considering a location mixture with an unequal mixture ratio.


Electronic Notes in Discrete Mathematics | 2008

Determining the minimum rank of matroids whose basis graph is common

Masahiro Hachimori; Hiroshi Kurata; Tadashi Sakuma

A graph G is called a matroid basis graph if it is isomorphic to a simple undirected graph whose vertices are the bases of some matroid and in which two distinct vertices are adjacent if and only if the corresponding bases can be transformed into each other by a single-element exchange. Let r_min(G) denote the minimum rank of the matroids whose matroid basis graph is G in common. In this note, we show a formula that expresses the value r_min in terms of the distance matrix of G. Using it, we obtain an O(n^3)-time algorithm to determine r_min, where n = |V(G)| is the number of bases in the corresponding matroid.
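The note's formula for r_min is not reproduced here, but its O(n^3) ingredient is the distance matrix of G, which can be computed by the Floyd-Warshall algorithm. A minimal sketch of that step (the function name is illustrative):

```python
import numpy as np

def distance_matrix(adj):
    """All-pairs shortest-path distances of an unweighted graph,
    given as a 0/1 adjacency matrix, via Floyd-Warshall in O(n^3)."""
    n = len(adj)
    INF = n + 1  # basis graphs are connected, so every distance is < n
    D = np.where(np.array(adj, dtype=bool), 1, INF)
    np.fill_diagonal(D, 0)
    for k in range(n):
        # Relax all pairs through intermediate vertex k.
        D = np.minimum(D, D[:, [k]] + D[[k], :])
    return D
```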


Journal of Statistical Planning and Inference | 2008

On principal points for location mixtures of spherically symmetric distributions

Hiroshi Kurata


Linear Algebra and its Applications | 2010

Multispherical Euclidean distance matrices

Hiroshi Kurata; Pablo Tarazaga


Linear Algebra and its Applications | 2012

Majorization for the eigenvalues of Euclidean distance matrices

Hiroshi Kurata; Pablo Tarazaga


Journal of Multivariate Analysis | 2008

Allometric extension model for conditional distributions

Hiroshi Kurata; Takahiro Hoshino; Yasunori Fujikoshi


Statistical Papers | 2010

A theorem on the covariance matrix of a generalized least squares estimator under an elliptically symmetric error

Hiroshi Kurata
