Kohei Adachi
Osaka University
Publications
Featured research published by Kohei Adachi.
Vision Research | 2006
Atsuki Higashiyama; Kohei Adachi
We investigated, in three comparisons, the perceived size and perceived distance of targets seen from between the legs. Five targets, varying from 32 to 163 cm in height, were presented at viewing distances of 2.5-45 m, and a total of 90 observers verbally judged the perceived size and perceived distance of each target. In comparison 1, 15 observers turned their heads upside down and viewed the targets between their own legs; another 15 observers viewed them while standing upright on the ground. The results showed that inverting the head lowered the degree of size constancy and compressed the scale for distance. To examine whether these results were due to the inversion of retinal-image orientation or of body orientation, comparisons 2 and 3 were performed. In comparison 2, 15 observers stood upright and viewed the targets through prism goggles that rotated the visual field 180°, while the other 15 observers stood upright but viewed the targets through a hollow frame lacking the prisms. In both goggle conditions, size constancy prevailed and perceived distance was a linear function of physical distance. In comparison 3, 15 observers wore the 180° rotation goggles and viewed the targets with their heads bent forward, while the other 15 observers wore the hollow goggles and viewed the targets while lying on their bellies. The results showed a low degree of size constancy and a compressed scale for distance. It is therefore suggested that perceived size and perceived distance are affected by an inversion of body orientation, not of retinal-image orientation. When path analysis and partial correlation analysis were applied to the whole data set, perceived size was found to be independent of perceived distance. These results supported the direct perception model rather than the apparent distance model.
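To make the final analysis step concrete, here is a minimal sketch of the kind of partial correlation check described above: correlating judged size with judged distance while partialling out physical distance. The variable names and synthetic data are illustrative assumptions, not the study's actual data or its exact analysis.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out the control variable z."""
    Z = np.column_stack([np.ones(len(z)), z])          # add intercept
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residuals of x given z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residuals of y given z
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic judgments: a partial correlation near zero would indicate that
# perceived size is independent of perceived distance given physical distance.
rng = np.random.default_rng(0)
phys_dist = rng.uniform(2.5, 45, size=90)             # viewing distances (m)
judged_dist = 0.8 * phys_dist + rng.normal(0, 2, 90)  # compressed distance scale
judged_size = 100 + rng.normal(0, 5, 90)              # size judgments (cm)
print(partial_corr(judged_size, judged_dist, phys_dist))
```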
Psychometrika | 2015
Nickolay T. Trendafilov; Kohei Adachi
Component loadings are interpreted by considering their magnitudes, which indicate how strongly each of the original variables relates to the corresponding principal component. The usual ad hoc practice in the interpretation process is to ignore variables with small absolute loadings or to set to zero all loadings smaller than some threshold value. This, in fact, makes the component loadings sparse in an artificial and subjective way. We propose a new alternative approach that produces sparse loadings in an optimal way. The introduced approach is illustrated on two well-known data sets and compared to existing rotation methods.
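As a point of reference for the ad hoc practice criticized above, the following sketch computes principal component loadings via the singular value decomposition and then simply zeroes entries below a cutoff. The data and threshold value are illustrative assumptions; the paper's contribution is to replace this subjective step with an optimal sparsification.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))
X -= X.mean(axis=0)                      # column-center the data

U, s, Vt = np.linalg.svd(X, full_matrices=False)
loadings = Vt.T[:, :2] * s[:2] / np.sqrt(len(X) - 1)  # loadings of first 2 PCs

threshold = 0.3                          # illustrative cutoff
sparse_ad_hoc = np.where(np.abs(loadings) < threshold, 0.0, loadings)
print(sparse_ad_hoc)                     # "artificially" sparse loadings
```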
Psychometrika | 2013
Kohei Adachi
Rubin and Thayer (Psychometrika, 47:69–76, 1982) proposed the EM algorithm for exploratory and confirmatory maximum likelihood factor analysis. In this paper, we prove the following fact: the EM algorithm always gives a proper solution with positive unique variances and factor correlations with absolute values that do not exceed one, when the covariance matrix to be analyzed and the initial matrices including unique variances and inter-factor correlations are positive definite. We further numerically demonstrate that the EM algorithm yields proper solutions for the data which lead the prevailing gradient algorithms for factor analysis to produce improper solutions. The numerical studies also show that, in real computations with limited numerical precision, Rubin and Thayer’s (Psychometrika, 47:69–76, 1982) original formulas for confirmatory factor analysis can make factor correlation matrices asymmetric, so that the EM algorithm fails to converge. However, this problem can be overcome by using an EM algorithm in which the original formulas are replaced by those guaranteeing the symmetry of factor correlation matrices, or by formulas used to prove the above fact.
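For readers who want to see the algorithm's shape, here is a minimal NumPy sketch of one EM iteration for the exploratory maximum likelihood factor model (sample covariance S ≈ ΛΛ' + Ψ). It follows the standard E- and M-step formulas rather than Rubin and Thayer's exact confirmatory updates, and the data, dimensions, and starting values are assumptions for illustration.

```python
import numpy as np

def em_step(S, L, psi):
    """One EM update for the factor model S ~ L @ L.T + diag(psi)."""
    p, m = L.shape
    Sigma = L @ L.T + np.diag(psi)            # model-implied covariance
    beta = L.T @ np.linalg.inv(Sigma)         # regression of factors on data
    Delta = np.eye(m) - beta @ L              # conditional factor covariance
    Eff = beta @ S @ beta.T + Delta           # E-step: E[f f'] averaged over data
    L_new = S @ beta.T @ np.linalg.inv(Eff)   # M-step: new loadings
    psi_new = np.diag(S - L_new @ beta @ S)   # M-step: new unique variances
    return L_new, psi_new

# Illustrative run; a positive-definite start keeps the solution proper,
# as the paper proves.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
S = np.cov(X, rowvar=False)
L, psi = rng.normal(scale=0.1, size=(5, 2)), np.ones(5)
for _ in range(200):
    L, psi = em_step(S, L, psi)
```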
Archive | 2016
Kohei Adachi
This book enables readers who may not be familiar with matrices to understand a variety of multivariate analysis procedures in matrix forms. Another feature of the book is that it emphasizes what model underlies a procedure and what objective function is optimized for fitting the model to data. The author believes that the matrix-based learning of such models and objective functions is the fastest way to comprehend multivariate data analysis. The text is arranged so that readers can intuitively capture the purposes for which multivariate analysis procedures are utilized: plain explanations of the purposes with numerical examples precede mathematical descriptions in almost every chapter. This volume is appropriate for undergraduate students who already have studied introductory statistics. Graduate students and researchers who are not familiar with matrix-intensive formulations of multivariate data analysis will also find the book useful, as it is based on modern matrix formulations with a special emphasis on singular value decomposition among theorems in matrix algebra. The book begins with an explanation of fundamental matrix operations and the matrix expressions of elementary statistics, followed by the introduction of popular multivariate procedures with advancing levels of matrix algebra chapter by chapter. This organization of the book allows readers without knowledge of matrices to deepen their understanding of multivariate data analysis.
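As a small taste of the matrix-based formulation the book emphasizes, the sketch below expresses principal component analysis directly through the singular value decomposition. The data here are arbitrary and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 4))
X -= X.mean(axis=0)                 # center columns (an elementary statistics step)

# Thin SVD: X = U diag(s) V'.  The first m columns of U diag(s) are the
# principal component scores, and V holds the weight (direction) vectors.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
m = 2
scores = U[:, :m] * s[:m]
weights = Vt[:m].T
print(np.allclose(X @ weights, scores))   # True: scores = X V
```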
Psychometrika | 2018
Kohei Adachi; Nickolay T. Trendafilov
A new factor analysis (FA) procedure, which can be called matrix decomposition FA (MDFA), has recently been proposed. All FA model parameters (common and unique factors, loadings, and unique variances) are treated as fixed unknown matrices. The MDFA model then becomes a specific data matrix decomposition, and the MDFA parameters are found by minimizing the discrepancy between the data and the model. Several algorithms have been developed and some properties have been discussed in the literature (notably by Stegeman in Comput Stat Data Anal 99:189–203, 2016), but MDFA as a whole has not yet been studied fully. A number of new properties are discovered in this paper, and some existing ones are derived more explicitly. The properties provided concern the uniqueness of results; the covariances among common factors, unique factors, and residuals; and the assessment of the degree of indeterminacy of common and unique factor scores. The properties are illustrated using a real data example.
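To give a sense of how MDFA treats all parameters as fixed matrices, here is a minimal alternating least squares sketch of the decomposition X ≈ FA' + UΨ with [F U]'[F U] = nI. This follows the general MDFA idea rather than any one published algorithm; the data, dimensions, and iteration count are assumptions (it also assumes n ≥ p + m so the orthonormality constraint can be satisfied).

```python
import numpy as np

def mdfa_als(X, m, n_iter=100):
    """Fit X ~ F @ A.T + U @ diag(psi) with [F U]'[F U] = n*I by ALS."""
    n, p = X.shape
    A = np.linalg.svd(X, full_matrices=False)[2][:m].T  # loading start
    psi = np.ones(p)
    for _ in range(n_iter):
        # Factor-score step: orthogonal Procrustes problem for Z = [F U].
        B = np.hstack([A, np.diag(psi)])                # p x (m + p)
        P, _, Qt = np.linalg.svd(X @ B, full_matrices=False)
        Z = np.sqrt(n) * P @ Qt                         # Z'Z = n * I
        F, U = Z[:, :m], Z[:, m:]
        # Loading / unique-variance step (closed form given Z).
        A = X.T @ F / n
        psi = np.diag(X.T @ U) / n
    return F, U, A, psi

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 6))
X -= X.mean(axis=0)
F, U, A, psi = mdfa_als(X, m=2)
```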
Psychometrika | 2017
Nickolay T. Trendafilov; Sara Fontanella; Kohei Adachi
Sparse principal component analysis has been a very active research area over the last decade. It produces component loadings with many zero entries, which facilitates their interpretation and helps avoid redundant variables. Classic factor analysis is another popular dimension reduction technique that shares similar interpretation problems and could greatly benefit from sparse solutions. Unfortunately, very few works consider sparse versions of classic factor analysis. Our goal is to contribute further in this direction. We revisit the most popular procedures for exploratory factor analysis, maximum likelihood and least squares. Sparse factor loadings are obtained for them by, first, adopting a special reparameterization and, second, introducing additional ℓ1-norm penalties into the standard factor analysis problems. As a result, we propose sparse versions of the major factor analysis procedures. We illustrate the developed algorithms on well-known psychometric problems. Our sparse solutions are critically compared to ones obtained by other existing methods.
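To illustrate the effect of an ℓ1-norm penalty in this setting (not the authors' exact algorithm), the sketch below shows the loading update when the factor scores are fixed and columnwise orthonormal: minimizing ||X - FA'||² + λ||A||₁ then reduces to elementwise soft-thresholding of the unpenalized least squares loadings. The data and penalty level are illustrative assumptions.

```python
import numpy as np

def soft_threshold(B, t):
    """Elementwise soft-thresholding operator induced by an l1 penalty."""
    return np.sign(B) * np.maximum(np.abs(B) - t, 0.0)

rng = np.random.default_rng(5)
n, p, m, lam = 100, 6, 2, 20.0
X = rng.normal(size=(n, p))
X -= X.mean(axis=0)

# Fixed factor scores with F'F = n * I (here: scaled left singular vectors).
F = np.sqrt(n) * np.linalg.svd(X, full_matrices=False)[0][:, :m]

C = X.T @ F                                  # unpenalized cross-products
A_dense = C / n                              # least squares loadings
A_sparse = soft_threshold(C, lam / 2) / n    # l1-penalized loadings
print(A_sparse)                              # many entries shrunk to exact zero
```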
Computational Statistics & Data Analysis | 2016
Hiroki Ikemoto; Kohei Adachi
Advances in Latent Variables - Methods, Models and Applications | 2014
Kohei Adachi; Nickolay T. Trendafilov
Psychometrika | 2013
Hironori Satomura; Kohei Adachi
The Annual Meeting of the Psychometric Society | 2017
Naoto Yamashita; Kohei Adachi