Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Mohsen Pourahmadi is active.

Publication


Featured research published by Mohsen Pourahmadi.


TEST | 2001

Parametric modelling of growth curve data: An overview

Dale L. Zimmerman; Vicente Núñez-Antón; Timothy G. Gregoire; Oliver Schabenberger; Jeffrey D. Hart; Michael G. Kenward; Geert Molenberghs; Geert Verbeke; Mohsen Pourahmadi; Philippe Vieu

In the past two decades a parametric multivariate regression modelling approach for analyzing growth curve data has achieved prominence. The approach, which has several advantages over classical analysis-of-variance and general multivariate approaches, consists of postulating, fitting, evaluating, and comparing parametric models for the data's mean structure and covariance structure. This article provides an overview of the approach, using unified terminology and notation. Well-established models and some developed more recently are described, with emphasis given to those models that allow for nonstationarity and for measurement times that differ across subjects and are unequally spaced. Graphical diagnostics that can assist with model postulation and evaluation are discussed, as are more formal methods for fitting and comparing models. Three examples serve to illustrate the methodology and to reveal the relative strengths and weaknesses of the various parametric models.
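The two-part recipe described in the abstract can be sketched in a few lines of numpy. This is an illustrative toy (not the article's own code): we postulate a quadratic mean curve and an AR(1) covariance for repeated measures, simulate data, and fit the mean by generalized least squares given the covariance; all parameter values here are arbitrary assumptions.

```python
# Toy sketch of parametric growth-curve modelling: a parametric mean
# structure (quadratic in time) plus a parametric covariance structure
# (AR(1)), with the mean fitted by generalized least squares (GLS).
import numpy as np

rng = np.random.default_rng(0)
times = np.arange(5.0)                      # common measurement times
n_subjects = 50

# Assumed "true" mean: quadratic in time; "true" covariance: AR(1)
beta_true = np.array([1.0, 0.5, -0.05])
X = np.vander(times, 3, increasing=True)    # design columns: 1, t, t^2
rho, sigma2 = 0.6, 1.0
Sigma = sigma2 * rho ** np.abs(np.subtract.outer(times, times))

# Simulate n_subjects growth curves with this mean and covariance
L = np.linalg.cholesky(Sigma)
Y = X @ beta_true + (L @ rng.standard_normal((5, n_subjects))).T

# GLS estimate of the mean parameters, taking the covariance as known
Sinv = np.linalg.inv(Sigma)
ybar = Y.mean(axis=0)
beta_hat = np.linalg.solve(X.T @ Sinv @ X, X.T @ Sinv @ ybar)
print(beta_hat)  # should be close to beta_true for moderate n_subjects
```

In practice the covariance parameters (here rho, sigma2) are unknown and estimated jointly with the mean, which is where the postulate/fit/evaluate/compare cycle of the article comes in.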


Statistical Science | 2011

Covariance Estimation: The GLM and Regularization Perspectives

Mohsen Pourahmadi

Finding an unconstrained and statistically interpretable reparameterization of a covariance matrix is still an open problem in statistics. Its solution is of central importance in covariance estimation, particularly in the recent high-dimensional data environment where enforcing the positive-definiteness constraint could be computationally expensive. We provide a survey of the progress made in modeling covariance matrices from two relatively complementary perspectives: (1) generalized linear models (GLM) or parsimony and use of covariates in low dimensions, and (2) regularization or sparsity for high-dimensional data. An emerging, unifying and powerful trend in both perspectives is that of reducing a covariance estimation problem to that of estimating a sequence of regression problems. We point out several instances of the regression-based formulation. A notable case is in sparse estimation of a precision matrix or a Gaussian graphical model leading to the fast graphical LASSO algorithm. Some advantages and limitations of the regression-based Cholesky decomposition relative to the classical spectral (eigenvalue) and variance-correlation decompositions are highlighted. The former provides an unconstrained and statistically interpretable reparameterization, and guarantees the positive-definiteness of the estimated covariance matrix. It reduces the unintuitive task of covariance estimation to that of modeling a sequence of regressions at the cost of imposing an a priori order among the variables. Elementwise regularization of the sample covariance matrix such as banding, tapering and thresholding has desirable asymptotic properties and the sparse estimated covariance matrix is positive definite with probability tending to one for large samples and dimensions.
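The regression-based Cholesky idea in the abstract can be made concrete with a short numpy sketch (our construction, not the paper's code): for a covariance matrix Sigma, the modified Cholesky decomposition T Sigma T' = D is obtained by regressing each variable on its predecessors in a fixed order; T is unit lower triangular with the negated regression coefficients, and D holds the innovation variances.

```python
# Modified Cholesky decomposition via a sequence of regressions:
# T @ Sigma @ T.T = D, with T unit lower triangular and D diagonal.
# Any real coefficients in T and positive entries in D give back a
# positive-definite Sigma, so the reparameterization is unconstrained.
import numpy as np

def modified_cholesky(Sigma):
    p = Sigma.shape[0]
    T = np.eye(p)
    d = np.empty(p)
    d[0] = Sigma[0, 0]
    for j in range(1, p):
        S11 = Sigma[:j, :j]                  # predecessors' covariance
        s12 = Sigma[:j, j]
        phi = np.linalg.solve(S11, s12)      # regression coefficients
        T[j, :j] = -phi
        d[j] = Sigma[j, j] - s12 @ phi       # innovation variance
    return T, np.diag(d)

Sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.6],
                  [0.3, 0.6, 1.0]])
T, D = modified_cholesky(Sigma)

# Reconstruction: Sigma = T^{-1} D T^{-T}
Tinv = np.linalg.inv(T)
print(np.allclose(Tinv @ D @ Tinv.T, Sigma))  # True
```

Sparsity or shrinkage can then be imposed on the regression coefficients in T, which is the regularization perspective the survey connects to the GLM perspective; the cost, as the abstract notes, is the a priori ordering of the variables.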


Journal of Multivariate Analysis | 2009

Modeling covariance matrices via partial autocorrelations

Michael J. Daniels; Mohsen Pourahmadi

We study the role of partial autocorrelations in the reparameterization and parsimonious modeling of a covariance matrix. The work is motivated by and tries to mimic the phenomenal success of the partial autocorrelation function (PACF) in model formulation, in removing the positive-definiteness constraint on the autocorrelation function of a stationary time series, and in reparameterizing the stationarity-invertibility domain of ARMA models. It turns out that once an order is fixed among the variables of a general random vector, the above properties continue to hold and follow from establishing a one-to-one correspondence between a correlation matrix and its associated matrix of partial autocorrelations. Connections between the latter and the parameters of the modified Cholesky decomposition of a covariance matrix are discussed. Graphical tools similar to partial correlograms for model formulation and various priors based on the partial autocorrelations are proposed. We develop frequentist/Bayesian procedures for modelling correlation matrices, illustrate them using a real dataset, and explore their properties via simulations.
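The unconstrained nature of this reparameterization can be illustrated for p = 3 with a small numpy sketch (our illustration, not the paper's code): given the two adjacent correlations and the partial autocorrelation between variables 1 and 3 given variable 2, each free in (-1, 1), the implied correlation matrix is automatically positive definite.

```python
# One direction of the one-to-one map for p = 3: from the adjacent
# correlations (r12, r23) and the partial autocorrelation pi13 of
# variables 1 and 3 given variable 2, recover the full correlation
# matrix. Any values in (-1, 1) yield a positive-definite matrix.
import numpy as np

def corr_from_partials(r12, r23, pi13):
    # Partial correlation identity solved for the marginal correlation:
    # pi13 = (r13 - r12*r23) / sqrt((1 - r12^2) * (1 - r23^2))
    r13 = r12 * r23 + pi13 * np.sqrt((1 - r12**2) * (1 - r23**2))
    return np.array([[1.0, r12, r13],
                     [r12, 1.0, r23],
                     [r13, r23, 1.0]])

R = corr_from_partials(0.9, 0.9, -0.5)
print(np.all(np.linalg.eigvalsh(R) > 0))  # positive definite: True
```

Note that setting the marginal correlation r13 = -0.5 directly with r12 = r23 = 0.9 would violate positive definiteness; parameterizing by the partial autocorrelation removes that constraint, which is exactly the property the paper exploits in higher dimensions.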


SIAM Journal on Applied Mathematics | 1984

On Minimality and Interpolation of Harmonizable Stable Processes

Mohsen Pourahmadi

It is shown that a harmonizable symmetric $\alpha$-stable process, i.e. the Fourier coefficients of a process with independent symmetric $\alpha$-stable increments, with spectral density $w$ is minimal if and only if $w > 0$ a.e. and …


Probability Theory and Related Fields | 1993

Baxter's inequality and convergence of finite predictors of multivariate stochastic processes

R. Cheng; Mohsen Pourahmadi


Probability Theory and Related Fields | 1988

Wold decomposition, prediction and parameterization of stationary processes with infinite variance

A. G. Miamee; Mohsen Pourahmadi


Journal of Multivariate Analysis | 1985

A matricial extension of the Helson-Szegö theorem and its application in multivariate prediction

Mohsen Pourahmadi


Journal of Time Series Analysis | 2000

Prediction Variance and Information Worth of Observations in Time Series

Mohsen Pourahmadi; E. S. Soofi


Journal of Multivariate Analysis | 1989

On the convergence of finite linear predictors of stationary processes

Mohsen Pourahmadi


Communications in Statistics - Theory and Methods | 2007

Skew-Normal ARMA Models with Nonlinear Heteroscedastic Predictors

Mohsen Pourahmadi

Collaboration


Dive into Mohsen Pourahmadi's collaborations.

Top Co-Authors

Michael J. Daniels

University of Texas at Austin
