Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Serena Ng is active.

Publication


Featured research published by Serena Ng.


Econometrica | 2001

Lag Length Selection and the Construction of Unit Root Tests with Good Size and Power

Serena Ng; Pierre Perron

It is widely known that when there are negative moving average errors, a high order augmented autoregression is necessary for unit root tests to have good size, but that information criteria such as the AIC and the BIC tend to select a truncation lag that is very small. Furthermore, size distortions increase with the number of deterministic terms in the regression. We trace these problems to the fact that information criteria omit important biases induced by a low order augmented autoregression. We consider a class of Modified Information Criteria (MIC) which account for the fact that the bias in the sum of the autoregressive coefficients is highly dependent on the lag order k. Using a local asymptotic framework in which the root of an MA(1) process is local to -1, we show that the MIC allows for added dependence between k and the number of deterministic terms in the regression. Most importantly, the k selected by the recommended MAIC is such that both its level and rate of increase with the sample size are desirable for unit root tests in the local asymptotic framework, whereas the AIC, the MBIC, and especially the BIC are less attractive in at least one dimension. In Monte Carlo experiments, the MAIC is found to yield huge size improvements to the DF(GLS) and the feasible point optimal P(t) tests developed in Elliott, Rothenberg, and Stock (1996). We also extend the M tests developed in Perron and Ng (1996) to allow for GLS detrending of the data. The M(GLS) tests are shown to have power functions that lie very close to the power envelope. In addition, we recommend using GLS detrended data to estimate the required autoregressive spectral density at frequency zero. This provides more efficient estimates and ensures that the estimate of the spectral density is invariant to the parameters of the deterministic trend function, a property not respected by the estimation procedure currently employed by several studies. The MAIC along with GLS detrended data yields a set of Mbar(GLS) tests with desirable size and power properties.
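The MAIC selection rule described above can be sketched in a few lines. This is an illustrative reading, not the paper's implementation: it uses simple demeaning rather than GLS detrending, `maic_lag` is a hypothetical helper name, and the criterion is taken as MAIC(k) = ln(s²_k) + 2(tau_T(k) + k)/(T − k_max), with tau_T(k) = b₀² Σ y²_{t−1} / s²_k from the augmented Dickey-Fuller regression.

```python
import numpy as np

def maic_lag(y, kmax):
    """Select the ADF truncation lag k by a Modified AIC in the spirit of
    Ng-Perron (2001). Illustrative sketch on demeaned data; the paper
    recommends GLS-detrended data."""
    y = y - y.mean()                      # demean (crude stand-in for GLS detrending)
    dy = np.diff(y)
    T = len(y)
    n = T - 1 - kmax                      # common effective sample across all k
    best_k, best_crit = 0, np.inf
    for k in range(kmax + 1):
        # regressors: y_{t-1} and k lagged differences, aligned to the last n obs
        ylag = y[kmax : T - 1]
        X = np.column_stack([ylag] + [dy[kmax - j : len(dy) - j]
                                      for j in range(1, k + 1)])
        z = dy[kmax:]
        beta, *_ = np.linalg.lstsq(X, z, rcond=None)
        e = z - X @ beta
        s2 = e @ e / n
        tau = beta[0] ** 2 * (ylag @ ylag) / s2
        crit = np.log(s2) + 2.0 * (tau + k) / n
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k
```

All lag orders are fit on the same effective sample of T − 1 − k_max observations, so the criteria are comparable across k.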


Journal of the American Statistical Association | 1995

Unit Root Tests in ARMA Models with Data-Dependent Methods for the Selection of the Truncation Lag

Serena Ng; Pierre Perron

We analyze the choice of the truncation lag in the context of the Said-Dickey test for the presence of a unit root in a general autoregressive moving average model. It is shown that a deterministic relationship between the truncation lag and the sample size is dominated by data-dependent rules that take sample information into account. In particular, we study data-dependent rules that are not constrained to satisfy the lower bound condition imposed by Said-Dickey. Akaike's information criterion falls into this category. The analytical properties of the truncation lag selected according to a class of information criteria are compared to those based on sequential testing for the significance of coefficients on additional lags. The asymptotic properties of the unit root test under various methods for selecting the truncation lag are analyzed, and simulations are used to show their distinctive behavior in finite samples. Our results favor methods based on sequential tests over those based on information criteria.


The Review of Economic Studies | 1996

Useful Modifications to some Unit Root Tests with Dependent Errors and their Local Asymptotic Properties

Pierre Perron; Serena Ng

Many unit root tests have distorted sizes when the root of the error process is close to the unit circle. This paper analyzes the properties of the Phillips-Perron tests and some of their variants in this problematic parameter space. We use local asymptotic analyses to explain why the Phillips-Perron tests suffer from severe size distortions regardless of the choice of the spectral density estimator, and why the modified statistics show dramatic improvements in size when used in conjunction with a particular formulation of an autoregressive spectral density estimator. We explain why kernel based spectral density estimators aggravate the size problem in the Phillips-Perron tests and yield no size improvement to the modified statistics. The local asymptotic power of the modified statistics is also evaluated. These modified statistics are recommended as being useful in empirical work since they are free of the size problems which have plagued many unit root tests, and they retain respectable power.


Journal of Econometrics | 2006

Are More Data Always Better for Factor Analysis?

Jean Boivin; Serena Ng

Factors estimated from large macroeconomic panels are being used in an increasing number of applications. However, little is known about how the size and the composition of the data affect the factor estimates. In this paper, we ask whether using more series to extract the factors can make the resulting factors less useful for forecasting; the answer is yes. Such a problem tends to arise when the idiosyncratic errors are cross-correlated. It can also arise if forecasting power is provided by a factor that is dominant in a small dataset but is a dominated factor in a larger dataset. In a real time forecasting exercise, we find that factors extracted from as few as 40 pre-screened series often yield satisfactory or even better results than using all 147 series. Weighting the data by their properties when constructing the factors also leads to improved forecasts. Our simulation analysis is unique in that special attention is paid to cross-correlated idiosyncratic errors, and we also allow the factors to have stronger loadings on some groups of series than on others. It thus allows us to better understand the properties of the principal components estimator in empirical applications.


Journal of Business & Economic Statistics | 2007

Determining the Number of Primitive Shocks in Factor Models

Jushan Bai; Serena Ng

A widely held but untested assumption underlying macroeconomic analysis is that the number of shocks driving economic fluctuations, q, is small. In this article we associate q with the number of dynamic factors in a large panel of data. We propose a methodology to determine q without having to estimate the dynamic factors. We first estimate a VAR in r static factors, where the factors are obtained by applying the method of principal components to a large panel of data, and then compute the eigenvalues of the residual covariance or correlation matrix. We then test whether these eigenvalues satisfy an asymptotically shrinking bound that reflects sampling error. We apply the procedure to determine the number of primitive shocks in a large number of macroeconomic time series. An important aspect of the present analysis is to make precise the relationship between the dynamic factors and the static factors, which is a result of independent interest.
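The three steps of the procedure can be sketched as follows. This is a rough illustration under assumptions not in the abstract: `n_primitive_shocks` is a hypothetical name, and the fixed threshold on eigenvalue shares is a placeholder for the paper's asymptotically shrinking bound.

```python
import numpy as np

def n_primitive_shocks(X, r, thresh=0.1):
    """Sketch of the idea of counting dynamic factors q <= r.

    X is a T x N (standardized) panel. Steps: (i) estimate r static factors
    by principal components, (ii) fit a VAR(1) in the factors, (iii) count
    the non-negligible eigenvalues of the residual correlation matrix.
    """
    T, N = X.shape
    # (i) asymptotic principal components: sqrt(T) * top eigenvectors of XX'
    w, v = np.linalg.eigh(X @ X.T)          # eigh sorts eigenvalues ascending
    F = np.sqrt(T) * v[:, ::-1][:, :r]      # T x r, largest eigenvalues first
    # (ii) VAR(1) in the static factors
    A, *_ = np.linalg.lstsq(F[:-1], F[1:], rcond=None)
    U = F[1:] - F[:-1] @ A                  # (T-1) x r residuals
    # (iii) eigenvalues of the residual correlation matrix, largest first
    eig = np.linalg.eigvalsh(np.corrcoef(U, rowvar=False))[::-1]
    # count eigenvalues whose share of the total exceeds the threshold
    return int(np.sum(eig / eig.sum() > thresh))
```

The paper's actual test replaces the fixed `thresh` with a bound that shrinks with N and T at a rate reflecting the sampling error of the estimated factors.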


Journal of Business & Economic Statistics | 2005

Tests for Skewness, Kurtosis, and Normality for Time Series Data

Jushan Bai; Serena Ng

We present the sampling distributions for the coefficient of skewness, kurtosis, and a joint test of normality for time series observations. We show that when the data are serially correlated, consistent estimates of three-dimensional long-run covariance matrices are needed for testing symmetry or kurtosis. These tests can be used to make inference about any conjectured coefficients of skewness and kurtosis. In the special case of normality, a joint test for the skewness coefficient of 0 and a kurtosis coefficient of 3 can be obtained on construction of a four-dimensional long-run covariance matrix. The tests are developed for demeaned data, but the statistics have the same limiting distributions when applied to regression residuals. Monte Carlo simulations show that the test statistics for symmetry and normality have good finite-sample size and power. However, size distortions render testing for kurtosis almost meaningless except for distributions with thin tails, such as the normal distribution. Combining skewness and kurtosis is still a useful test of normality provided that the limiting variance accounts for the serial correlation in the data. The tests are applied to 21 macroeconomic time series.
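The i.i.d. special case of the joint test is the familiar Jarque-Bera statistic, which the paper generalizes by replacing the fixed asymptotic variances with long-run covariance estimates. A minimal sketch of the uncorrected version (the HAC correction for serial correlation is omitted, and `jarque_bera` is a hypothetical helper name):

```python
import numpy as np

def jarque_bera(x):
    """i.i.d. joint skewness/kurtosis test of normality.

    Under normality, sqrt(T)*skew -> N(0, 6) and sqrt(T)*(kurt - 3) -> N(0, 24),
    so jb is asymptotically chi-squared with 2 degrees of freedom. For serially
    correlated data these fixed variances must be replaced by long-run
    (HAC) covariances; that correction is omitted here.
    """
    x = np.asarray(x, dtype=float)
    T = len(x)
    z = x - x.mean()
    s = np.sqrt(z @ z / T)                  # ML standard deviation
    skew = np.mean(z ** 3) / s ** 3
    kurt = np.mean(z ** 4) / s ** 4
    jb = T * (skew ** 2 / 6.0 + (kurt - 3.0) ** 2 / 24.0)
    return skew, kurt, jb
```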


Foundations and Trends in Econometrics | 2008

Large Dimensional Factor Analysis

Jushan Bai; Serena Ng

Econometric analysis of large dimensional factor models has been a heavily researched topic in recent years. This review surveys the main theoretical results that relate to static factor models or dynamic factor models that can be cast in a static framework. Among the topics covered are how to determine the number of factors, how to conduct inference when estimated factors are used in regressions, how to assess the adequacy of observed variables as proxies for latent factors, how to exploit the estimated factors to test for unit roots and common trends, and how to estimate panel cointegration models. The fundamental result that justifies these analyses is that the method of asymptotic principal components consistently estimates the true factor space. We use simulations to better understand the conditions that can affect the precision of the factor estimates.
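The estimator underlying these results, the method of asymptotic principal components, is short enough to sketch. Under the common normalization F'F/T = I, the estimated factors are sqrt(T) times the eigenvectors of XX' for its r largest eigenvalues; `pc_factors` is a hypothetical helper name, and X is assumed standardized:

```python
import numpy as np

def pc_factors(X, r):
    """Asymptotic principal components for an approximate factor model.

    For a T x N panel X ~ F L' + e, the factor estimates are sqrt(T) times
    the eigenvectors of XX' associated with its r largest eigenvalues, and
    the loadings follow as L = X'F / T. Illustrative sketch.
    """
    T, N = X.shape
    w, v = np.linalg.eigh(X @ X.T)          # eigenvalues in ascending order
    F = np.sqrt(T) * v[:, ::-1][:, :r]      # T x r estimated factors, F'F/T = I
    L = X.T @ F / T                         # N x r estimated loadings
    return F, L
```

The estimated common component is then F @ L.T; the consistency result in the text says this spans the true factor space as both N and T grow large.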


Journal of Econometrics | 2000

Estimating the Rational Expectations Model of Speculative Storage: A Monte Carlo Comparison of Three Simulation Estimators

Alexander Michaelides; Serena Ng

The non-negativity constraint on inventories imposed on the rational expectations theory of speculative storage implies that the conditional mean and variance of commodity prices are non-linear in lagged prices and have a kink at a threshold point. In this paper, the structural parameters of this model are estimated using three simulation-based estimators. In a Monte Carlo experiment, the finite sample properties of the simulated method of moments estimator of Duffie and Singleton (1993, Econometrica 61 (4), 929–952), the indirect inference estimator of Gourieroux et al. (1993, Journal of Applied Econometrics 8, S85–S118), and the efficient method of moments estimator of Gallant and Tauchen (1996, Econometric Theory 12, 657–681) are assessed. Exploiting the invariant distribution implied by the theory allows us to evaluate the error induced by simulations. Our results show that the estimators differ in their sensitivity to the sample size, the number of simulations, the choice of auxiliary models, and computation demands. For some estimators, the test for overidentifying restrictions exhibits significant size distortions in small samples. Overall, while the simulation estimators have small bias, they are less efficient than pseudo-maximum likelihood (PMLE). Hence for the small sample sizes considered, the simulation estimators are still inferior to the PMLE estimates in a mean-squared sense.


Econometric Theory | 2010

Instrumental Variable Estimation In A Data Rich Environment

Jushan Bai; Serena Ng

We consider estimation of parameters in a regression model with endogenous regressors. The endogenous regressors, along with a large number of other endogenous variables, are driven by a small number of unobservable exogenous common factors. We show that the estimated common factors can be used as instrumental variables and that they are more efficient than the observed variables in our framework. Whereas the standard optimal generalized method of moments estimator using a large number of instruments is biased and can be inconsistent, the factor instrumental variable (FIV) estimator is shown to be consistent and asymptotically normal, even if the number of instruments exceeds the sample size. Furthermore, FIV remains consistent even if the observed variables are invalid instruments, as long as the unobserved common components are valid instruments. We also consider estimating panel data models in which all regressors are endogenous but share exogenous common factors. We show that valid instruments can be constructed from the endogenous regressors. Although single equation FIV requires no bias correction, the faster convergence rate of the panel estimator is such that a bias correction is necessary to obtain a zero-centered normal distribution.
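A minimal single-equation sketch of the FIV idea, under simplifying assumptions not in the abstract (one endogenous regressor, no constant, factors estimated by principal components and used directly as instruments in 2SLS; `fiv` is a hypothetical name):

```python
import numpy as np

def fiv(y, x, Z, r):
    """Factor instrumental variables sketch for y = x*beta + u, x endogenous.

    Z is a T x N panel of observed variables driven by a few exogenous common
    factors. Estimate r factors from Z by principal components and use them
    as instruments in 2SLS. Illustrative; standardization of Z is omitted.
    """
    T = len(y)
    w, v = np.linalg.eigh(Z @ Z.T)
    F = np.sqrt(T) * v[:, ::-1][:, :r]      # estimated factor instruments
    # 2SLS: project x on the factor space, then regress y on the fitted values
    xhat = F @ np.linalg.lstsq(F, x, rcond=None)[0]
    return (xhat @ y) / (xhat @ x)
```

Because the factor estimation error is unrelated to the structural error, the estimated factors act like clean instruments even when the individual observed series in Z would be invalid instruments themselves.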


National Bureau of Economic Research | 2009

A Factor Analysis of Bond Risk Premia

Sydney C. Ludvigson; Serena Ng

This paper uses the factor augmented regression framework to analyze the relation between bond excess returns and the macroeconomy. Using a panel of 131 monthly macroeconomic time series for the sample 1964:1-2007:12, we estimate 8 static factors by the method of asymptotic principal components. We also use Gibbs sampling to estimate dynamic factors from the 131 series reorganized into 8 blocks. Regardless of how the factors are estimated, macroeconomic factors are found to have statistically significant predictive power for excess bond returns. We show how a bias correction to the parameter estimates of factor augmented regressions can be obtained. This bias is numerically trivial in our application. The predictive power of real activity for excess bond returns is robust even after accounting for finite sample inference problems. Forecasts of excess bond returns (or bond risk premia) are countercyclical. This implies that investors are compensated for risks associated with recessions.

Collaboration


Dive into Serena Ng's collaborations.

Top Co-Authors

Ivana Komunjer

University of California

Jean Boivin

National Bureau of Economic Research

Nikolay Gospodinov

Federal Reserve Bank of Atlanta

Yuriy Gorodnichenko

National Bureau of Economic Research
