Featured Research

Econometrics

A Design-Based Perspective on Synthetic Control Methods

Since their introduction in Abadie and Gardeazabal (2003), Synthetic Control (SC) methods have quickly become one of the leading methods for estimating causal effects in observational studies with panel data. Formal discussions often motivate SC methods by the assumption that the potential outcomes were generated by a factor model. Here we study SC methods from a design-based perspective, assuming a model for the selection of the treated unit(s), e.g., random selection as guaranteed in a randomized experiment. We show that SC methods offer benefits even in settings with randomized assignment, and that the design perspective offers new insights into SC methods for observational data. A first insight is that the standard SC estimator is not unbiased under random assignment. We propose a simple modification of the SC estimator that guarantees unbiasedness in this setting and derive its exact, randomization-based, finite sample variance. We also propose an unbiased estimator for this variance. We show in settings with real data that under random assignment this Modified Unbiased Synthetic Control (MUSC) estimator can have a root mean-squared error (RMSE) that is substantially lower than that of the difference-in-means estimator. We show that such an improvement is weakly guaranteed if the treated period is similar to the other periods, for example, if the treated period was randomly selected. The improvement is most likely to be substantial if the number of pre-treatment periods is large relative to the number of control units.
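
As a rough illustration of the weighting step that synthetic control estimators share, the sketch below computes standard SC weights by constrained least squares on pre-treatment outcomes. It is not the paper's MUSC estimator, and the function name and interface are ours; the simplex constraint on the weights follows the original SC formulation.

import numpy as np
from scipy.optimize import minimize

def sc_weights(Y0_pre, y1_pre):
    # Y0_pre: (T0, J) pre-treatment outcomes of the J control units
    # y1_pre: (T0,) pre-treatment outcomes of the treated unit
    J = Y0_pre.shape[1]
    loss = lambda w: np.sum((y1_pre - Y0_pre @ w) ** 2)
    res = minimize(loss, np.full(J, 1.0 / J),
                   bounds=[(0.0, 1.0)] * J,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
                   method="SLSQP")
    return res.x

# The standard SC estimate then contrasts the treated unit's post-treatment
# outcome with the weighted combination of controls:
# tau_hat = y1_post - Y0_post @ sc_weights(Y0_pre, y1_pre)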

Econometrics

A Distance Covariance-based Estimator

Weak instruments present a major setback to empirical work. This paper introduces an estimator that admits weak, uncorrelated, or mean-independent instruments that are non-independent of endogenous covariates. Relative to conventional instrumental variable methods, the proposed estimator weakens the relevance condition considerably without imposing a stronger exclusion restriction. Identification mainly rests on (1) a weak conditional median exclusion restriction imposed on pairwise differences in disturbances and (2) non-independence between covariates and instruments. Under mild conditions, the estimator is consistent and asymptotically normal. Monte Carlo experiments showcase an excellent performance of the estimator, and two empirical examples illustrate its practical utility.
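
The relevance condition here rests on dependence rather than correlation, which the sample distance covariance can detect. Below is a minimal sketch of that statistic (the simple V-statistic version of Székely, Rizzo, and Bakirov), not the paper's estimator itself; all variable names are ours.

import numpy as np
from scipy.spatial.distance import pdist, squareform

def dist_cov(x, y):
    # double-centered pairwise distance matrices
    A = squareform(pdist(x[:, None]))
    B = squareform(pdist(y[:, None]))
    A = A - A.mean(axis=0) - A.mean(axis=1)[:, None] + A.mean()
    B = B - B.mean(axis=0) - B.mean(axis=1)[:, None] + B.mean()
    return np.sqrt(np.maximum((A * B).mean(), 0.0))

rng = np.random.default_rng(0)
z = rng.normal(size=1000)            # instrument
x = z ** 2 + rng.normal(size=1000)   # uncorrelated with z, but dependent on it
print(np.corrcoef(z, x)[0, 1])       # near zero: "weak" by the usual metric
print(dist_cov(z, x))                # clearly positive: exploitable dependence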

Econometrics

A Doubly Corrected Robust Variance Estimator for Linear GMM

We propose a new finite-sample-corrected variance estimator for linear generalized method of moments (GMM) estimation, covering the one-step, two-step, and iterated estimators. On top of the widely used finite-sample correction of Windmeijer (2005), which accounts for the bias from estimating the efficient weight matrix, our formula additionally corrects for the over-identification bias in variance estimation, and is therefore doubly corrected. An important feature of the proposed double correction is that it automatically provides robustness to misspecification of the moment conditions. In contrast, the conventional variance estimator and the Windmeijer correction are inconsistent under misspecification. The double correction thus offers a convenient way to obtain improved inference under correct specification and robustness against misspecification at the same time.
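
For context, here is a minimal sketch of linear two-step GMM together with the conventional variance and a naive misspecification-robust sandwich variance, i.e., the baseline objects the paper improves on. It is not the proposed doubly corrected formula (in particular, it includes neither the Windmeijer correction nor the over-identification correction), and the interface is ours.

import numpy as np

def two_step_gmm(y, X, Z):
    # y: (n,) outcome, X: (n, p) regressors, Z: (n, k) instruments, k >= p
    n = len(y)
    W1 = np.linalg.inv(Z.T @ Z / n)                    # first-step weight matrix
    A1 = X.T @ Z @ W1
    b1 = np.linalg.solve(A1 @ Z.T @ X, A1 @ Z.T @ y)   # first step (2SLS)
    g1 = Z * (y - X @ b1)[:, None]                     # moment contributions
    W2 = np.linalg.inv(g1.T @ g1 / n)                  # estimated efficient weight
    A2 = X.T @ Z @ W2
    b2 = np.linalg.solve(A2 @ Z.T @ X, A2 @ Z.T @ y)   # two-step estimator
    G = Z.T @ X / n
    bread = np.linalg.inv(G.T @ W2 @ G)
    V_conv = bread / n                                 # conventional variance
    g2 = Z * (y - X @ b2)[:, None]
    Omega2 = g2.T @ g2 / n
    V_sand = bread @ G.T @ W2 @ Omega2 @ W2 @ G @ bread / n  # sandwich variance
    return b2, V_conv, V_sand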

Econometrics

A Flexible Stochastic Conditional Duration Model

We introduce a new stochastic duration model for transaction times in asset markets. We argue that widely accepted rules for aggregating seemingly related trades mislead inference pertaining to durations between unrelated trades: while any two trades executed in the same second are probably related, it is extremely unlikely that all such pairs of trades are, in a typical sample. By placing uncertainty about which trades are related within our model, we improve inference for the distribution of durations between unrelated trades, especially near zero. We introduce a normalized conditional distribution for durations between unrelated trades that is both flexible and amenable to shrinkage towards an exponential distribution, which we argue is an appropriate first-order model. Thanks to highly efficient draws of state variables, numerical efficiency of posterior simulation is much higher than in previous studies. In an empirical application, we find that the conditional hazard function for durations between unrelated trades varies much less than what most studies find. We claim that this is because we avoid statistical artifacts that arise from deterministic trade-aggregation rules and unsuitable parametric distributions.
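
The deterministic aggregation rule the abstract criticizes is easy to state in code: collapse all trades sharing the same timestamp (to the second) into one and compute durations from what remains. The sketch below shows only that rule; the paper instead treats which trades are related as uncertain within the model.

import numpy as np

def durations_after_deterministic_aggregation(times_in_seconds):
    # collapse every trade with the same integer-second stamp into one "trade"
    aggregated = np.unique(np.floor(times_in_seconds))
    return np.diff(aggregated)

# This forces every same-second pair to count as related, which distorts the
# estimated distribution of durations between unrelated trades near zero.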

Econometrics

A Framework for Eliciting, Incorporating, and Disciplining Identification Beliefs in Linear Models

To estimate causal effects from observational data, an applied researcher must impose beliefs. The instrumental variables exclusion restriction, for example, represents the belief that the instrument has no direct effect on the outcome of interest. Yet beliefs about instrument validity do not exist in isolation. Applied researchers often discuss the likely direction of selection and the potential for measurement error in their articles but lack formal tools for incorporating this information into their analyses. Failing to use all relevant information not only leaves money on the table; it runs the risk of leading to a contradiction in which one holds mutually incompatible beliefs about the problem at hand. To address these issues, we first characterize the joint restrictions relating instrument invalidity, treatment endogeneity, and non-differential measurement error in a workhorse linear model, showing how beliefs over these three dimensions are mutually constrained by each other and the data. Using this information, we propose a Bayesian framework to help researchers elicit their beliefs, incorporate them into estimation, and ensure their mutual coherence. We conclude by illustrating our framework in a number of examples drawn from the empirical microeconomics literature.
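
One slice of these joint restrictions is already visible in the simplest linear model y = beta*x + u with instrument z and no measurement error: the identity Cov(z, y) = beta*Cov(z, x) + Cov(z, u) means that a belief about the instrument's invalidity, Cov(z, u), pins down beta given the data. The sketch below illustrates that mapping only; it is not the paper's full three-dimensional framework, and the function name is ours.

import numpy as np

def beta_given_invalidity_belief(z, x, y, cov_zu_belief):
    # Cov(z, y) = beta * Cov(z, x) + Cov(z, u)  =>  solve for beta
    cov_zy = np.cov(z, y)[0, 1]
    cov_zx = np.cov(z, x)[0, 1]
    return (cov_zy - cov_zu_belief) / cov_zx

# cov_zu_belief = 0 recovers the usual IV estimand; varying it traces out how
# conclusions move as the exclusion restriction is relaxed.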

Econometrics

A Generalized Focused Information Criterion for GMM

This paper proposes a criterion for simultaneous GMM model and moment selection: the generalized focused information criterion (GFIC). Rather than attempting to identify the "true" specification, the GFIC chooses from a set of potentially mis-specified moment conditions and parameter restrictions to minimize the mean-squared error (MSE) of a user-specified target parameter. The intent of the GFIC is to formalize a situation common in applied practice. An applied researcher begins with a set of fairly weak "baseline" assumptions, assumed to be correct, and must decide whether to impose any of a number of stronger, more controversial "suspect" assumptions that yield parameter restrictions, additional moment conditions, or both. Provided that the baseline assumptions identify the model, we show how to construct an asymptotically unbiased estimator of the asymptotic MSE to select over these suspect assumptions: the GFIC. We go on to provide results for post-selection inference and model averaging that can be applied both to the GFIC and various alternative selection criteria. To illustrate how our criterion can be used in practice, we specialize the GFIC to the problem of selecting over exogeneity assumptions and lag lengths in a dynamic panel model, and show that it performs well in simulations. We conclude by applying the GFIC to a dynamic panel data model for the price elasticity of cigarette demand.
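
In spirit, the selection step compares candidate specifications by their estimated asymptotic MSE for the user-specified target parameter, as in the toy sketch below. The actual GFIC constructs an asymptotically unbiased estimate of the squared bias, which this plug-in version omits; all names here are ours.

import numpy as np

def select_by_estimated_amse(tau_hats, var_hats, bias_hats):
    # estimated AMSE for each candidate: squared bias plus variance
    amse = np.asarray(bias_hats) ** 2 + np.asarray(var_hats)
    k = int(np.argmin(amse))
    return k, tau_hats[k]

# Candidates might be, e.g., a dynamic panel estimator with and without a
# suspect exogeneity assumption or additional lag-length restrictions.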

Econometrics

A Machine Learning Approach for Flagging Incomplete Bid-rigging Cartels

We propose a new method for flagging bid rigging, which is particularly useful for detecting incomplete bid-rigging cartels. Our approach combines screens, i.e., statistics derived from the distribution of bids in a tender, with machine learning to predict the probability of collusion. As a methodological innovation, we calculate such screens for all possible subgroups of three or four bids within a tender and use summary statistics like the mean, median, maximum, and minimum of each screen as predictors in the machine learning algorithm. This approach tackles the issue that competitive bids in incomplete cartels distort the statistical signals produced by bid rigging. We demonstrate that our algorithm outperforms previously suggested methods in applications to incomplete cartels based on empirical data from Switzerland.
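
A minimal sketch of the subgroup-screen construction: compute a screen over every subgroup of three or four bids in a tender and summarize it with the four statistics named above. The coefficient of variation used here is one standard bid-rigging screen; the paper's full set of screens and its classifier may differ.

import numpy as np
from itertools import combinations

def cv(bids):
    # coefficient of variation: collusive tenders tend to show low bid dispersion
    return np.std(bids) / np.mean(bids)

def subgroup_screen_features(bids, sizes=(3, 4)):
    vals = np.array([cv(np.array(sub))
                     for r in sizes
                     for sub in combinations(bids, r)])
    return {"mean": vals.mean(), "median": np.median(vals),
            "max": vals.max(), "min": vals.min()}

# One feature row per tender; stacked over many tenders, these rows form the
# input to a supervised classifier that predicts the probability of collusion.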

Econometrics

A Model of the Fed's View on Inflation

We develop a medium-size semi-structural time series model of inflation dynamics that is consistent with the view, often expressed by central banks, that three components are important: a trend anchored by long-run expectations, a Phillips curve, and temporary fluctuations in energy prices. We find that a stable long-term inflation trend and a well-identified, steep Phillips curve are consistent with the data, but they imply that potential output has been declining since the new millennium and that energy prices affect headline inflation not only via the Phillips curve but also via an independent expectational channel. A high-frequency energy-price cycle can be related to global factors affecting the commodity market, and it often overpowers the Phillips curve, thereby explaining the inflation puzzles of the last ten years.
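
A stylized simulation of the three-component view (not the paper's estimated semi-structural model) can make the decomposition concrete: a slow-moving anchored trend, a Phillips-curve term in economic slack, and a high-frequency energy cycle with its own pass-through. All parameter values below are illustrative.

import numpy as np

rng = np.random.default_rng(0)
T = 240
trend = 2.0 + np.cumsum(rng.normal(0.0, 0.02, T))    # anchored long-run trend
gap = np.zeros(T)
energy = np.zeros(T)
for t in range(1, T):
    gap[t] = 0.9 * gap[t - 1] + rng.normal(0.0, 0.5)        # persistent slack
    energy[t] = 0.5 * energy[t - 1] + rng.normal(0.0, 1.0)  # fast energy cycle

kappa, gamma = 0.3, 0.2   # Phillips-curve slope and direct energy pass-through
headline = trend + kappa * gap + gamma * energy + rng.normal(0.0, 0.2, T)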

Econometrics

A More Robust t-Test

Standard inference about a scalar parameter estimated via GMM amounts to applying a t-test to a particular set of observations. If the number of observations is not very large, then moderately heavy tails can lead to poor behavior of the t-test. This is a particular problem under clustering, since the number of observations then corresponds to the number of clusters, and heterogeneity in cluster sizes induces a form of heavy tails. This paper combines extreme value theory for the smallest and largest observations with a normal approximation for the average of the remaining observations to construct a more robust alternative to the t-test. The new test is found to control size much more successfully in small samples than existing methods. Analytical results in the canonical inference-for-the-mean problem demonstrate that the new test provides a refinement over the full-sample t-test when more than two but fewer than three moments exist, while the bootstrapped t-test does not.
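
The construction splits the sample as in the sketch below: the k smallest and k largest observations are handled with extreme value theory, and the trimmed middle with a normal approximation. Critical values for the recombined statistic come from the paper's results, not from this sketch; the function name and the choice of k are ours.

import numpy as np

def split_for_robust_t(x, k=2):
    xs = np.sort(np.asarray(x))
    return xs[:k], xs[k:-k], xs[-k:]   # EVT tails, approximately normal middle

data = np.random.default_rng(0).standard_normal(50)
lo_tail, middle, hi_tail = split_for_robust_t(data)
t_mid = middle.mean() / (middle.std(ddof=1) / np.sqrt(len(middle)))
# The final test recombines t_mid with the tail observations using extreme
# value theory, rather than simply discarding them as trimming would.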

Econometrics

A Multivariate Realized GARCH Model

We propose a novel class of multivariate GARCH models that utilize realized measures of volatilities and correlations. The central component is an unconstrained vector parametrization of the correlation matrix that facilitates modeling of the correlation structure. The parametrization is based on the matrix logarithmic transformation, which makes positive definiteness an innate property. A factor approach offers a way to impose a parsimonious structure in high-dimensional systems, and we show that a factor framework arises naturally in some existing models. We apply the model to returns of nine assets and employ the factor structure that emerges from a block correlation specification. An auxiliary empirical finding is that the empirical distribution of parametrized realized correlations is approximately Gaussian. This observation is analogous to the well-known result for logarithmically transformed realized variances.
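
The key property is easy to verify in code: the matrix logarithm of a correlation matrix is symmetric, and exponentiating any symmetric matrix yields a positive definite matrix, so positive definiteness never has to be imposed as a constraint. The sketch below shows the forward transform; recovering a proper correlation matrix (unit diagonal) from the off-diagonal parameters additionally requires solving for the diagonal of the log matrix, which is the nontrivial step the parametrization handles.

import numpy as np

C = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])

w, V = np.linalg.eigh(C)             # eigendecomposition of the correlation matrix
L = (V * np.log(w)) @ V.T            # symmetric matrix logarithm of C
gamma = L[np.triu_indices(3, k=1)]   # unconstrained off-diagonal parameter vector

# exponentiating any symmetric matrix gives a positive definite matrix:
S = np.zeros((3, 3))
S[np.triu_indices(3, k=1)] = gamma
S = S + S.T                          # an arbitrary symmetric candidate
ws, Vs = np.linalg.eigh(S)
M = (Vs * np.exp(ws)) @ Vs.T         # matrix exponential of S
assert np.all(np.linalg.eigvalsh(M) > 0)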
