Publication


Featured research published by Eric M. Aldrich.


Journal of the American Statistical Association | 2006

Calibrated Probabilistic Forecasting at the Stateline Wind Energy Center: The Regime-Switching Space–Time Method

Tilmann Gneiting; Kristin Larson; Kenneth Westrick; Marc G. Genton; Eric M. Aldrich

With the global proliferation of wind power, the need for accurate short-term forecasts of wind resources at wind energy sites is becoming paramount. Regime-switching space–time (RST) models merge meteorological and statistical expertise to obtain accurate and calibrated, fully probabilistic forecasts of wind speed and wind power. The model formulation is parsimonious, yet takes into account all of the salient features of wind speed: alternating atmospheric regimes, temporal and spatial correlation, diurnal and seasonal nonstationarity, conditional heteroscedasticity, and non-Gaussianity. The RST method identifies forecast regimes at a wind energy site and fits a conditional predictive model for each regime. Geographically dispersed meteorological observations in the vicinity of the wind farm are used as off-site predictors. The RST technique was applied to 2-hour-ahead forecasts of hourly average wind speed near the Stateline wind energy center in the U. S. Pacific Northwest. The RST point forecasts and distributional forecasts were accurate, calibrated, and sharp, and they compared favorably with predictions based on state-of-the-art time series techniques. This suggests that quality meteorological data from sites upwind of wind farms can be efficiently used to improve short-term forecasts of wind resources.
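
A minimal sketch of the regime-switching idea, with a made-up regime rule, placeholder coefficients, and two hypothetical off-site stations (none of these are the fitted Stateline model): classify the current regime from observed wind direction, then apply that regime's predictive equation.

```python
import numpy as np

# Illustrative regime-switching space-time (RST) style forecast. The regime rule,
# predictors, and coefficients are placeholders, not the paper's fitted model.

# Per-regime linear coefficients: intercept plus weights on recent hourly wind
# speeds at the on-site station and two hypothetical off-site stations.
COEFS = {
    "westerly": np.array([0.4, 0.55, 0.30, 0.10]),
    "easterly": np.array([0.6, 0.70, 0.05, 0.20]),
}
SIGMA = {"westerly": 1.8, "easterly": 2.4}   # regime-specific predictive spread


def classify_regime(direction_deg):
    """Assign a forecast regime from the current wind direction at an off-site station."""
    return "westerly" if 180.0 <= direction_deg <= 360.0 else "easterly"


def rst_forecast(onsite, offsite_a, offsite_b, direction_deg):
    """Two-hour-ahead predictive mean and spread for hourly average wind speed."""
    regime = classify_regime(direction_deg)
    x = np.array([1.0, onsite, offsite_a, offsite_b])        # predictors
    mean = float(COEFS[regime] @ x)
    return regime, max(mean, 0.0), SIGMA[regime]             # wind speed is non-negative


if __name__ == "__main__":
    regime, mu, sigma = rst_forecast(onsite=7.2, offsite_a=9.1, offsite_b=6.4,
                                     direction_deg=250.0)
    print(f"regime={regime}, predictive mean={mu:.2f} m/s, spread={sigma:.2f} m/s")
```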


Journal of Economic Dynamics and Control | 2011

Tapping the Supercomputer Under Your Desk: Solving Dynamic Equilibrium Models with Graphics Processors

Eric M. Aldrich; Jesús Fernández-Villaverde; A. Ronald Gallant; Juan Francisco Rubio-Ramirez

This paper shows how to build algorithms that use graphics processing units (GPUs) installed in most modern computers to solve dynamic equilibrium models in economics. In particular, we rely on the compute unified device architecture (CUDA) of NVIDIA GPUs. We illustrate the power of the approach by solving a simple real business cycle model with value function iteration. We document improvements in speed of around 200 times and suggest that even further gains are likely.
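
The paper's implementation is written in CUDA; the sketch below shows the value function iteration it parallelizes, for a stripped-down deterministic growth model with log utility and full depreciation (simplifying assumptions, not the paper's calibrated RBC model), with the Bellman maximization vectorized over the capital grid — the step the GPU distributes across threads.

```python
import numpy as np

# Value function iteration for a deterministic growth model with log utility
# and full depreciation. The numpy code vectorizes the maximization on the CPU;
# a CUDA implementation would assign grid points to GPU threads.

alpha, beta = 0.33, 0.95                      # technology and discount parameters
k_grid = np.linspace(0.05, 0.5, 500)          # capital grid (illustrative size)

# Consumption for every (current k, next k') pair; infeasible choices get -inf utility.
C = k_grid[:, None] ** alpha - k_grid[None, :]
U = np.where(C > 0.0, np.log(np.maximum(C, 1e-12)), -np.inf)

V = np.zeros(k_grid.size)
for _ in range(1000):
    V_new = np.max(U + beta * V[None, :], axis=1)   # Bellman operator, vectorized over the grid
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = k_grid[np.argmax(U + beta * V[None, :], axis=1)]
print("optimal k' at the grid midpoint:", policy[k_grid.size // 2])
```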


B E Journal of Economic Analysis & Policy | 2005

Do People Value Racial Diversity? Evidence from Nielsen Ratings

Eric M. Aldrich; Peter Arcidiacono; Jacob L. Vigdor

Nielsen ratings for ABC's Monday Night Football are significantly higher when the game involves a black quarterback. In this paper, we consider competing explanations for this effect. First, quarterback race might proxy for other player or team attributes. Second, black viewership patterns might be sensitive to quarterback race. Third, viewers of all races might be exhibiting a taste for diversity. We use both ratings data and evidence on racial attitudes from the General Social Survey to test these hypotheses empirically. The evidence strongly supports the taste-for-diversity hypothesis, while suggesting some role for black own-race preferences as well.


Journal of Financial Econometrics | 2011

Habit, Long Run Risks, Prospect? A Statistical Inquiry

Eric M. Aldrich; A. Ronald Gallant

We use recently proposed Bayesian statistical methods to compare the habit persistence asset pricing model of Campbell and Cochrane, the long-run risks model of Bansal and Yaron, and the prospect theory model of Barberis, Huang, and Santos. We improve these Bayesian methods so that they can accommodate highly nonlinear models such as the three aforementioned. Our substantive results can be stated succinctly: If one believes that the extreme consumption fluctuations of 1930–1949 can recur, although they have not in the last sixty years even counting the current recession, then the long-run risks model is preferred. Otherwise, the habit model is preferred. We reach this conclusion by undertaking two types of comparisons, relative and absolute, over two sample periods, 1930–2008 and 1950–2008, using real, annual, U.S. data on stock returns, consumption growth, and the price to dividend ratio. Comparisons are conducted using a trivariate series of all three, a bivariate series comprised of consumption growth and stock returns, and a univariate series of stock returns alone. The prior for each model is that the ergodic mean of the real interest rate be 0.896 within ±1 with probability 0.95 together with a preference for model parameters that are near their published values. The prospect theory model is not considered for the trivariate series because it puts all its mass on a two-dimensional subspace thereby violating the regularity conditions of the methods employed. For the trivariate series, in the relative comparison, the long-run risks model dominates the habit model over the 1930–2008 period, while the habit persistence model dominates the long-run risks model over the 1950–2008 period; in the absolute assessment, both models fail over both sample periods. Empirical results for the bivariate series are explored more completely because it has the most substantive relevance. For the bivariate series, in the relative comparison, the long-run risks model dominates over the 1930–2008 period, while the habit persistence model dominates over the 1950–2008 period; in the absolute assessment, the habit model fails in the 1930–2008 period and the prospect theory model fails in the 1950–2008 period. Out-of-sample, the models show interesting differences in their forecasts over the 2009–2013 horizon. In-sample, all three models track the conditional volatility of stock returns about the same. They differ mainly in how they track the conditional volatility of consumption growth and the conditional correlation between consumption growth and stock returns. For the univariate series and for both sample periods, the models perform about the same in the relative comparison and fit the series reasonably well in the absolute assessment. The main value of the univariate series is that the near equal performance of the three models permits exploration of methodological issues.
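
Relative Bayesian model comparisons of this kind typically rest on posterior model probabilities implied by each model's marginal likelihood under its prior. As a purely arithmetic illustration (the log marginal likelihood values below are placeholders, not estimates from the paper), the step from log marginal likelihoods to posterior odds looks like this:

```python
import numpy as np

# Hypothetical log marginal likelihoods for two models on some sample --
# placeholder numbers, not estimates from the paper.
log_ml = {"habit": -412.3, "long_run_risks": -409.8}

names = list(log_ml)
vals = np.array([log_ml[m] for m in names])

# Posterior model probabilities under equal prior model weights,
# computed with a log-sum-exp shift for numerical stability.
shifted = vals - vals.max()
post = np.exp(shifted) / np.exp(shifted).sum()

for name, p in zip(names, post):
    print(f"P({name} | data) = {p:.3f}")

# Log Bayes factor of the first model against the second.
print("log Bayes factor (habit vs long-run risks):", vals[0] - vals[1])
```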


Handbook of Computational Economics | 2013

GPU Computing in Economics

Eric M. Aldrich

This paper discusses issues related to GPU computing for economic problems. It highlights new methodologies and resources that are available for solving and estimating economic models and emphasizes situations in which they are useful and others in which they are impractical. Two examples illustrate the different ways these GPU parallel methods can be employed to speed computation.
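
One of the workload patterns that maps well to GPUs is embarrassingly parallel simulation. The sketch below is an illustrative stand-in rather than a reconstruction of the paper's two examples: a simple AR(1) process simulated with numpy on the CPU; on a GPU the same array operations would run across thousands of threads.

```python
import numpy as np

# Simulate many independent AR(1) paths at once. The model and parameters are
# illustrative; each time step updates every path simultaneously, which is the
# kind of data-parallel work a GPU accelerates.

rho, sigma, T, n_paths = 0.9, 0.02, 200, 100_000

rng = np.random.default_rng(0)
shocks = sigma * rng.standard_normal((T, n_paths))

x = np.zeros((T, n_paths))
for t in range(1, T):
    x[t] = rho * x[t - 1] + shocks[t]          # one update for all paths

# A cross-simulation moment, e.g. as an input to simulation-based estimation.
print("cross-path std of x_T:", x[-1].std())
```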


Archive | 2017

Computational Methods for Production-Based Asset Pricing Models with Recursive Utility

Eric M. Aldrich; Howard Kung

We compare local and global polynomial solution methods for DSGE models with Epstein-Zin-Weil utility. We show that model implications for macroeconomic quantities are relatively invariant to the choice of solution method, but that a global method can yield substantial improvements for asset prices and welfare costs. The divergence in solution quality is highly dependent on parameters that affect value function sensitivity to TFP volatility, as well as on the magnitude of TFP volatility itself. This problem is pronounced for calibrations at the extreme of those accepted in the asset pricing literature and disappears for more traditional macroeconomic parameterizations.
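
To make the local-versus-global distinction concrete, the sketch below approximates a known policy function, k' = αβk^α from a log-utility, full-depreciation growth model (an assumption for illustration, not the paper's Epstein-Zin-Weil economy), once with a first-order expansion around the steady state and once with a global Chebyshev fit, then compares worst-case errors over the capital interval.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

alpha, beta = 0.33, 0.95
policy = lambda k: alpha * beta * k ** alpha            # known policy function (toy model)

k_lo, k_hi = 0.05, 0.5
k_ss = (alpha * beta) ** (1.0 / (1.0 - alpha))          # steady-state capital

# Local method: first-order expansion of the policy around the steady state.
slope = alpha ** 2 * beta * k_ss ** (alpha - 1.0)
local = lambda k: policy(k_ss) + slope * (k - k_ss)

# Global method: degree-8 Chebyshev least-squares fit on nodes spanning the interval.
nodes = 0.5 * (k_lo + k_hi) + 0.5 * (k_hi - k_lo) * np.cos(
    (2 * np.arange(20) + 1) * np.pi / 40)
coefs = C.chebfit(nodes, policy(nodes), deg=8)

k_test = np.linspace(k_lo, k_hi, 1000)
print("max |error|, local first-order :", np.max(np.abs(local(k_test) - policy(k_test))))
print("max |error|, global Chebyshev  :", np.max(np.abs(C.chebval(k_test, coefs) - policy(k_test))))
```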


2012 Meeting Papers | 2013

Trading Volume in General Equilibrium with Complete Markets

Eric M. Aldrich

This paper investigates asset trade in a general-equilibrium complete-markets endowment economy with heterogeneous agents. It shows that standard no-trade results cease to hold when agents have heterogeneous beliefs and that substantial trade volume is generated, even in the presence of a spanning set of assets. Further, trade volume and price movements have a positive relationship in the model, as is well documented in the empirical literature. This paper also develops a computational algorithm for solving finite-period heterogeneous-beliefs economies and demonstrates how the problem is well suited for large-scale parallel computing methods, such as GPU computing.
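
A minimal numerical illustration of the mechanism, assuming a two-agent, three-state, one-period endowment economy with log utility (a toy, not the paper's finite-period algorithm): with heterogeneous beliefs, agents trade Arrow securities toward the states they consider more likely, so equilibrium trade volume is strictly positive even though markets are complete.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy complete-markets endowment economy with heterogeneous beliefs and log
# utility (illustrative values throughout). With log utility, agent i spends
# the fraction pi_i(s) of wealth on the Arrow security for state s.

e = np.array([1.0, 1.0, 1.0])                  # aggregate endowment by state
w = np.array([[0.5, 0.5, 0.5],                 # individual endowments (rows: agents)
              [0.5, 0.5, 0.5]])
beliefs = np.array([[0.5, 0.3, 0.2],           # heterogeneous state probabilities
                    [0.2, 0.3, 0.5]])


def excess_demand(q_rest):
    q = np.concatenate(([1.0], q_rest))        # state-1 price is the numeraire
    wealth = w @ q                             # value of each agent's endowment
    demand = beliefs * wealth[:, None] / q     # c_i(s) = pi_i(s) * W_i / q(s)
    return (demand.sum(axis=0) - e)[1:]        # Walras' law: one equation is redundant


q = np.concatenate(([1.0], fsolve(excess_demand, np.ones(2))))
c = beliefs * (w @ q)[:, None] / q

print("Arrow security prices :", np.round(q, 3))
print("agent 1 net trades    :", np.round(c[0] - w[0], 3))   # nonzero despite complete markets
```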


Archive | 2005

Alternative Estimators of Wavelet Variance

Eric M. Aldrich

The wavelet variance is a scale-based decomposition of the process variance that is particularly well suited for analyzing intrinsically stationary processes. This decomposition has proven to be useful for studying various geophysical time series, including some related to sub-tidal sea level variations, vertical shear in the ocean and variations in soil composition along a transect. Previous work has established the large sample properties of an unbiased estimator of the wavelet variance formed using the non-boundary wavelet coefficients from the maximal overlap discrete wavelet transform (MODWT). The present work considers two alternative estimators, one of which is unbiased, and the other, biased. The new unbiased estimator is appropriate for asymmetric wavelet filters such as the Daubechies filters of width four and higher and is obtained from the non-boundary coefficients that result from running a wavelet filter through a time series in both a forward and a backward direction. The biased estimator is constructed in a similar fashion, but utilizes all wavelet coefficients that result from filtering a time series in forward and backward directions. While the two alternative estimators have the same asymptotic distribution as the original unbiased estimator (with some restrictions in the case of the biased estimator), they can have substantially better statistical properties in small samples. Formulas for evaluating the mean squared errors of the usual unbiased estimator and the two alternative estimators are developed and verified for several fractionally differenced processes via Monte Carlo experiments.
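
For concreteness, the usual unbiased estimator and its fully biased counterpart can be sketched at level 1 with the Haar filter (an assumption for brevity; the paper's new estimator targets the wider, asymmetric Daubechies filters): discard or keep the boundary-affected coefficient, then average the squared MODWT coefficients.

```python
import numpy as np

# Level-1 MODWT wavelet variance estimators, sketched with the Haar filter.

def haar_modwt_level1(x):
    """Circularly filtered level-1 MODWT coefficients, W_t = (x_t - x_{t-1}) / 2."""
    return 0.5 * (x - np.roll(x, 1))

def wavelet_variance_unbiased(x):
    """Average of squared non-boundary coefficients (the first coefficient is boundary-affected)."""
    w = haar_modwt_level1(x)
    return np.mean(w[1:] ** 2)

def wavelet_variance_biased(x):
    """Average of all squared coefficients, boundary ones included."""
    w = haar_modwt_level1(x)
    return np.mean(w ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(512)     # white noise: the level-1 Haar wavelet variance is sigma^2 / 2
    print("unbiased:", wavelet_variance_unbiased(x))
    print("biased  :", wavelet_variance_biased(x))
```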


Archive | 2018

Experiments in High-Frequency Trading: Testing the Frequent Batch Auction

Eric M. Aldrich; Kristian López Vargas

We implement a laboratory financial market in which traders can access costly technology that reduces communication latency with a remote exchange. In this environment, we conduct a market design study on high-frequency trading, contrasting two leading market formats: the Continuous Double Auction (CDA), also known as the continuous limit order book, which organizes trade in the majority of equities, futures, and currency exchanges around the world; and the newly proposed Frequent Batch Auction (FBA), which gives equal time priority to orders received within a short batching period. Our evidence suggests that, relative to the CDA, the FBA exhibits (1) less predatory trading behavior, (2) lower investment in low-latency communication technology, (3) lower transaction costs, and (4) lower volatility in market spreads and liquidity. Finally, examining transitory, off-equilibrium behavior, we find that transitory shocks to the environment affect market dynamics substantially more in the CDA than in the FBA.
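
A minimal sketch of the batch-clearing step that distinguishes the FBA from the CDA, assuming unit-size orders and a midpoint uniform-price rule (both simplifications; the laboratory implementation is richer): all orders arriving within a batching period are pooled and matched at a single price.

```python
# Clear one batch of unit-size orders at a uniform price (illustrative values).

def clear_batch(bids, asks):
    """Match the highest bids with the lowest asks; price all trades at the midpoint
    of the last crossing pair. Returns (number of trades, clearing price or None)."""
    bids = sorted(bids, reverse=True)          # highest willingness to pay first
    asks = sorted(asks)                        # lowest offers first
    trades = 0
    for b, a in zip(bids, asks):
        if b >= a:
            trades += 1
        else:
            break
    if trades == 0:
        return 0, None
    price = 0.5 * (bids[trades - 1] + asks[trades - 1])
    return trades, price

if __name__ == "__main__":
    # All orders arriving within the same batching period have equal time priority.
    bids = [101.0, 100.5, 99.0, 98.5]
    asks = [99.5, 100.0, 100.8, 102.0]
    print(clear_batch(bids, asks))             # -> (2, 100.25)
```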


Archive | 2017

Order Protection through Delayed Messaging

Eric M. Aldrich; Daniel Friedman

Several financial exchanges have recently introduced messaging delays (e.g., a 350-microsecond delay at IEX and NYSE American) intended to protect ordinary investors from high-frequency traders who exploit stale orders. We propose an equilibrium model of this exchange design as a modification of the standard continuous double auction market format. The model predicts that a messaging delay will generally improve price efficiency and lower transaction costs but will increase queuing costs. Some of the predictions are testable in the field or in a laboratory environment.

Collaboration


Dive into Eric M. Aldrich's collaborations.

Top Co-Authors

Jesús Fernández-Villaverde

National Bureau of Economic Research
