Jeremy Berkowitz
University of Houston
Publications
Featured research published by Jeremy Berkowitz.
Econometric Reviews | 2000
Jeremy Berkowitz; Lutz Kilian
In recent years, several new parametric and nonparametric bootstrap methods have been proposed for time series data. Which of these methods should applied researchers use? We provide evidence that for many applications in time series econometrics, parametric methods are more accurate, and we identify directions for future research on improving nonparametric methods. We explicitly address the important, but often neglected, issue of model selection in bootstrapping. In particular, we emphasize the advantages of the AIC over other lag order selection criteria and the need to account for lag order uncertainty in resampling. We also show that the block size plays an important role in determining the success of the block bootstrap, and we propose a data-based block size selection procedure.
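The block bootstrap the abstract refers to can be illustrated with a minimal moving-block resampler: overlapping blocks of consecutive observations are drawn at random and concatenated, preserving short-run dependence within each block. This is a generic sketch (the function name and arguments are illustrative); the paper's data-based block-size selection procedure is not reproduced here.

```python
import random

def moving_block_bootstrap(series, block_size, seed=None):
    """Resample a time series by concatenating randomly chosen
    overlapping blocks of length block_size (minimal sketch of the
    moving-block bootstrap; choosing block_size well is exactly the
    problem the paper addresses)."""
    rng = random.Random(seed)
    n = len(series)
    n_blocks = -(-n // block_size)  # ceiling division
    resampled = []
    for _ in range(n_blocks):
        start = rng.randrange(0, n - block_size + 1)
        resampled.extend(series[start:start + block_size])
    return resampled[:n]  # trim to the original sample length
```

A too-small block destroys the serial dependence the bootstrap is meant to preserve, while a too-large block leaves too few distinct blocks to resample from, which is why the choice matters.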
The Review of Economics and Statistics | 1997
Jeremy Berkowitz; Lorenzo Giorgianni
Several authors have recently investigated the predictability of exchange rates by fitting a sequence of long-horizon error-correction equations. We show by means of a simulation study that, in small to medium samples, inference from this regression procedure depends on the null hypothesis that is used to generate empirical critical values. The standard assumption of a stationary error-correction term between exchange rates and fundamentals biases the results in favor of predictive power. Our results show that evidence of long-horizon predictability weakens when using empirical critical values generated under the more stringent null of no cointegration. Likewise, results are weakened using critical values generated under the null that exchange rates and fundamentals are generated by an unrestricted VAR with no integration restrictions.
The Review of Economic Studies | 1998
Francis X. Diebold; Lee E. Ohanian; Jeremy Berkowitz
Many recent theoretical papers have come under attack for modeling prices as Geometric Brownian Motion. This process can diverge over time, implying that firms facing this price process can earn infinite profits. We explore the significance of this attack and contrast investment under Geometric Brownian Motion with investment assuming mean reversion. While analytically more complex, mean reversion in many cases is a more plausible assumption, allowing for supply responses to increasing prices. We derive results under a mean reversion process rather than Geometric Brownian Motion and provide an explanation for this result.
The Journal of Law and Economics | 1999
Jeremy Berkowitz; Richard M. Hynes
The recent explosion in personal bankruptcy filings has motivated research into whether credit markets are being adversely affected by generous legal provisions. Empirically, this question is examined by comparing credit conditions and bankruptcy exemptions across states. We note that the literature has focused on aggregate household credit, making no distinction between secured and unsecured credit. We argue that such aggregation obscures important differences in forms of credit. Most significantly, property exemptions do not prevent the home mortgage lender from foreclosing on the home if not fully repaid.
The Review of Economics and Statistics | 1998
Jeremy Berkowitz; Francis X. Diebold
We generalize the Franke-Härdle (1992) spectral-density bootstrap to the multivariate case. The extension is nontrivial and facilitates use of the Franke-Härdle bootstrap in frequency-domain econometric work, which often centers on cross-variable dynamic interactions. We document the bootstrap's good finite-sample performance in a small Monte Carlo experiment, and we conclude by highlighting key directions for future research.
Social Science Research Network | 1999
Jeremy Berkowitz
The forecast evaluation literature has traditionally focused on methods for assessing point forecasts. However, in the context of risk models, interest centers on more than just a single point of the forecast distribution. For example, value-at-risk (VaR) models, which are currently in extremely wide use, form interval forecasts. Many other important financial calculations also involve estimates not summarized by a point forecast. Although some techniques are currently available for assessing interval and density forecasts, none are suitable for the sample sizes typically available. This paper suggests a new approach to evaluating such forecasts. It requires evaluation of the entire forecast distribution, rather than a value-at-risk quantity. The information content of forecast distributions, combined with ex post loss realizations, is enough to construct a powerful test even with sample sizes as small as 100.
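Evaluating an entire forecast distribution typically works through the probability integral transform: each realization is passed through the forecast CDF, and under a correct forecast density the results are i.i.d. uniform, hence standard normal after an inverse-normal map. The sketch below (a simplified illustration in that spirit, not the paper's exact test; `density_forecast_lr` is a hypothetical name) applies a likelihood-ratio test of zero mean and unit variance to the transformed values.

```python
import math
from statistics import NormalDist

def density_forecast_lr(realizations, forecast_cdf):
    """PIT-based density forecast check (sketch).  Map realizations
    through the forecast CDF, then through the inverse standard-normal
    CDF; under a correct forecast density the result is i.i.d. N(0,1).
    Returns an LR statistic for zero mean / unit variance and its
    chi-squared(2) p-value."""
    std_normal = NormalDist()
    z = [std_normal.inv_cdf(forecast_cdf(x)) for x in realizations]
    n = len(z)
    mu = sum(z) / n
    var = sum((v - mu) ** 2 for v in z) / n
    # log-likelihood under the fitted N(mu, var) vs. the null N(0, 1)
    ll_fit = -0.5 * n * (math.log(2 * math.pi * var) + 1.0)
    ll_null = -0.5 * n * math.log(2 * math.pi) - 0.5 * sum(v * v for v in z)
    lr = 2.0 * (ll_fit - ll_null)
    pvalue = math.exp(-lr / 2.0)  # chi-squared(2) survival function
    return lr, pvalue
```

Because the test uses every transformed observation rather than only tail exceedances, it can retain power at the small sample sizes the abstract mentions.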
Social Science Research Network | 1999
Jeremy Berkowitz; Ionel Birgean; Lutz Kilian
In recent years, there has been increasing interest in nonparametric bootstrap inference for economic time series. Nonparametric resampling techniques help protect against overly optimistic inference in time series models of unknown structure. They are particularly useful for evaluating the fit of dynamic economic models in terms of their spectra, impulse responses, and related statistics, because they do not require a correctly specified economic model. Notwithstanding the potential advantages of nonparametric bootstrap methods, their reliability in small samples is questionable. In this paper, we provide a benchmark for the relative accuracy of several nonparametric resampling algorithms based on ARMA representations of four macroeconomic time series. For each algorithm, we evaluate the effective coverage accuracy of impulse response and spectral density bootstrap confidence intervals for standard sample sizes. We find that the autoregressive sieve approach based on the encompassing model is most accurate. However, care must be exercised in selecting the lag order of the autoregressive approximation.
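The autoregressive sieve approach the paper finds most accurate can be sketched in its simplest form: fit an autoregression by least squares, resample the centered residuals with replacement, and rebuild a pseudo-series from the estimated dynamics. The one-lag version below is a deliberate simplification (the paper's sieve uses a data-chosen lag order, which is exactly where it warns care is needed); `ar1_sieve_bootstrap` is an illustrative name.

```python
import random

def ar1_sieve_bootstrap(series, seed=None):
    """AR(1) illustration of the autoregressive sieve bootstrap:
    fit y_t = c + a*y_{t-1} + e_t by least squares, resample the
    centered residuals, and rebuild a pseudo-series."""
    rng = random.Random(seed)
    n = len(series)
    x, y = series[:-1], series[1:]
    mx, my = sum(x) / (n - 1), sum(y) / (n - 1)
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    c = my - a * mx
    resid = [yi - c - a * xi for xi, yi in zip(x, y)]
    mr = sum(resid) / len(resid)
    resid = [r - mr for r in resid]  # center the residuals
    boot = [series[0]]               # start from the observed value
    for _ in range(n - 1):
        boot.append(c + a * boot[-1] + rng.choice(resid))
    return boot
```

Repeating this many times and recomputing a statistic (an impulse response, a spectral density ordinate) on each pseudo-series yields the bootstrap confidence intervals whose coverage the paper evaluates.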
The Journal of Fixed Income | 1999
Jeremy Berkowitz
The values of interest rate swaps and many other financial assets are functions of rates or prices determined in over-the-counter, interbank, or other off-exchange markets. Settlement contracts rely on rates routinely collected through dealer polling. Many standard contracts use a technique known as trimmed means to guard against misreporting, whether unintentional or for market manipulation. The author examines this and other robust valuation procedures. Simulation indicates that other procedures may better guard against the worst-case scenario arising from a false report.
Social Science Research Network | 1998
Jeremy Berkowitz
The values of a vast array of financial assets are functions of rates or prices determined in OTC, interbank, or other off-exchange markets. In order to price such derivative assets, underlying rate and price indexes are routinely sampled and estimated. To guard against misreporting, whether unintentional or for market manipulation, many standard contracts utilize a technique known as trimmed means. This paper points out that this polling problem falls within the statistical framework of robust estimation. Intuitive criteria for choosing among robust valuation procedures are discussed. In particular, the approach taken is to minimize the worst-case scenario arising from a false report. The finite sample performance of the procedures that qualify, the trimmed mean and the Huber estimator, is examined in a set of simulation experiments.
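The two estimators the abstract compares can both be sketched in a few lines. The trimmed mean is the contract convention (real contracts fix their own trim counts); the Huber estimator is shown here via a generic iteratively reweighted average, an illustrative implementation rather than the paper's exact algorithm.

```python
import statistics

def trimmed_mean(quotes, trim=1):
    """Drop the `trim` highest and lowest quotes, average the rest
    (the polling convention described in the abstract; sketch only)."""
    s = sorted(quotes)
    kept = s[trim:len(s) - trim]
    return sum(kept) / len(kept)

def huber_estimate(quotes, c=1.5, tol=1e-8, max_iter=100):
    """Huber M-estimator of location, via iteratively reweighted
    averaging: quotes within c * scale of the current estimate get
    full weight; more distant quotes are down-weighted."""
    mu = statistics.median(quotes)
    # robust scale via the median absolute deviation
    scale = statistics.median(abs(q - mu) for q in quotes) or 1.0
    for _ in range(max_iter):
        w = [1.0 if abs(q - mu) <= c * scale else c * scale / abs(q - mu)
             for q in quotes]
        new_mu = sum(wi * q for wi, q in zip(w, quotes)) / sum(w)
        if abs(new_mu - mu) < tol:
            return new_mu
        mu = new_mu
    return mu
```

With a panel of quotes containing one false report, both estimators stay near the consensus level, which is the sense in which they bound the damage a single manipulated quote can do.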
Journal of Finance | 2002
Jeremy Berkowitz; James M. O'Brien