Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jeremy Staum is active.

Publication


Featured research published by Jeremy Staum.


Operations Research | 2010

Stochastic Kriging for Simulation Metamodeling

Bruce E. Ankenman; Barry L. Nelson; Jeremy Staum

We extend the basic theory of kriging, as applied to the design and analysis of deterministic computer experiments, to the stochastic simulation setting. Our goal is to provide flexible, interpolation-based metamodels of simulation output performance measures as functions of the controllable design or decision variables. To accomplish this we characterize both the intrinsic uncertainty inherent in a stochastic simulation and the extrinsic uncertainty about the unknown response surface. We use tractable examples to demonstrate why it is critical to characterize both types of uncertainty, derive general results for experiment design and analysis, and present a numerical example that illustrates the stochastic kriging method.
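The predictor described in this abstract can be illustrated with a small numerical sketch: extrinsic uncertainty is modeled by a spatial covariance kernel, and intrinsic uncertainty enters as a diagonal matrix of sample-mean variances added to the kernel matrix. This is a minimal illustration under assumed choices, not the authors' code: the Gaussian kernel, its parameters `tau2` and `theta`, and the constant-trend estimate `beta0` are all simplifying assumptions.

```python
import numpy as np

def gauss_cov(a, b, tau2=1.0, theta=2.0):
    # Extrinsic (spatial) covariance: Gaussian kernel on 1-D inputs
    d = a[:, None] - b[None, :]
    return tau2 * np.exp(-theta * d**2)

def sk_predict(x_design, ybar, var_mean, x_new, beta0=None):
    """Stochastic-kriging-style predictor at x_new.

    ybar     : sample mean of simulation output at each design point
    var_mean : intrinsic variance of each sample mean (s_i^2 / n_i)
    """
    if beta0 is None:
        beta0 = ybar.mean()          # crude constant-trend estimate
    Sigma = gauss_cov(x_design, x_design) + np.diag(var_mean)
    Sigma0 = gauss_cov(x_new, x_design)
    w = np.linalg.solve(Sigma, ybar - beta0)
    return beta0 + Sigma0 @ w

# toy experiment: noisy simulation observations of sin(x)
rng = np.random.default_rng(0)
x = np.linspace(0.0, np.pi, 8)
n_reps = 50
samples = np.sin(x)[:, None] + 0.3 * rng.standard_normal((8, n_reps))
ybar = samples.mean(axis=1)
var_mean = samples.var(axis=1, ddof=1) / n_reps
pred = sk_predict(x, ybar, var_mean, np.array([np.pi / 2]))
```

Because the intrinsic variances sit on the diagonal of the kernel matrix, the metamodel smooths noisy design points rather than interpolating them exactly, which is precisely the behavior that distinguishes stochastic kriging from deterministic kriging.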


Winter Simulation Conference | 2008

Stochastic kriging for simulation metamodeling

Bruce E. Ankenman; Barry L. Nelson; Jeremy Staum

We extend the basic theory of kriging, as applied to the design and analysis of deterministic computer experiments, to the stochastic simulation setting. Our goal is to provide flexible, interpolation-based metamodels of simulation output performance measures as functions of the controllable design or decision variables. To accomplish this we characterize both the intrinsic uncertainty inherent in a stochastic simulation and the extrinsic uncertainty about the unknown response surface. We use tractable examples to demonstrate why it is critical to characterize both types of uncertainty, derive general results for experiment design and analysis, and present a numerical example that illustrates the stochastic kriging method.


Winter Simulation Conference | 2009

Better simulation metamodeling: the why, what, and how of stochastic kriging

Jeremy Staum

Stochastic kriging is a methodology recently developed for metamodeling stochastic simulation. Stochastic kriging can partake of the behavior of kriging and of generalized least squares regression. This advanced tutorial explains regression, kriging, and stochastic kriging as metamodeling methodologies, emphasizing the consequences of misspecified models for global metamodeling. It provides an exposition of how to choose parameters in stochastic kriging and how to build a metamodel with it given simulation output, and discusses future research directions to enhance stochastic kriging.


IIE Transactions | 2016

Systemic Risk Components in a Network Model of Contagion

Jeremy Staum; Mingbin Feng; Ming Liu

We show how to perform a systemic risk attribution in a network model of contagion with interlocking balance sheets, using the Shapley and Aumann–Shapley values. Along the way, we establish new results on the sensitivity analysis of the Eisenberg–Noe network model of contagion, featuring a Markov chain interpretation. We illustrate the design process for systemic risk attribution methods by developing several examples.
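The Eisenberg–Noe clearing vector at the heart of this model can be computed by the standard fixed-point iteration p = min(p̄, e + Πᵀp). The three-bank balance sheet below is a made-up illustration, not data from the article, and this sketch shows only the clearing computation, not the paper's attribution method.

```python
import numpy as np

def clearing_vector(L, e, tol=1e-10, max_iter=1000):
    """Eisenberg-Noe clearing payments via fixed-point iteration.

    L[i, j] : nominal liability of bank i to bank j
    e[i]    : external assets of bank i
    """
    p_bar = L.sum(axis=1)                    # total obligations per bank
    with np.errstate(divide="ignore", invalid="ignore"):
        Pi = np.where(p_bar[:, None] > 0, L / p_bar[:, None], 0.0)
    p = p_bar.copy()
    for _ in range(max_iter):
        # each bank pays the smaller of its obligations and its assets
        p_new = np.minimum(p_bar, e + Pi.T @ p)
        if np.max(np.abs(p_new - p)) < tol:
            return p_new
        p = p_new
    return p

# toy 3-bank ring: 0 owes 1, 1 owes 2, 2 owes 0
L = np.array([[0.0, 10.0, 0.0],
              [0.0, 0.0, 10.0],
              [5.0, 0.0, 0.0]])
e = np.array([2.0, 1.0, 1.0])
p = clearing_vector(L, e)
```

In this example banks 0 and 1 default partially (paying 7 and 8 against obligations of 10) while bank 2 pays in full, showing how a shortfall propagates around the ring.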


Journal of Risk | 2010

Stochastic Kriging for Efficient Nested Simulation of Expected Shortfall

Ming Liu; Jeremy Staum

We use stochastic kriging, a metamodeling technique, to speed up nested simulation of expected shortfall, a portfolio risk measure. Evaluating a risk measure of a portfolio that includes derivative securities may require nested Monte Carlo simulation. The outer level simulates financial scenarios and the inner level of simulation estimates the portfolio value given a scenario. Spatial metamodeling enables inference about portfolio values in a scenario based on inner-level simulation of nearby scenarios, reducing the required computational effort: it is not necessary to perform inner-level simulation in every scenario. Because expected shortfall involves the scenarios that entail the largest losses, our procedure adaptively allocates more computational effort to inner-level simulation of those scenarios, which also improves computational efficiency.
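The two-level structure described here can be sketched in a few lines. This toy version uses a fixed inner-level budget rather than the paper's adaptive allocation and metamodeling, and the loss model (true loss equal to the scenario variable, observed with unit noise) is purely illustrative.

```python
import numpy as np

def nested_es(n_outer=1000, n_inner=200, alpha=0.05, seed=1):
    """Plain two-level nested simulation of expected shortfall (ES).

    Outer level: sample scenarios of a risk factor.
    Inner level: Monte Carlo estimate of portfolio loss given each scenario.
    ES: average of the worst alpha-fraction of estimated losses.
    """
    rng = np.random.default_rng(seed)
    scenarios = rng.standard_normal(n_outer)             # outer level
    # inner level: noisy observations of the true loss in each scenario
    inner = scenarios[:, None] + rng.standard_normal((n_outer, n_inner))
    loss_hat = inner.mean(axis=1)                        # per-scenario estimate
    k = int(np.ceil(alpha * n_outer))
    return np.sort(loss_hat)[-k:].mean()                 # tail average

es = nested_es()
```

For a standard normal loss the true 95% expected shortfall is about 2.06; the estimate lands near that value. The paper's point is that uniform inner budgets like this one waste effort on scenarios that clearly lie outside the tail.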


Quantitative Finance | 2012

Systemic Risk Components and Deposit Insurance Premia

Jeremy Staum

In light of recent events, there have been proposals to establish a theory of financial system risk management analogous to portfolio risk management. One important aspect of portfolio risk management is risk attribution, the process of decomposing a risk measure into components that are attributed to individual assets or activities. The theory of portfolio risk attribution has limited applicability to systemic risk because systems can have richer structure than portfolios. This article contributes to the theory of systemic risk attribution and illuminates the design process for systemic risk attribution by developing some schemes for attributing systemic risk in an application to deposit insurance.
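One family of attribution schemes discussed in this line of work, the Shapley value, can be computed exactly for a small system by enumerating subsets: each participant is charged its average marginal contribution to the system-wide risk over all orderings. The three-bank cost function below is a hypothetical example, not data or a scheme from the article.

```python
from itertools import combinations
from math import factorial

def shapley(players, v):
    """Shapley attribution of v(all players) to each player.

    v : set function mapping a frozenset of players to total systemic cost.
    """
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for r in range(n):
            for S in combinations(others, r):
                S = frozenset(S)
                # probability that S precedes p in a random ordering
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                phi[p] += w * (v(S | {p}) - v(S))
    return phi

# hypothetical cost function: a contagion channel between A and B adds cost
def v(S):
    base = {"A": 1.0, "B": 2.0, "C": 3.0}
    cost = sum(base[p] for p in S)
    if "A" in S and "B" in S:
        cost += 2.0
    return cost

phi = shapley(["A", "B", "C"], v)
```

The attribution is efficient by construction: the components sum to the total systemic cost, and the interaction cost of 2.0 is split equally between A and B while C, which interacts with no one, is charged only its stand-alone cost.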


Management Science | 2007

Simulation of Coherent Risk Measures Based on Generalized Scenarios

Vadim Lesnevski; Barry L. Nelson; Jeremy Staum

In financial risk management, coherent risk measures have been proposed as a way to avoid undesirable properties of measures such as value at risk that discourage diversification and do not account for the magnitude of the largest, and therefore most serious, losses. A coherent risk measure equals the maximum expected loss under several different probability measures, and these measures are analogous to “populations” or “systems” in the ranking-and-selection literature. However, unlike in ranking and selection, here it is the value of the maximum expectation under any of the probability measures, and not the identity of the probability measure that attains it, that is of interest. We propose procedures to form fixed-width, simulation-based confidence intervals for the maximum of several expectations, explore their correctness and computational efficiency, and illustrate them on risk-management problems. The availability of efficient algorithms for computing coherent risk measures will encourage their use for improved risk management.
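The simplest conservative way to obtain a fixed-confidence interval for a maximum of expectations is a Bonferroni argument: simultaneous confidence intervals for each mean yield, by taking maxima of the endpoints, a valid interval for the maximum itself. The paper's procedures are considerably more efficient (they screen out clearly inferior measures and target a fixed width); this naive sketch only illustrates the estimand, with made-up sample data.

```python
import numpy as np

Z = 2.807  # ~99.75% normal quantile: Bonferroni over k=2 measures at 99% overall

def max_expectation_ci(samples, z=Z):
    """Conservative Bonferroni CI for max_i E[X_i].

    samples : list of 1-D arrays, one per probability measure.
    If every per-mean interval covers its mean, then the max of the lower
    endpoints and the max of the upper endpoints bracket the true maximum.
    """
    lo, hi = -np.inf, -np.inf
    for x in samples:
        half = z * x.std(ddof=1) / np.sqrt(len(x))
        lo = max(lo, x.mean() - half)
        hi = max(hi, x.mean() + half)
    return lo, hi

# two hypothetical probability measures; the second has the larger mean (0.5)
rng = np.random.default_rng(3)
samples = [rng.normal(0.0, 1.0, 4000), rng.normal(0.5, 1.0, 4000)]
lo, hi = max_expectation_ci(samples)
```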


winter simulation conference | 2001

Simulation in financial engineering

Jeremy Staum

This paper presents an overview of the use of simulation algorithms in the field of financial engineering, assuming on the part of the reader no familiarity with finance and a modest familiarity with simulation methodology, but not its specialist research literature. The focus is on the challenges specific to financial simulations and the approaches that researchers have developed to handle them, although the paper does not constitute a comprehensive survey of the research literature. It offers to simulation researchers, professionals, and students an introduction to an application of increasing significance both within the simulation research community and among financial engineering practitioners.


Operations Research | 2010

A Confidence Interval Procedure for Expected Shortfall Risk Measurement via Two-Level Simulation

Hai Lan; Barry L. Nelson; Jeremy Staum

We develop and evaluate a two-level simulation procedure that produces a confidence interval for expected shortfall. The outer level of simulation generates financial scenarios, whereas the inner level estimates expected loss conditional on each scenario. Our procedure uses the statistical theory of empirical likelihood to construct a confidence interval. It also uses tools from the ranking-and-selection literature to make the simulation efficient.


Operations Research | 2011

Efficient Nested Simulation for Estimating the Variance of a Conditional Expectation

Yunpeng Sun; Daniel W. Apley; Jeremy Staum

In a two-level nested simulation, an outer level of simulation samples scenarios, while the inner level uses simulation to estimate a conditional expectation given the scenario. Applications include financial risk management, assessing the effects of simulation input uncertainty, and computing the expected value of gathering more information in decision theory. We show that an ANOVA-like estimator of the variance of the conditional expectation is unbiased under mild conditions, and we discuss the optimal number of inner-level samples to minimize this estimator's variance given a fixed computational budget. We show that as the computational budget increases, the optimal number of inner-level samples remains bounded. This finding contrasts with previous work on two-level simulation problems in which the inner- and outer-level sample sizes must both grow without bound for the estimation error to approach zero. The finding implies that the variance of a conditional expectation can be estimated to arbitrarily high precision by a simulation experiment with a fixed inner-level computational effort per scenario, which we call a one-and-a-half-level simulation. Because the optimal number of inner-level samples is often quite small, a one-and-a-half-level simulation can avoid the heavy computational burden typically associated with two-level simulation.
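The ANOVA-like idea can be stated concretely: the sample variance of the scenario means overestimates Var(E[Y|X]) by the average inner-level noise variance divided by the inner sample size, and subtracting that term removes the bias. The sketch below checks this on a toy model with a known answer; the model and sample sizes are illustrative, not the paper's experiments.

```python
import numpy as np

def var_cond_expectation(y):
    """Unbiased ANOVA-like estimator of Var(E[Y | X]).

    y : (k, n) array; row i holds n inner-level samples for scenario i.
    """
    k, n = y.shape
    means = y.mean(axis=1)
    between = means.var(ddof=1)            # inflated by inner-level noise
    within = y.var(axis=1, ddof=1).mean()  # estimates E[Var(Y | X)]
    return between - within / n            # bias correction

# toy check: X ~ N(0,1), Y | X ~ N(X, 2^2), so Var(E[Y|X]) = 1
rng = np.random.default_rng(7)
k, n = 20000, 4                            # many scenarios, few inner samples
x = rng.standard_normal((k, 1))
y = x + 2.0 * rng.standard_normal((k, n))
est = var_cond_expectation(y)
```

Note that only n = 4 inner samples per scenario are used; accuracy comes from growing the outer sample, which is exactly the one-and-a-half-level point.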

Collaboration


Dive into Jeremy Staum's collaborations.

Top Co-Authors

Ming Liu (Northwestern University)

Mingbin Feng (Northwestern University)

Peter Salemi (Northwestern University)

Hai Lan (Shanghai Jiao Tong University)