Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Matthew C. Gerstenberger is active.

Publication


Featured research published by Matthew C. Gerstenberger.


Bulletin of the Seismological Society of America | 2012

National Seismic Hazard Model for New Zealand: 2010 Update

Mark W. Stirling; Graeme H. McVerry; Matthew C. Gerstenberger; Nicola Litchfield; Russ Van Dissen; Kelvin Berryman; Philip M. Barnes; Laura M. Wallace; Pilar Villamor; Robert Langridge; Geoffroy Lamarche; Scott D. Nodder; Martin Reyners; Brendon A. Bradley; David A. Rhoades; Warwick Smith; A. Nicol; Jarg R. Pettinga; Kate Clark; Katrina Jacobs

A team of earthquake geologists, seismologists, and engineering seismologists has collectively produced an update of the national probabilistic seismic hazard (PSH) model for New Zealand (National Seismic Hazard Model, or NSHM). The new NSHM supersedes the earlier NSHM published in 2002 and used as the hazard basis for the New Zealand Loadings Standard and numerous other end-user applications. The new NSHM incorporates a fault source model that has been updated with over 200 new onshore and offshore fault sources and utilizes new New Zealand-based and international scaling relationships for the parameterization of the faults. The distributed seismicity model has also been updated to include post-1997 seismicity data, a new seismicity regionalization, and improved methodology for calculation of the seismicity parameters. Probabilistic seismic hazard maps produced from the new NSHM show a similar pattern of hazard to the earlier model at the national scale, but there are some significant reductions and increases in hazard at the regional scale. The national-scale differences between the new and earlier NSHM appear less than those seen between much earlier national models, indicating that some degree of consistency has been achieved in the national-scale pattern of hazard estimates, at least for return periods of 475 years and greater. Online Material: Table of fault source parameters for the 2010 national seismic-hazard model.
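For readers unfamiliar with how a probabilistic seismic hazard model of this kind is turned into hazard estimates, the sketch below illustrates the basic calculation: each source (fault or distributed-seismicity cell) contributes its annual rupture rate multiplied by the probability that the resulting ground motion at the site exceeds a given level. This is a minimal sketch of the general PSHA recipe, not the NSHM implementation; the dictionary fields and parameter values are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(sources, gm_levels):
    """Annual rate of exceeding each ground-motion level at a site, summed over
    earthquake sources. Each source is a dict with a hypothetical structure:
    an annual rupture rate plus the log-median and log-standard-deviation of
    the ground motion it produces at the site."""
    rates = np.zeros_like(gm_levels, dtype=float)
    for src in sources:
        # probability that one rupture of this source exceeds each level,
        # assuming a lognormal ground-motion prediction equation
        p_exceed = norm.sf(np.log(gm_levels), loc=src["ln_median_gm"], scale=src["ln_sigma"])
        rates += src["annual_rate"] * p_exceed
    return rates

# The "475-year" motion is the level whose annual exceedance rate is 1/475,
# i.e. roughly a 10% chance of exceedance in 50 years (illustrative numbers only).
sources = [{"annual_rate": 0.002, "ln_median_gm": np.log(0.3), "ln_sigma": 0.6},
           {"annual_rate": 0.050, "ln_median_gm": np.log(0.1), "ln_sigma": 0.7}]
levels = np.linspace(0.05, 1.0, 20)  # peak ground acceleration in g
print(hazard_curve(sources, levels))
```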


Bulletin of the Seismological Society of America | 2010

Likelihood-Based Tests for Evaluating Space-Rate-Magnitude Earthquake Forecasts

J. Douglas Zechar; Matthew C. Gerstenberger; David A. Rhoades

Abstract The five-year experiment of the Regional Earthquake Likelihood Models (RELM) working group was designed to compare several prospective forecasts of earthquake rates in latitude–longitude–magnitude bins in and around California. This forecast format is being used as a blueprint for many other earthquake predictability experiments around the world, and therefore it is important to consider how to evaluate the performance of such forecasts. Two tests that are currently used are based on the likelihood of the observed distribution of earthquakes given a forecast; one test compares the binned space–rate–magnitude observation and forecast, and the other compares only the rate forecast and the number of observed earthquakes. In this article, we discuss a subtle flaw in the current test of rate forecasts, and we propose two new tests that isolate the spatial and magnitude component, respectively, of a space–rate–magnitude forecast. For illustration, we consider the RELM forecasts and the distribution of earthquakes observed during the first half of the ongoing RELM experiment. We show that a space–rate–magnitude forecast may appear to be consistent with the distribution of observed earthquakes despite the spatial forecast being inconsistent with the spatial distribution of observed earthquakes, and we suggest that these new tests should be used to provide increased detail in earthquake forecast evaluation. We also discuss the statistical power of each of the likelihood-based tests and the stability (with respect to earthquake catalog uncertainties) of results from the likelihood-based tests.
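As a rough illustration of the likelihood-based evaluation discussed here, the sketch below computes the joint Poisson log-likelihood of binned observations given a binned rate forecast, the two tail probabilities used in a number (N) test, and a spatial log-likelihood in which the forecast is collapsed over magnitude and rescaled to the observed event count. It is a minimal sketch of the general idea, not the authors' implementation, and it assumes the forecast and observation are 2-D arrays of shape (space bins, magnitude bins); in practice the spatial and magnitude statistics are judged against distributions obtained from catalogs simulated from the forecast.

```python
import numpy as np
from scipy.stats import poisson

def joint_log_likelihood(forecast, observed):
    """Joint Poisson log-likelihood of binned observed counts given binned forecast rates."""
    return np.sum(poisson.logpmf(np.asarray(observed), np.asarray(forecast, float)))

def n_test(forecast, observed):
    """Number test: tail probabilities of the observed total event count under a
    Poisson distribution whose mean is the total forecast rate."""
    n_fore = float(np.sum(forecast))
    n_obs = int(np.sum(observed))
    p_at_least = poisson.sf(n_obs - 1, n_fore)   # P(N >= n_obs)
    p_at_most = poisson.cdf(n_obs, n_fore)       # P(N <= n_obs)
    return p_at_least, p_at_most

def spatial_log_likelihood(forecast, observed):
    """Spatial statistic: collapse the magnitude axis of a (space x magnitude)
    forecast, rescale it to match the observed number of events, and score the
    observed spatial counts."""
    forecast = np.asarray(forecast, float)
    observed = np.asarray(observed, float)
    spatial_fore = forecast.sum(axis=1)
    spatial_fore = spatial_fore * observed.sum() / spatial_fore.sum()
    spatial_obs = observed.sum(axis=1)
    return np.sum(poisson.logpmf(spatial_obs, spatial_fore))
```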


Bulletin of the Seismological Society of America | 2009

Mixture Models for Improved Short-Term Earthquake Forecasting

David A. Rhoades; Matthew C. Gerstenberger

Abstract The short-term earthquake probability (STEP) forecasting model applies the Omori–Utsu aftershock-decay relation and the Gutenberg–Richter frequency-magnitude relation to clusters of earthquakes. It is mainly intended to forecast aftershock activity and depends on a time-invariant background model to forecast most of the major earthquakes. On the other hand, the long-range earthquake forecasting model EEPAS (every earthquake a precursor according to scale) exploits the precursory scale increase phenomenon and associated predictive scaling relations to forecast the major earthquakes months, years, or decades in advance, depending on magnitude. Both models are shown to be more informative than time-invariant models of seismicity. By forming a mixture of the two, we aim to create an even more informative short-term forecasting model. Using the Advanced National Seismic System catalog of California over the period 1984–2004, the optimal mixture model for forecasting earthquakes with M ≥5.0 is a convex linear combination consisting of 0.42 of the EEPAS forecast and 0.58 of the STEP forecast. This mixture gives an average probability gain of more than 2 compared to each of the individual models. Several different mixture models will be submitted to the CSEP Testing Center at the Southern California Earthquake Center to ascertain whether or not this result is borne out by real-time tests of the models against future earthquakes.
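The mixture described here is simply a weighted average of the two binned rate forecasts, and the probability gain is the exponential of the mean log-likelihood difference per earthquake. The sketch below shows both operations for Poisson-rate forecasts; the 0.42/0.58 weights are the values quoted in the abstract, while the array shapes and function names are illustrative assumptions rather than the authors' code.

```python
import numpy as np

def mixture_forecast(rate_eepas, rate_step, w_eepas=0.42):
    """Convex linear combination of two binned rate forecasts (weights sum to 1)."""
    return w_eepas * np.asarray(rate_eepas, float) + (1.0 - w_eepas) * np.asarray(rate_step, float)

def probability_gain(rates_a, rates_b, observed_counts):
    """Average probability gain per earthquake of forecast A over forecast B:
    exp of the Poisson log-likelihood difference divided by the number of events."""
    rates_a = np.asarray(rates_a, float)
    rates_b = np.asarray(rates_b, float)
    observed_counts = np.asarray(observed_counts, float)
    # Poisson log-likelihood difference; terms common to both models cancel
    log_lik_diff = np.sum(observed_counts * (np.log(rates_a) - np.log(rates_b))) \
                   - (rates_a.sum() - rates_b.sum())
    return np.exp(log_lik_diff / observed_counts.sum())
```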


Earthquake Spectra | 2014

Seismic Hazard Modeling for the Recovery of Christchurch

Matthew C. Gerstenberger; Graeme H. McVerry; David A. Rhoades; Mark W. Stirling

New time-dependent seismicity models for the Christchurch region reflect the greatly enhanced seismicity in the region at present, and the gradual decrease of the seismicity over the next few decades. These seismicity models, along with modified ground-motion prediction equations and revised hazard calculation procedures have been used to derive new seismic hazard estimates for timeframes from months to 50 years. The hazard estimates have been used for a variety of applications crucial to planning and implementing the recovery of Christchurch. The new model includes higher amplitude spectra for designing new structures and assessing existing ones, magnitude-weighted peak ground acceleration hazard curves that account for duration effects for liquefaction assessment and remediation, and peak ground acceleration curves for evaluating the probabilities of rock falls. Particularly challenging has been the incorporation of time-varying hazard components into the redesign levels.
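Time-varying hazard of the kind described here is often summarised as the probability of exceeding a ground-motion level within an exposure window, obtained by integrating a decaying earthquake rate over that window. The sketch below is a generic illustration under an assumed Omori-style decay toward a long-term background rate; the function names and parameter values are placeholders, not quantities from the Christchurch model.

```python
import numpy as np

def time_varying_rate(t_years, background=0.01, k=0.5, c=0.05, p=1.1):
    """Assumed annual rate of damaging shaking: a long-term background rate plus
    an Omori-style aftershock term that decays over years to decades."""
    return background + k / (t_years + c) ** p

def exceedance_probability(t_start, duration, n_steps=1000):
    """Probability of at least one exceedance in [t_start, t_start + duration],
    treating the occurrences as a non-homogeneous Poisson process."""
    t = np.linspace(t_start, t_start + duration, n_steps)
    expected_events = np.trapz(time_varying_rate(t), t)
    return 1.0 - np.exp(-expected_events)

# Hazard over the next year versus a 50-year window starting now (illustrative only)
print(exceedance_probability(0.0, 1.0), exceedance_probability(0.0, 50.0))
```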


Earthquake Hazard, Risk and Disasters | 2014

Quantifying Improvements in Earthquake-Rupture Forecasts through Testable Models

Danijel Schorlemmer; Matthew C. Gerstenberger

Abstract The philosophy of Karl Popper suggests that scientific hypotheses should be evaluated in experiments and the results should be used to improve the underlying models. With these improved models, new hypotheses should be formulated and, again, put under test in experiments. Because earthquake-rupture forecast models are societally relevant products of seismological research and they influence public policy making, rigorous testing and evaluation of these models are a must. Recently, interest has been reinvigorated in developing tests for earthquake forecast models in the seismological community with the application of many forecast experiments and long-term testing underway. In this chapter, we describe some philosophies behind testing, the types of earthquake forecasts that are currently in development or under testing, and introduce test metrics and procedures. The difficulties of forecast and experiment development are laid out. In particular, we highlight the importance of testing centers for unbiased and fully prospective experiments, that is, testing earthquake forecasts against future observations. We show how the seismological community is embracing this philosophy and how it is applied to earthquake-rupture models and also to other aspects of seismic hazard assessment.


Seismological Research Letters | 2018

The Forecasting Skill of Physics-Based Seismicity Models during the 2010–2012 Canterbury, New Zealand, Earthquake Sequence

Camilla Cattania; M. Werner; Warner Marzocchi; Sebastian Hainzl; David A. Rhoades; Matthew C. Gerstenberger; Maria Liukis; William Savran; A. Christophersen; Agnès Helmstetter; Abigail Jiménez; Sandy Steacy; Thomas H. Jordan

The static Coulomb stress hypothesis is a widely known physical mechanism for earthquake triggering, and thus a prime candidate for physics-based Operational Earthquake Forecasting (OEF). However, the forecast skill of Coulomb-based seismicity models remains controversial, especially in comparison to empirical statistical models. A previous evaluation by the Collaboratory for the Study of Earthquake Predictability (CSEP) concluded that a suite of Coulomb-based seismicity models were less informative than empirical models during the aftershock sequence of the 1992 Mw 7.3 Landers, California, earthquake. Recently, a new generation of Coulomb-based and Coulomb/statistical hybrid models were developed that account better for uncertainties and secondary stress sources. Here, we report on the performance of this new suite of models in comparison to empirical Epidemic Type Aftershock Sequences (ETAS) models during the 2010-2012 Canterbury, New Zealand, earthquake sequence. Comprising the 2010 M7.1 Darfield earthquake and three subsequent M ≥ 5.9 shocks (including the February 2011 Christchurch earthquake), this sequence provides a wealth of data (394 M ≥ 3.95 shocks). We assessed models over multiple forecast horizons (1-day, 1-month and 1-year, updated after M ≥ 5.9 shocks). The results demonstrate substantial improvements in the Coulomb-based models. Purely physics-based models have a performance comparable to the ETAS model, and the two Coulomb/statistical hybrids perform better or as well as the corresponding statistical model. On the other hand, an ETAS model with anisotropic (fault-based) aftershock zones is just as informative. These results provide encouraging evidence for the predictive power of Coulomb-based models. To assist with model development, we identify discrepancies between forecasts and observations.
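The ETAS models used as the empirical benchmark in this comparison forecast seismicity rates from a background rate plus aftershock contributions from all prior events. The sketch below shows a purely temporal version of the ETAS conditional intensity to illustrate the idea; the parameter values are arbitrary placeholders, not fitted values from any of the Canterbury models.

```python
import numpy as np

def etas_temporal_intensity(t, event_times, event_mags,
                            mu=0.2, K=0.02, alpha=1.0, c=0.01, p=1.1, m_c=3.95):
    """Temporal ETAS conditional intensity (events per day) at time t:
    a constant background rate mu plus Omori-Utsu decaying contributions
    from every earlier event, scaled exponentially by its magnitude."""
    event_times = np.asarray(event_times, float)
    event_mags = np.asarray(event_mags, float)
    past = event_times < t
    dt = t - event_times[past]
    triggered = K * np.exp(alpha * (event_mags[past] - m_c)) * (dt + c) ** (-p)
    return mu + triggered.sum()

# Rate one day after an M 7.1 mainshock at t = 0 (illustrative parameters only)
print(etas_temporal_intensity(1.0, [0.0], [7.1]))
```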


Science Advances | 2018

Earthquakes drive large-scale submarine canyon development and sediment supply to deep-ocean basins

Joshu J. Mountjoy; Jamie D. Howarth; Alan R. Orpin; Philip M. Barnes; David A. Bowden; Ashley A. Rowden; Alexandre C. G. Schimel; Caroline Holden; Huw J. Horgan; Scott D. Nodder; Jason R. Patton; Geoffroy Lamarche; Matthew C. Gerstenberger; Aaron Micallef; Arne Pallentin; Tim Kane

Coseismic canyon flushing reveals how earthquakes drive canyon development and deep-sea sediment dispersal on active margins. Although the global flux of sediment and carbon from land to the coastal ocean is well known, the volume of material that reaches the deep ocean—the ultimate sink—and the mechanisms by which it is transferred are poorly documented. Using a globally unique data set of repeat seafloor measurements and samples, we show that the moment magnitude (Mw) 7.8 November 2016 Kaikōura earthquake (New Zealand) triggered widespread landslides in a submarine canyon, causing a powerful “canyon flushing” event and turbidity current that traveled >680 km along one of the world’s longest deep-sea channels. These observations provide the first quantification of seafloor landscape change and large-scale sediment transport associated with an earthquake-triggered full canyon flushing event. The calculated interevent time of ~140 years indicates a canyon incision rate of 40 mm year⁻¹, substantially higher than that of most terrestrial rivers, while synchronously transferring large volumes of sediment [850 metric megatons (Mt)] and organic carbon (7 Mt) to the deep ocean. These observations demonstrate that earthquake-triggered canyon flushing is a primary driver of submarine canyon development and material transfer from active continental margins to the deep ocean.


Seismological Research Letters | 2016

Appraising the PSHA Earthquake Source Models of Japan, New Zealand, and Taiwan

Marco Pagani; Ken Xiansheng Hao; Hiroyuki Fujiwara; Matthew C. Gerstenberger; Kuo-Fong Ma

ABSTRACT Earthquake‐hazard models are one of the major contributions provided by the seismological community to tangibly support disaster risk reduction policies at a national level. Although the societal impact of hazard analyses can be huge, the development of models remains a scientific activity developed within frameworks that have a strong national emphasis and partial international recognition. However, broad acceptability of hazard models is a key aspect for achieving authoritativeness and a prerequisite for warranting that their construction is completed using well‐recognized methodologies. As part of an enduring international collaboration between leading organizations operating in the hazard and risk fields in Japan, New Zealand, Taiwan, and the Global Earthquake Model initiative, we discuss the main characteristics of the earthquake source models—as implemented for the OpenQuake engine—used for the calculation of the most recent national seismic‐hazard maps of these three countries. Particular emphasis is placed on comparing the various modeling options adopted in the different tectonic regions, on emphasizing commonalities, and on discussing the most controversial modeling solutions. Despite the many connections from a seismotectonic point of view between the three countries, the comparison highlights different modeling choices for the various tectonic regions, which constitute a spectrum of possible epistemic uncertainties, as well as modeling issues that could be collectively explored in future phases of this international collaboration.


Acta Geophysica | 2011

Efficient testing of earthquake forecasting models

David A. Rhoades; Danijel Schorlemmer; Matthew C. Gerstenberger; A. Christophersen; J. Douglas Zechar; Masajiro Imoto


Bulletin of the Seismological Society of America | 2013

Regional Earthquake Likelihood Models I: First‐Order Results

J. Douglas Zechar; Danijel Schorlemmer; M. Werner; Matthew C. Gerstenberger; David A. Rhoades; Thomas H. Jordan

Collaboration


Dive into Matthew C. Gerstenberger's collaboration.

Top Co-Authors

Danijel Schorlemmer, University of Southern California

Thomas H. Jordan, United States Geological Survey

Maria Liukis, Jet Propulsion Laboratory

M. Werner, University of Bristol