Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where David A. Rhoades is active.

Publication


Featured research published by David A. Rhoades.


Bulletin of the Seismological Society of America | 2012

National Seismic Hazard Model for New Zealand: 2010 Update

Mark W. Stirling; Graeme H. McVerry; Matthew C. Gerstenberger; Nicola Litchfield; Russ Van Dissen; Kelvin Berryman; Philip M. Barnes; Laura M. Wallace; Pilar Villamor; Robert Langridge; Geoffroy Lamarche; Scott D. Nodder; Martin Reyners; Brendon A. Bradley; David A. Rhoades; Warwick Smith; A. Nicol; Jarg R. Pettinga; Kate Clark; Katrina Jacobs

A team of earthquake geologists, seismologists, and engineering seismologists has collectively produced an update of the national probabilistic seismic hazard (PSH) model for New Zealand (National Seismic Hazard Model, or NSHM). The new NSHM supersedes the earlier NSHM published in 2002 and used as the hazard basis for the New Zealand Loadings Standard and numerous other end-user applications. The new NSHM incorporates a fault source model that has been updated with over 200 new onshore and offshore fault sources and utilizes new New Zealand-based and international scaling relationships for the parameterization of the faults. The distributed seismicity model has also been updated to include post-1997 seismicity data, a new seismicity regionalization, and improved methodology for calculation of the seismicity parameters. Probabilistic seismic hazard maps produced from the new NSHM show a similar pattern of hazard to the earlier model at the national scale, but there are some significant reductions and increases in hazard at the regional scale. The national-scale differences between the new and earlier NSHM appear less than those seen between much earlier national models, indicating that some degree of consistency has been achieved in the national-scale pattern of hazard estimates, at least for return periods of 475 years and greater. Online Material: Table of fault source parameters for the 2010 national seismic hazard model.
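
As a rough illustration of the probabilistic seismic hazard calculation underlying such maps, the sketch below sums annual exceedance rates over a handful of hypothetical sources through a toy ground-motion model with lognormal scatter. All source rates, coefficients, and distances are invented for illustration and are not values from the NSHM.

```python
# Toy probabilistic seismic hazard calculation: annual exceedance rate summed
# over sources, each contributing its event rate times the probability that
# the ground motion exceeds the target level. All numbers are assumptions.
import math
from scipy.stats import norm

# (annual event rate, magnitude, source-to-site distance km) -- hypothetical
sources = [(0.002, 7.8, 30.0), (0.01, 6.5, 15.0), (0.05, 5.5, 10.0)]

SIGMA = 0.6  # assumed lognormal scatter of the toy ground-motion model

def ln_median_pga(m, r):
    """Toy ground-motion model: ln of median PGA in g (coefficients invented)."""
    return -3.5 + 0.9 * m - 1.2 * math.log(r + 10.0)

def exceedance_rate(pga_g):
    """Annual rate that PGA exceeds pga_g at the site, summed over sources."""
    rate = 0.0
    for nu, m, r in sources:
        z = (math.log(pga_g) - ln_median_pga(m, r)) / SIGMA
        rate += nu * norm.sf(z)  # event rate x P(exceedance | event)
    return rate

for a in (0.1, 0.2, 0.4):
    lam = exceedance_rate(a)
    print(f"PGA {a:.1f} g: {lam:.4f}/yr (return period ~{1 / lam:,.0f} yr)")
```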


Bulletin of the Seismological Society of America | 2002

Comparison of Earthquake Scaling Relations Derived from Data of the Instrumental and Preinstrumental Era

Mark W. Stirling; David A. Rhoades; Kelvin Berryman

Estimates of surface rupture displacement and magnitude for crustal earthquakes from the preinstrumental era (pre-1900) tend to be greater than the corresponding estimates derived from modern scaling relations. We investigate this tendency using an expanded and updated version of the earthquake dataset of Wells and Coppersmith (1994) to fit regression relations of moment magnitude on surface rupture length and rupture area and average surface displacement on surface rupture length. Separate relations are fitted to preinstrumental and instrumental data and the results compared to the equivalent relations of Wells and Coppersmith. We find that our relations for instrumental data remove some, but not all, of the differences between the preinstrumental data and the relations of Wells and Coppersmith. We attribute the remaining differences largely to natural censoring of surface displacements less than about 1 m and surface rupture lengths less than about 5 km from the dataset for the preinstrumental era because regressions constructed from similarly censored instrumental data are indistinguishable from the preinstrumental regressions. Since the regressions for our censored instrumental data (i.e., restricted to moderate to large earthquakes) are different from regressions for our complete dataset of instrumental earthquakes and from the regressions of Wells and Coppersmith (both with a larger proportion of small-to-moderate earthquakes), the results may indicate that large earthquakes have different scaling relationships from those of smaller earthquakes.
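
The regression and censoring effect described here can be illustrated with a short sketch: fit moment magnitude against log surface-rupture length on a full synthetic dataset, then refit after removing ruptures shorter than 5 km. The synthetic data are generated from coefficients in the spirit of Wells and Coppersmith (1994), not from the paper's dataset.

```python
# Fit Mw = a + b log10(L) to synthetic data, then refit after censoring
# ruptures shorter than 5 km to mimic the preinstrumental record.
import numpy as np

rng = np.random.default_rng(0)
length_km = np.exp(rng.uniform(np.log(1.0), np.log(400.0), 200))
mw = 5.08 + 1.16 * np.log10(length_km) + rng.normal(0.0, 0.28, 200)

def fit(L, M):
    slope, intercept = np.polyfit(np.log10(L), M, 1)
    return intercept, slope

a_all, b_all = fit(length_km, mw)
keep = length_km >= 5.0  # natural censoring of short surface ruptures
a_cen, b_cen = fit(length_km[keep], mw[keep])
print(f"all data:        Mw = {a_all:.2f} + {b_all:.2f} log10(L)")
print(f"censored L>=5km: Mw = {a_cen:.2f} + {b_cen:.2f} log10(L)")
```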


Bulletin of the Seismological Society of America | 2010

Likelihood-Based Tests for Evaluating Space-Rate-Magnitude Earthquake Forecasts

J. Douglas Zechar; Matthew C. Gerstenberger; David A. Rhoades

The five-year experiment of the Regional Earthquake Likelihood Models (RELM) working group was designed to compare several prospective forecasts of earthquake rates in latitude–longitude–magnitude bins in and around California. This forecast format is being used as a blueprint for many other earthquake predictability experiments around the world, and therefore it is important to consider how to evaluate the performance of such forecasts. Two tests that are currently used are based on the likelihood of the observed distribution of earthquakes given a forecast; one test compares the binned space–rate–magnitude observation and forecast, and the other compares only the rate forecast and the number of observed earthquakes. In this article, we discuss a subtle flaw in the current test of rate forecasts, and we propose two new tests that isolate the spatial and magnitude component, respectively, of a space–rate–magnitude forecast. For illustration, we consider the RELM forecasts and the distribution of earthquakes observed during the first half of the ongoing RELM experiment. We show that a space–rate–magnitude forecast may appear to be consistent with the distribution of observed earthquakes despite the spatial forecast being inconsistent with the spatial distribution of observed earthquakes, and we suggest that these new tests should be used to provide increased detail in earthquake forecast evaluation. We also discuss the statistical power of each of the likelihood-based tests and the stability (with respect to earthquake catalog uncertainties) of results from the likelihood-based tests.
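
A minimal sketch of the binned Poisson likelihood these tests are built on, plus the idea behind a spatial test: collapse the magnitude axis and rescale the forecast to the observed event count so only the spatial pattern is scored. The grids below are synthetic, and the full tests also use simulated catalogs to derive quantiles, which this sketch omits.

```python
# Binned Poisson log-likelihood of a space-rate-magnitude forecast, and the
# spatial statistic: collapse magnitude bins and rescale the forecast to the
# observed event count so only the spatial pattern is scored.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
forecast = rng.gamma(2.0, 0.01, size=(20, 20, 5))  # expected counts per bin
observed = rng.poisson(forecast)                   # synthetic "catalog"

def joint_log_likelihood(fc, obs):
    """Sum of Poisson log-probabilities over all bins."""
    return poisson.logpmf(obs, fc).sum()

print("full space-rate-magnitude log-likelihood:",
      joint_log_likelihood(forecast, observed))

fc_space = forecast.sum(axis=2)                    # drop magnitude axis
obs_space = observed.sum(axis=2)
fc_space *= obs_space.sum() / fc_space.sum()       # remove rate dependence
print("spatial-only log-likelihood:",
      joint_log_likelihood(fc_space, obs_space))
```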


New Zealand Journal of Geology and Geophysics | 1993

The precursory earthquake swarm in New Zealand: Hypothesis tests

F. F. Evison; David A. Rhoades

A series of tests is being carried out with the object of determining whether the relations evident in New Zealand and Japan, between precursory swarms and major earthquakes, are of value for long‐range, synoptic earthquake forecasting. The first New Zealand test (completed in 1990) showed that clustering, both of swarms and of mainshock events, should be allowed for, and the hypothesis was reformulated accordingly. The second New Zealand test, now completed, confirms the importance of clustering. It also reveals that the applicability of the swarm/mainshock relations is strongly affected by large‐scale tectonics. Further, a simulation study shows that, at the end of the test, the performance of the swarm hypothesis relative to the Poisson model lay between what would be expected if the hypothesis and the Poisson model, respectively, were correct in general. These results support a further reformulation of the hypothesis, which is now taken to be strongly applicable in the Hikurangi and Fiordland...
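
The simulation comparison mentioned above can be sketched as follows: compute the log-likelihood ratio of a swarm-hypothesis rate model over a time-invariant Poisson baseline, under catalogs simulated from each model in turn, to see which regime an observed ratio falls between. All rates here are invented placeholders, not the study's models.

```python
# Log-likelihood ratio of a synthetic swarm-hypothesis rate model over a
# time-invariant Poisson baseline, under catalogs simulated from each model.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)
baseline = np.full(100, 0.05)                # Poisson model rates (toy)
swarm = baseline * rng.gamma(2.0, 0.5, 100)  # hypothesis rates (toy)

def llr(obs):
    """Log-likelihood ratio: swarm hypothesis vs Poisson baseline."""
    return (poisson.logpmf(obs, swarm) - poisson.logpmf(obs, baseline)).sum()

if_poisson = [llr(rng.poisson(baseline)) for _ in range(2000)]
if_swarm = [llr(rng.poisson(swarm)) for _ in range(2000)]
print(f"expected LLR if Poisson model true: {np.mean(if_poisson):+.1f}")
print(f"expected LLR if swarm model true:   {np.mean(if_swarm):+.1f}")
```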


New Zealand Journal of Geology and Geophysics | 2003

Estimates of the time-varying hazard of rupture of the Alpine Fault, New Zealand, allowing for uncertainties

David A. Rhoades; R. Van Dissen

The time‐varying hazard of rupture of the Alpine Fault is estimated using a renewal process model and a statistical method that takes account of uncertainties in data and parameter values. Four different recurrence‐time distributions are considered. The central and southern sections of the fault are treated separately. Data inputs are based on estimates of the long‐term slip rate, the average single‐event displacement, and the dates of earthquakes that have occurred in the last 1000 yr from previous studies of fault traces, landslide and terrace records, and forest ages and times of disturbance. Using these data and associated uncertainties, the current hazard of rupture on the central section of the fault is estimated to be 0.0051, 0.010, 0.012, and 0.0073 events per year under the exponential, lognormal, Weibull, and inverse Gaussian recurrence‐time distributions, respectively. The corresponding probabilities of rupture in the next 20 yr are 10, 18, 21, and 14%, respectively. The current hazard on the southern section of the fault is estimated to be 0.0033, 0.0075, 0.0070, and 0.0053 events per year for the four models, and the 20 yr probabilities 6, 14, 13, and 10%, respectively. Increased precision in the date of the second to last event on the southern section of the fault would result in only small changes to these rates and probabilities. The indicated hazard under the lognormal model is about double the long‐term average rate but less than half of that estimated in previous studies that did not take account of all the uncertainties. Dating additional prehistoric ruptures is likely to have a greater effect on the hazard estimates than improved precision in the existing data.
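
A minimal sketch of the renewal-process calculation, assuming an illustrative mean recurrence interval, elapsed time, and lognormal aperiodicity rather than the paper's Alpine Fault estimates: the instantaneous hazard is the recurrence density divided by the survival function, and the 20 yr rupture probability is conditioned on survival to the present.

```python
# Renewal-model hazard: instantaneous rate and conditional 20 yr rupture
# probability given the elapsed time since the last event. Mean recurrence,
# elapsed time, and aperiodicity are illustrative assumptions.
import numpy as np
from scipy.stats import expon, lognorm

MEAN_RECURRENCE = 250.0  # yr (assumed)
ELAPSED = 290.0          # yr since last rupture (assumed)
WINDOW = 20.0            # forecast horizon, yr

models = {
    "exponential": expon(scale=MEAN_RECURRENCE),
    # lognormal with the same mean; sigma = 0.5 is an assumed aperiodicity
    "lognormal": lognorm(s=0.5, scale=MEAN_RECURRENCE / np.exp(0.5**2 / 2)),
}

for name, dist in models.items():
    surv = dist.sf(ELAPSED)            # P(no rupture up to now)
    hazard = dist.pdf(ELAPSED) / surv  # events per yr, right now
    p20 = (dist.cdf(ELAPSED + WINDOW) - dist.cdf(ELAPSED)) / surv
    print(f"{name:11s} hazard {hazard:.4f}/yr, P(rupture in 20 yr) = {p20:.0%}")
```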


Bulletin of the Seismological Society of America | 2009

Mixture Models for Improved Short-Term Earthquake Forecasting

David A. Rhoades; Matthew C. Gerstenberger

The short-term earthquake probability (STEP) forecasting model applies the Omori–Utsu aftershock-decay relation and the Gutenberg–Richter frequency-magnitude relation to clusters of earthquakes. It is mainly intended to forecast aftershock activity and depends on a time-invariant background model to forecast most of the major earthquakes. On the other hand, the long-range earthquake forecasting model EEPAS (every earthquake a precursor according to scale) exploits the precursory scale increase phenomenon and associated predictive scaling relations to forecast the major earthquakes months, years, or decades in advance, depending on magnitude. Both models are shown to be more informative than time-invariant models of seismicity. By forming a mixture of the two, we aim to create an even more informative short-term forecasting model. Using the Advanced National Seismic System catalog of California over the period 1984–2004, the optimal mixture model for forecasting earthquakes with M ≥5.0 is a convex linear combination consisting of 0.42 of the EEPAS forecast and 0.58 of the STEP forecast. This mixture gives an average probability gain of more than 2 compared to each of the individual models. Several different mixture models will be submitted to the CSEP Testing Center at the Southern California Earthquake Center to ascertain whether or not this result is borne out by real-time tests of the models against future earthquakes.
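
The mixture itself is simple to write down: a convex combination of the two binned rate forecasts with the paper's weights, scored by probability gain per earthquake against a time-invariant reference. The rate grids below are synthetic stand-ins for the real EEPAS and STEP forecasts.

```python
# Convex mixture of two binned rate forecasts with the paper's weights,
# scored by probability gain per earthquake against a time-invariant model.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
eepas = rng.gamma(2.0, 0.05, 400)       # expected counts per bin (synthetic)
step = rng.gamma(2.0, 0.05, 400)
reference = np.full(400, eepas.mean())  # time-invariant baseline
observed = rng.poisson(0.42 * eepas + 0.58 * step)

mixture = 0.42 * eepas + 0.58 * step    # weights from the paper

def log_lik(fc, obs):
    return poisson.logpmf(obs, fc).sum()

n = observed.sum()
gain = np.exp((log_lik(mixture, observed) - log_lik(reference, observed)) / n)
print(f"probability gain per earthquake over the baseline: {gain:.2f}")
```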


Journal of Geophysical Research | 1994

On the handling of uncertainties in estimating the hazard of rupture on a fault segment

David A. Rhoades; R. Van Dissen; D. J. Dowrick

Uncertainties in data and parameter values have often been ignored in hazard estimates based on historic and prehistoric records of rupture on fault segments. A mixture of distributions approach is appropriate to handle uncertainties in parameters of recurrence time distributions estimated from the geological and historical earthquake record of a fault segment, and a mixture of hazards approach is appropriate for data uncertainties and for uncertainties in parameters estimated from a set of similar faults. The former approach admits updating of the distributions for uncertainty as time passes. The aim is to present the hazard as a single value which takes account of both data and parameter uncertainties, conditional only on modeling assumptions. The proposed methods are described in detail for the exponential and lognormal recurrence time models for fault-rupturing earthquakes and applied, by way of illustration, to selected fault segments, namely, the Mojave segment of the San Andreas fault, California, and the Wellington-Hutt Valley segment of the Wellington fault, New Zealand.
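
A sketch of the mixture-of-hazards idea under assumed sampling distributions: instead of plugging best-estimate parameters into a lognormal renewal model, average the hazard rate over Monte Carlo draws from the parameter uncertainty. The parameter distributions and elapsed time below are illustrative, not estimates for any real fault segment.

```python
# Mixture-of-hazards sketch: average the lognormal renewal hazard over Monte
# Carlo draws of the uncertain parameters instead of plugging in one estimate.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(3)
ELAPSED = 150.0  # yr since last rupture (assumed)

mu = rng.normal(np.log(220.0), 0.15, 10_000)  # ln(median recurrence), uncertain
sigma = rng.uniform(0.3, 0.7, 10_000)         # shape parameter, uncertain

def hazard(mu, sigma, t):
    """Lognormal renewal hazard rate at elapsed time t."""
    dist = lognorm(s=sigma, scale=np.exp(mu))
    return dist.pdf(t) / dist.sf(t)

print(f"point-estimate hazard: {hazard(np.log(220.0), 0.5, ELAPSED):.5f}/yr")
print(f"uncertainty-averaged:  {hazard(mu, sigma, ELAPSED).mean():.5f}/yr")
```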


Earthquake Spectra | 2014

Determining Rockfall Risk in Christchurch Using Rockfalls Triggered by the 2010–2011 Canterbury Earthquake Sequence

Chris Massey; Mauri J. McSaveney; Tony Taig; Laurie Richards; Nicola Litchfield; David A. Rhoades; Graeme H. McVerry; Biljana Lukovic; David Heron; William Ries; Russ Van Dissen

The Canterbury earthquake sequence triggered thousands of rockfalls in the Port Hills of Christchurch, New Zealand, with over 6,000 falling on 22 February 2011. Several hundred families were evacuated after about 200 homes were hit. We characterized the rockfalls by boulder-size distribution, runout distance, source-area dimensions, and boulder-production rates over a range of triggering peak ground accelerations. Using these characteristics, a time-varying seismic hazard model for Canterbury, and estimates of residential occupancy rates and resident vulnerability, we estimated annual individual fatality risk from rockfall in the Port Hills. The results demonstrate the Port Hills rockfall risk is time-variable, decreasing as the seismic hazard decreases following the main earthquakes in February and June 2011. This presents a real challenge for formulating robust land-use and reconstruction policy in the Port Hills.
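
The risk chain described here reduces to a short calculation: sum, over bands of triggering peak ground acceleration, the annual rate of the trigger times the chance a boulder reaches the dwelling, then scale by occupancy and vulnerability. Every number in the sketch is an assumed placeholder, not a value from the study.

```python
# Annual individual fatality risk: for each peak-ground-acceleration band,
# annual trigger rate x P(boulder reaches the dwelling), scaled by occupancy
# and vulnerability. Every number here is an assumed placeholder.
pga_bands = [
    # (annual rate of trigger, P(boulder reaches dwelling | trigger))
    (0.05, 0.001),
    (0.01, 0.02),
    (0.002, 0.15),
]
OCCUPANCY = 0.7      # fraction of time a resident is at home (assumed)
VULNERABILITY = 0.5  # P(death | strike on occupied dwelling) (assumed)

risk = sum(rate * p_hit for rate, p_hit in pga_bands) * OCCUPANCY * VULNERABILITY
print(f"annual individual fatality risk: {risk:.1e}")
```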


Earthquake Spectra | 2014

Seismic Hazard Modeling for the Recovery of Christchurch

Matthew C. Gerstenberger; Graeme H. McVerry; David A. Rhoades; Mark W. Stirling

New time-dependent seismicity models for the Christchurch region reflect the greatly enhanced seismicity in the region at present, and the gradual decrease of the seismicity over the next few decades. These seismicity models, along with modified ground-motion prediction equations and revised hazard calculation procedures, have been used to derive new seismic hazard estimates for timeframes from months to 50 years. The hazard estimates have been used for a variety of applications crucial to planning and implementing the recovery of Christchurch. The new model includes higher amplitude spectra for designing new structures and assessing existing ones, magnitude-weighted peak ground acceleration hazard curves that account for duration effects for liquefaction assessment and remediation, and peak ground acceleration curves for evaluating the probabilities of rock falls. Particularly challenging has been the incorporation of time-varying hazard components into the redesign levels.
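
A toy version of the time-dependent seismicity component, assuming an Omori-style decay of the elevated regional rate toward a long-term background; the parameters are invented and only illustrate the decades-long decrease the models capture.

```python
# Toy time-dependent rate: elevated post-earthquake seismicity decaying
# Omori-style toward a long-term background over decades. Parameters invented.
def annual_rate(t_years, background=0.05, k=5.0, c=0.1, p=1.1):
    """Expected M >= 5 events per year, t years after the mainshocks (toy)."""
    return background + k / (t_years + c) ** p

for t in (0.5, 1, 5, 10, 30, 50):
    print(f"t = {t:>4} yr: {annual_rate(t):6.2f} events/yr")
```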


Physics of the Earth and Planetary Interiors | 1998

Long-Term Seismogenic Process for Major Earthquakes in Subduction Zones

F. F. Evison; David A. Rhoades

A qualitative physical process for the long-term seismogenesis of major earthquakes in subduction zones is proposed on the basis of quantitative empirical evidence that swarms, mainshocks and aftershocks are closely related phenomena. The relations, which have been identified in the comprehensive, long-term catalogues of New Zealand and Japan, represent swarms as predictors of mainshocks with respect to location, time and magnitude. Clustering of swarms and of mainshock/aftershock events is allowed for. With a database of 15 sequences of swarms, mainshocks and aftershocks, tests are being conducted with the object of refining the relations and evaluating them as a possible means of practical synoptic forecasting. Three sequences have culminated in major earthquakes since the tests began, and the systematic study now relates a total of 36 swarms with 29 mainshock/aftershock events. These empirical results strengthen and quantify the connection between swarms and major earthquakes, which several authors have demonstrated by means of numerical/physical modelling. The proposed seismogenic process includes swarms, mainshocks and aftershocks as separate event stages which are related by predictability. Interevent conditions are specified according to the Mogi criteria for the medium; cracks at which fractures subsequently occur constitute nonuniformity in the Mogi sense, and post-earthquake healing restores uniformity. Where the Gutenberg–Richter relation occurs, it is accepted as possible evidence of deterministic chaos and unpredictability; as a corollary, the process is noncyclical. The principle of scaling is held to apply except when modified by large-scale boundaries in the medium. Subduction zones and some other localities where water is abundant are indicated by the main empirical studies as favourable to the occurrence of swarms. Fluid overpressuring is therefore proposed as a mechanism for the self-triggering of swarms, and this is supported by additional examples of the predictive relations occurring in conditions of high fluid pressure, including the vicinity of large man-made reservoirs. The process can be tested by systematic studies in other subduction regions, given adequate catalogues for quantifying the algorithm. It also has implications for other tectonic environments, with swarms replaced by cognate, more protracted seismicity precursors.

Collaboration


Dive into David A. Rhoades's collaborations.

Top Co-Authors

F. F. Evison
Victoria University of Wellington

M. Werner
University of Bristol

Thomas H. Jordan
University of Southern California

John X. Zhao
Southwest Jiaotong University