Donald L. Turcotte
University of California
Publications
Featured research published by Donald L. Turcotte.
Archive | 2007
Donald L. Turcotte; Sergey G. Abaimov; Robert Shcherbakov; John B. Rundle
In this paper we consider the nonlinear dynamics of several natural hazards and related models. We focus our attention on earthquakes, landslides, and forest fires. These are clearly complex phenomena, but they exhibit self-organization. A consequence of this self-organization is the emergence of scaling laws. We consider frequency-magnitude statistics and recurrence-time statistics. The frequency-magnitude distributions are power-law, and we give a cascade model of cluster coalescence to explain this behavior. The recurrence-time distributions are well approximated by the Weibull distribution. An important characteristic of the Weibull distribution is that it is the only distribution with a power-law hazard function.
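The hazard-function property cited in this abstract is easy to check numerically. The sketch below (parameter values are illustrative, not taken from the paper) computes the Weibull hazard rate h(t) = pdf(t)/survival(t) and shows that it reduces to a power law, (β/τ)(t/τ)^(β−1):

```python
import math

def weibull_hazard(t, beta, tau):
    """Hazard rate h(t) = pdf(t) / survival(t) for a Weibull distribution."""
    pdf = (beta / tau) * (t / tau) ** (beta - 1) * math.exp(-((t / tau) ** beta))
    sf = math.exp(-((t / tau) ** beta))
    return pdf / sf  # simplifies to (beta/tau) * (t/tau)**(beta-1), a power law

# For beta = 2 the hazard is linear in t, so the values double with t.
for t in (1.0, 2.0, 4.0):
    print(round(weibull_hazard(t, beta=2.0, tau=1.0), 3))  # prints 2.0, 4.0, 8.0
```

The exponential factors in the pdf and the survival function cancel, which is exactly why the Weibull hazard is a pure power law in t.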
Pure and Applied Geophysics | 2016
James R. Holliday; William R. Graves; John B. Rundle; Donald L. Turcotte
Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is the elastic interaction between earthquakes. We then construct an application of the method and show examples of computed earthquake probabilities.
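The counting procedure described in this abstract can be sketched as follows; the Weibull scale and shape parameters here are placeholders, not fitted values from the study:

```python
import math

def large_event_probability(n_small, scale, shape):
    """Convert the count of small events since the last large event into a
    probability using a Weibull law: P = 1 - exp(-(n/scale)**shape).
    `scale` and `shape` are illustrative fit parameters, not values
    from the paper."""
    return 1.0 - math.exp(-((n_small / scale) ** shape))

# The probability grows monotonically with the small-event count.
probs = [large_event_probability(n, scale=100.0, shape=1.4)
         for n in (10, 50, 100, 200)]
assert all(a < b for a, b in zip(probs, probs[1:]))
```

In practice the two Weibull parameters would be fitted to the observed interevent counts for the region of interest before the formula is used prospectively.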
Pure and Applied Geophysics | 2013
Mark R. Yoder; Jordan Van Aalsburg; Donald L. Turcotte; Sergey G. Abaimov; John B. Rundle
Aftershock statistics provide a wealth of data that can be used to better understand earthquake physics. Aftershocks satisfy scale-invariant Gutenberg–Richter (GR) frequency–magnitude statistics. They also satisfy Omori’s law for power-law seismicity rate decay and Båth’s law for maximum-magnitude scaling. The branching aftershock sequence (BASS) model, which is the scale-invariant limit of the epidemic-type aftershock sequence (ETAS) model, uses these scaling laws to generate synthetic aftershock sequences. One objective of this paper is to show that the branching process in these models satisfies Tokunaga branching statistics. Tokunaga branching statistics were originally developed for drainage networks and have subsequently been shown to be valid in many other applications associated with complex phenomena. Specifically, they are characteristic of a universality class in statistical physics associated with diffusion-limited aggregation. We first present a deterministic version of the BASS model and show that it satisfies Tokunaga side-branching statistics. We then show that a fully stochastic BASS simulation gives similar results. We also study foreshock statistics using our BASS simulations. We show that the frequency–magnitude statistics in BASS simulations scale as the exponential of the magnitude difference between the mainshock and the foreshock, inverse GR scaling. We also show that the rate of foreshock occurrence in BASS simulations decays inversely with the time difference between foreshock and mainshock, an inverse Omori scaling. Both inverse scaling laws have been previously introduced empirically to explain observed foreshock statistics. Observations have demonstrated both of these scaling relations to be valid, consistent with our simulations. ETAS simulations, in general, do not generate Båth’s law and do not generate inverse GR scaling.
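A single generation of the branching construction can be sketched by combining the three scaling laws named in this abstract: GR-distributed magnitudes, Omori-law occurrence times, and a Båth-law offset fixing the expected aftershock count. All parameter values below are illustrative, and a full BASS simulation would apply this step recursively to every generated aftershock:

```python
import math
import random

def synthetic_aftershocks(m_main, b=1.0, d_bath=1.2, c=0.1, p=1.2,
                          m_min=2.0, seed=0):
    """One-generation aftershock sketch: magnitudes follow Gutenberg-Richter
    statistics (exponential in magnitude with b-value b), the expected largest
    aftershock sits d_bath below the mainshock (Bath's law), and occurrence
    times follow an Omori power-law decay with parameters (c, p).
    Parameter values are illustrative, not calibrated."""
    rng = random.Random(seed)
    # Bath's law fixes the expected number of aftershocks above m_min.
    n = int(round(10 ** (b * (m_main - d_bath - m_min))))
    events = []
    for _ in range(n):
        u = 1.0 - rng.random()  # uniform in (0, 1]
        mag = m_min - math.log10(u) / b                     # GR magnitude
        v = 1.0 - rng.random()
        t = c * (v ** (1.0 / (1.0 - p)) - 1.0)              # Omori time, p > 1
        events.append((t, mag))
    return sorted(events)

seq = synthetic_aftershocks(m_main=6.0)
```

Both samplers use inverse-transform sampling: the magnitude inverts the GR survival function 10^(−b(m−m_min)), and the time inverts the Omori survival function ((c+t)/c)^(1−p).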
Pure and Applied Geophysics | 2015
J. Quinn Norris; Donald L. Turcotte; John B. Rundle
Hydraulic fracturing (fracking), using high pressures and a low-viscosity fluid, allows the extraction of large quantities of oil and gas from very low permeability shale formations. The initial production of oil and gas at depth leads to high pressures and an extensive distribution of natural fractures, which reduce the pressures. With time these fractures heal, sealing the remaining oil and gas in place. High-volume fracking opens the healed fractures, allowing the oil and gas to flow to horizontal production wells. We model the injection process using invasion percolation. We use a 2D square lattice of bonds to model the sealed natural fractures. The bonds are assigned random strengths, and the fluid, injected at a point, opens the weakest bond adjacent to the growing cluster of opened bonds. Our model exhibits burst dynamics in which the clusters extend rapidly into regions with weak bonds. We associate these bursts with the microseismic activity generated by fracking injections. A principal objective of this paper is to study the role of anisotropic stress distributions. Bonds in the y-direction are assigned higher random strengths than bonds in the x-direction. We illustrate the spatial distribution of clusters and the spatial distribution of bursts (small earthquakes) for several degrees of anisotropy. The results are compared with observed distributions of microseismicity in a fracking injection. Both our bursts and the observed microseismicity satisfy Gutenberg–Richter frequency-size statistics.
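The injection algorithm described here (always open the weakest element adjacent to the growing cluster) can be sketched with a site-based simplification of the paper's bond lattice; the grid size, anisotropy factor, and step count below are illustrative:

```python
import heapq
import random

def invasion_percolation(size=40, anisotropy=1.5, steps=400, seed=1):
    """Invasion-percolation sketch on a square site grid (a simplification
    of the paper's bond lattice): each step opens the weakest unopened
    neighbor of the growing cluster. Moves in the y-direction draw strengths
    scaled by `anisotropy` > 1, so growth is preferentially horizontal."""
    rng = random.Random(seed)
    start = (size // 2, size // 2)
    cluster = {start}
    frontier = []  # min-heap of (strength, site)

    def push_neighbors(x, y):
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < size and 0 <= ny < size and (nx, ny) not in cluster:
                scale = anisotropy if dx == 0 else 1.0  # stronger "bonds" in y
                heapq.heappush(frontier, (rng.random() * scale, (nx, ny)))

    push_neighbors(*start)
    for _ in range(steps):
        while frontier:
            _, site = heapq.heappop(frontier)
            if site not in cluster:       # skip stale duplicate heap entries
                cluster.add(site)
                push_neighbors(*site)
                break
    return cluster

cluster = invasion_percolation()
```

The min-heap makes "find the weakest adjacent element" cheap, and scaling the y-direction strengths reproduces the anisotropy the paper studies: the opened cluster tends to elongate along x.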
Pure and Applied Geophysics | 2017
John Max Wilson; Mark R. Yoder; John B. Rundle; Donald L. Turcotte; Kasey W. Schultz
In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
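A minimal version of the power-law smoothing step might look like the following; the kernel exponent and distance offset are illustrative choices, not the calibrated ETAS parameters used in the paper:

```python
import math

def powerlaw_rate_map(events, grid, d=1.0, q=1.5):
    """Smooth simulated-epicenter rates over a test grid with an ETAS-style
    power-law kernel ~ (r + d)**(-q). `events` and `grid` are lists of (x, y)
    coordinates in arbitrary units; d and q are illustrative parameters."""
    rates = []
    for gx, gy in grid:
        total = 0.0
        for ex, ey in events:
            r = math.hypot(gx - ex, gy - ey)   # epicentral distance
            total += (r + d) ** (-q)           # decaying contribution
        rates.append(total)
    norm = sum(rates)
    return [v / norm for v in rates]           # map sums to 1

grid = [(x, y) for x in range(5) for y in range(5)]
rates = powerlaw_rate_map([(2.0, 2.0)], grid)
# The cell containing the simulated epicenter receives the highest rate.
assert max(rates) == rates[grid.index((2, 2))]
```

Normalizing the map turns it into a spatial probability distribution, which is the form needed to score it against observed epicenters with a receiver operating characteristic curve.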
Pure and Applied Geophysics | 2018
Molly Luginbuhl; John B. Rundle; Angela Hawkins; Donald L. Turcotte
Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions.
Pure and Applied Geophysics | 2018
Molly Luginbuhl; John B. Rundle; Donald L. Turcotte
Archive | 2015
John B. Rundle; Donald L. Turcotte
Acta Geophysica | 2012
Ya-Ting Lee; Donald L. Turcotte; John B. Rundle; Chien-Chih Chen
Pure and Applied Geophysics | 2008
S.G. Abaimov; Donald L. Turcotte; Robert Shcherbakov; John B. Rundle; Gleb Yakovlev; C. Goltz; W.I. Newman