Scott MacKay
WesternGeco
Publication
Featured research published by Scott MacKay.
Geophysics | 2003
Scott MacKay; Jonathan G. Fried; Charles Carvill
During marine seismic acquisition, obtaining complete subsurface coverage may require combining data from different acquisition dates. The time gaps between the overlapping coverage may vary from hours separating subsequent boat passes, to months when large surveys are acquired in sections. Time-lapse data are an extreme example of overlapping data sets acquired at widely varying acquisition dates. Unfortunately, between the different times of acquisition, changes in physical ocean properties, such as temperature or salinity, can cause variations in water velocity. The result is a dynamic change in recorded traveltimes that makes accurate combination of the data difficult. In shallow water, the distortions are small and do not affect data quality. However, in deepwater, the cumulative distortions can pose a serious impediment to accurate imaging.

The existence of water-velocity variations has been documented previously (Barley, 1999). Water temperature changes are the primary cause of velocity variations. Figure 1 shows an area just south of Nova Scotia (coastline in red). The outlined seismic survey area is approximately 3600 km². The satellite images show surface temperature variations, with each color contour representing 1°C. In the approximately two-week period shown, surface temperatures varied by as much as 10°C.

Figure 1. Surface temperature variations offshore Nova Scotia; each color contour is 1°C. The survey area is outlined. In the approximately two-week period shown, up to 10°C of temperature variation can be seen.

The temperature structures evident in Figure 1 are caused by eddies in the Gulf Stream and are indicative of deepwater temperature variations. Significantly, each degree of change causes over 3 m/s of water-velocity variation. The effect of such changes on seismic data collected in deepwater can be significant. Figure 2 shows two midpoint gathers after moveout correction. The gathers are from the Nova Scotia survey outlined in Figure 1. The shallowest event is the …
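As a rough check on the magnitudes quoted above, the two-way traveltime through the water column is simply 2d/v, so the distortion caused by a temperature-driven velocity change follows directly. The sketch below (Python; the 1500 m/s nominal water velocity and the example water depths are assumed illustrative values, not figures from the paper, while the ~3 m/s per °C sensitivity and the 10°C variation come from the abstract) shows why the shift is only a few milliseconds in shallow water but reaches tens of milliseconds in deepwater.

```python
# Back-of-the-envelope sketch of the traveltime distortion described above.
# Assumptions (not from the paper): 1500 m/s nominal water velocity and the
# example water depths. The ~3 m/s per degC sensitivity and the 10 degC
# temperature change are the values quoted in the abstract.

def water_column_delay_ms(depth_m, v_nominal=1500.0, dv_per_degC=3.0, dT_degC=10.0):
    """Two-way vertical traveltime shift (ms) through the water column
    caused by a temperature-driven water-velocity change."""
    v_changed = v_nominal + dv_per_degC * dT_degC
    t_nominal = 2.0 * depth_m / v_nominal      # seconds
    t_changed = 2.0 * depth_m / v_changed      # seconds
    return (t_nominal - t_changed) * 1000.0    # milliseconds

if __name__ == "__main__":
    for depth in (200.0, 1000.0, 2000.0):      # shallow vs. deepwater cases
        print(f"{depth:6.0f} m water: ~{water_column_delay_ms(depth):5.1f} ms shift")
```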
Seg Technical Program Expanded Abstracts | 2006
Scott MacKay; Héctor Ramírez Jiménez; Jorge Martín Romero; Mark Morford
Summary Mis-ties between well depths and prestack depth migrated seismic images are almost universal. Anisotropy is the most commonly invoked explanation of depthing discrepancies. However, underlying problems with the quality of the seismic interpretation and the well control can also cause significant mis-tie problems. In Rojano et al. (2005) we described an iterative approach to velocity-model building oriented towards the integration of seismic and well data in mature fields. In this paper, we describe an extension to this approach that allows for the
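The mis-tie itself is a simple quantity to compute and monitor. Below is a minimal sketch (Python/NumPy; the well depths and the single vertical stretch factor are hypothetical illustrations, not the authors' workflow) that tabulates well-top versus depth-migrated depth differences and fits one vertical scale factor of the kind often attributed to anisotropy; a large residual after that fit hints that interpretation or well-control problems, rather than anisotropy alone, are driving the mis-tie.

```python
# Illustrative sketch (not from the paper): quantify well-to-seismic depth
# mis-ties and fit a single vertical stretch factor of the kind often
# attributed to anisotropy. All depths below are made-up example values.
import numpy as np

well_top_depth = np.array([2510.0, 2744.0, 2931.0, 3105.0])  # depths of the top in wells (m)
seismic_depth  = np.array([2450.0, 2690.0, 2860.0, 3020.0])  # same top picked on the PSDM image (m)

mis_tie = well_top_depth - seismic_depth                      # positive = image too shallow
print("mis-ties (m):", mis_tie)

# Least-squares scale factor mapping seismic depth to well depth
# (a crude stand-in for a vertical anisotropy correction).
scale = np.sum(well_top_depth * seismic_depth) / np.sum(seismic_depth**2)
residual = well_top_depth - scale * seismic_depth
print(f"stretch factor: {scale:.4f}, residual std: {residual.std():.1f} m")
```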
Geophysics | 1994
Scott MacKay; Bill Dragoset
The salt‐injection features that underlie the Sigsbee Escarpment of the Gulf of Mexico have received much attention due to the proven hydrocarbon potential and the classic imaging pitfalls encountered beneath salt. In addition to creating raybending problems, the salt is also a source of significant multiple energy. The multiples generated off the top and base of salt add to the existing problems of water‐bottom multiples.
Seg Technical Program Expanded Abstracts | 2011
Scott MacKay
During the relatively velocity-insensitive process of time imaging, the interpreter often relinquishes processing oversight to the processor or third-party “bird dogs” with little interpretive background. However, as depth imaging grows in importance, it has become clear that it cannot be considered a “product”. Depth imaging is intimately linked with the interpretative process. Therefore, the interpreter must be invested in the QC process and be prepared to guide it. This paper discusses a methodology for depth-imaging QC that establishes an appropriate dialogue between the interpreter and the processor. Effective communication is critical if a risk-mitigating depth volume is to be formed.
Seg Technical Program Expanded Abstracts | 2002
Albena Mateeva; Douglas I. Hart; Scott MacKay
For the purposes of predictive deconvolution one assumes that the seismic trace results from the convolution of a reflectivity series with a wavelet. A fundamental component of the seismic wavelet model is the intrinsic absorption of the earth that causes loss of high frequencies to anelastic processes during propagation. Another potentially important component of the wavelet model is the apparent attenuation caused by short-period multiples. Here we examine whether these two processes act similarly enough to be combined into a single “effective attenuation” operator for signal processing purposes. We conclude that they can be combined except in cyclic depositional environments that contain many high reflection coefficients, when the apparent attenuation operator can no longer be accurately modeled as minimum phase.
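As a small numerical illustration of the two components being compared, the sketch below (Python/NumPy; the Q value, traveltime, reflection coefficient, and multiple delay are assumed values, not taken from the paper) contrasts the amplitude spectrum of a constant-Q intrinsic-absorption operator with that of a single short-period peg-leg multiple, then multiplies them into one combined "effective attenuation" spectrum. A real cyclic sequence would involve many such multiple terms, which is the case the authors flag as problematic.

```python
# Illustrative only: assumed Q, traveltime, reflection coefficient, and delay.
import numpy as np

dt, n = 0.002, 512
freqs = np.fft.rfftfreq(n, dt)            # frequency axis in Hz

# Intrinsic absorption: constant-Q amplitude decay after traveltime t_travel.
Q, t_travel = 80.0, 1.0
intrinsic = np.exp(-np.pi * freqs * t_travel / Q)

# Apparent attenuation: one short-period (peg-leg) multiple arriving tau
# seconds after the primary with effective reflection-coefficient product r.
r, tau = 0.2, 0.008
apparent = np.abs(1.0 + r * np.exp(-2j * np.pi * freqs * tau))
apparent /= apparent[0]                   # normalize to unity at 0 Hz

effective = intrinsic * apparent          # the combined "effective attenuation"

for f in (10, 30, 60):
    i = int(np.argmin(np.abs(freqs - f)))
    print(f"{f:3d} Hz: intrinsic {intrinsic[i]:.2f}, "
          f"apparent {apparent[i]:.2f}, effective {effective[i]:.2f}")
```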
Seg Technical Program Expanded Abstracts | 2005
Alfredo Marhx Rojano; Jaime Estrada Garcia; Scott MacKay; Lynne Goodoff; Douglas S. Hamilton; Poza Rica
Summary We introduce an iterative approach to velocity-model building that accommodates the problematic nature of integrating seismic and well data in mature fields. An initial velocity model may be created by combining seismic velocities and check shots. The model is then calibrated by combining interpreted seismic horizons with the equivalent well tops. The calibrated velocity model is then smoothed over twice the nominal well spacing. The smoothing is intended to introduce the constraint of geologic consistency to the model. Next begins an iterative approach to model refinement. The smoothed model is recalibrated with the well tops and depth errors between the two are used to flag problematic wells or seismic data. In mature fields, it is common to encounter incorrect well postings or inconsistencies in the interpreted tops. Seismic data may have acquisition problems, variations in the correlation, or local velocity changes that contribute to the error. During the iterations, depth errors must be reconciled. Well positions are confirmed, tops reviewed, and seismic interpretations reevaluated. As the sources of error are reduced, and eventually become small and random, the smoothing radius is reduced to the nominal well separation.
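A skeletal version of that calibrate, smooth, and recalibrate loop is easy to prototype. The sketch below (Python with NumPy/SciPy; the grid, the ten hypothetical wells, the single global scale factor, and the Gaussian smoother are all assumptions standing in for the authors' horizon-by-horizon calibration) starts with a smoothing radius of twice the nominal well spacing and tightens it toward the nominal spacing as the iterations proceed.

```python
# Illustrative prototype only; not the authors' implementation.
import numpy as np
from scipy.ndimage import gaussian_filter

def calibrate(v_avg, wx, wy, well_tops, twt):
    """Scale the average-velocity map so depth = v * t/2 honors the well tops."""
    v_needed = 2.0 * well_tops / twt[wx, wy]
    return v_avg * np.mean(v_needed / v_avg[wx, wy])   # single global factor (simplification)

def refine(v_avg, wx, wy, well_tops, twt, spacing_cells, n_iter=4):
    """Iterate: calibrate, smooth, check depth errors, tighten the smoothing radius."""
    radius = 2.0 * spacing_cells                       # start at ~2x nominal well spacing
    for _ in range(n_iter):
        v_avg = calibrate(v_avg, wx, wy, well_tops, twt)
        v_avg = gaussian_filter(v_avg, sigma=radius)   # impose geologic consistency
        errors = v_avg[wx, wy] * twt[wx, wy] / 2.0 - well_tops
        # In practice, large errors here trigger a review of well postings,
        # tops, and seismic picks before the radius is reduced.
        print(f"radius {radius:5.1f} cells, rms depth error {np.sqrt(np.mean(errors**2)):7.1f} m")
        radius = max(spacing_cells, 0.75 * radius)     # shrink toward nominal spacing
    return v_avg

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    nx = ny = 50
    twt = 2.0 + 0.2 * rng.standard_normal((nx, ny))            # two-way time map (s)
    v_avg = 2500.0 + 50.0 * rng.standard_normal((nx, ny))      # initial average velocity (m/s)
    wx, wy = rng.integers(0, nx, 10), rng.integers(0, ny, 10)  # ten hypothetical wells
    well_tops = 1.02 * v_avg[wx, wy] * twt[wx, wy] / 2.0       # tops ~2% deeper than seismic
    refine(v_avg, wx, wy, well_tops, twt, spacing_cells=10.0)
```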
Geophysics | 2004
Scott MacKay; Terry Young
A broad and diverse Technical Program is scheduled for Denver, created from over 750 abstracts received for review. Technical Sessions will run from Monday to Thursday, and Convention Workshops will run Thursday through Friday.
Geophysics | 1994
Ronald W. Ward; Stephen M. Greenlee; Scott MacKay; Carlos Dengo
In this paper, we review the evolution of geologic thought and geophysical technology that preceded the research workshop “Imaging Sedimentary Structures Under Salt” at the 1993 SEG Annual International Meeting in Washington, DC. Two of us (RWW and SWN) organized the workshop. This special issue of TLE contains some of the presentations. The workshop had the largest attendance of any at the convention, indicating the high interest in subsalt exploration.
Seg Technical Program Expanded Abstracts | 1993
Bill Dragoset; Scott MacKay
Surface multiple attenuation (SMA) is a prestack, f-x domain inversion of a surface-recorded, 2-D wavefield that removes all orders of all surface multiples present within the wavefield. In addition, the process statistically determines the average acquisition wavelet. Neither of these abilities of SMA requires any assumptions regarding the positions, shapes, or reflection coefficients of the multiple-causing reflectors. Instead, SMA relies on the physical consistency between the primary and multiple events that exists in any properly recorded seismic data set. The wavefield inversion equation derives from a Kirchhoff-integral representation of the relationship between multiple events and primary events. SMA, applied to data recorded over a Gulf Coast salt-injection feature, successfully attenuated both water bottom and salt-interface multiples that contaminated reflections from the subsalt structures. With the multiples removed, imaging of the subsalt structures and subsequent interpretation became a much simpler process.
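The core idea, that the data themselves predict their own surface multiples via convolution, can be shown in one dimension. The sketch below (Python/NumPy) is a normal-incidence toy with a single water-bottom reflector of assumed strength; the paper's actual method is a prestack f-x domain inversion of a 2-D wavefield that also estimates the acquisition wavelet, which this toy does not attempt.

```python
# 1-D, normal-incidence illustration (an assumption, not the paper's 2-D f-x
# algorithm): surface multiples are predicted from the data themselves by
# convolution and removed iteratively, with no knowledge of the reflector.
import numpy as np

dt, nt = 0.004, 512
t0, r = 0.400, 0.3                      # assumed water-bottom time (s) and reflection coefficient
i0 = int(round(t0 / dt))

# Recorded trace: primary plus all orders of free-surface multiples.
data = np.zeros(nt)
for k in range(1, nt // i0 + 1):
    data[k * i0] += ((-1) ** (k - 1)) * r**k

# Iterative multiple removal: p_{k+1} = d + p_k * d (series form of P = D / (1 - D)).
primaries = data.copy()
for _ in range(10):
    primaries = data + np.convolve(primaries, data)[:nt]

print("max residual multiple amplitude:",
      np.abs(primaries[2 * i0:]).max())               # ~0 once multiples are gone
print("primary amplitude preserved:", primaries[i0])  # ~r
```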
Archive | 2001
Scott MacKay