Publication


Featured research published by Gerrit Blacquière.


Geophysics | 2011

Separation of blended data by iterative estimation and subtraction of blending interference noise

Araz Mahdad; Panagiotis Doulgeris; Gerrit Blacquière

Seismic acquisition is a trade-off between economy and quality. In conventional acquisition the time intervals between successive records are large enough to avoid interference in time. To obtain an efficient survey, the spatial source sampling is therefore often (too) large. However, in blending, or simultaneous acquisition, temporal overlap between shot records is allowed. This additional degree of freedom in survey design significantly improves the quality or the economics or both. Deblending is the procedure of recovering the data as if they were acquired in the conventional, unblended way. A simple least-squares procedure, however, does not remove the interference due to other sources, or blending noise. Fortunately, the character of this noise is different in different domains, e.g., it is coherent in the common source domain, but incoherent in the common receiver domain. This property is used to obtain a considerable improvement. We propose to estimate the blending noise and subtract it from the blended data. The estimate does not need to be perfect because our procedure is iterative. Starting with the least-squares deblended data, the estimate of the blending noise is obtained via the following steps: sort the data to a domain where the blending noise is incoherent; apply a noise suppression filter; apply a threshold to remove the remaining noise, ending up with (part of) the signal; compute an estimate of the blending noise from this signal. At each iteration, the threshold can be lowered and more of the signal is recovered. Promising results were obtained with a simple implementation of this method for both impulsive and vibratory sources. Undoubtedly, in the future algorithms will be developed for the direct processing of blended data. However, currently a high-quality deblending procedure is an important step allowing the application of contemporary processing flows.
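
The estimate-and-subtract loop described above can be sketched in a few lines. The Python sketch below is a simplified illustration, not the authors' implementation: the blending operator blend_op and its adjoint (pseudo-deblending) adjoint_op are assumed to be supplied by the user, and a simple amplitude threshold stands in for the coherency-based noise-suppression filtering applied in the common-receiver domain.

    import numpy as np

    def iterative_deblend(blended, blend_op, adjoint_op, n_iter=20,
                          thresh_start=0.9, thresh_end=0.1):
        # blended    : blended (continuous) recording
        # blend_op   : callable mapping unblended shot records to blended data
        # adjoint_op : pseudo-deblending, the adjoint of blend_op
        signal = np.zeros_like(adjoint_op(blended))
        for t in np.linspace(thresh_start, thresh_end, n_iter):
            # blend the current signal estimate and subtract it from the data
            residual = blended - blend_op(signal)
            # pseudo-deblend the residual: here the remaining blending
            # interference is incoherent while the remaining signal is coherent
            candidate = signal + adjoint_op(residual)
            # keep only the strongest part of the estimate; the threshold is
            # lowered every iteration so that more of the signal is recovered
            mask = np.abs(candidate) > t * np.abs(candidate).max()
            signal = np.where(mask, candidate, 0.0)
        return signal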


Geophysics | 2001

Comprehensive assessment of seismic acquisition geometries by focal beams—Part I: Theoretical considerations

A. J. Berkhout; L. Ongkiehong; A. W. F. Volker; Gerrit Blacquière

The starting point for a relatively simple approach to data acquisition design can be found in the common focus point (CFP) philosophy, which describes seismic migration as a double focusing process. The migration output is presented as the combined result of focused source beams and focused detector beams for a given velocity model, revealing the potential amplitude accuracy and spatial resolution of a specific field geometry. In addition, any noise model can be fed into the input, and the subsequent beam-forming operations can be applied to predict the potential noise suppression rate. The economic optimization comes from the possibility of balancing the source and detector efforts as well as the acquisition and processing efforts.
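
The double-focusing idea can be made concrete with a small, per-frequency matrix sketch. The Python fragment below is only a conceptual illustration under assumed conventions (monochromatic data stored as a detectors-by-sources matrix D, with externally supplied focusing operators f_src and f_det for one grid point), not the authors' implementation; it shows that source-side and detector-side focusing can be applied and inspected separately, and that applying both yields the migration result for that grid point.

    import numpy as np

    def focal_beams(D, f_src, f_det):
        # D     : (n_det, n_src) monochromatic data matrix for one frequency
        # f_src : (n_src,) operator focusing the source geometry onto a grid point
        # f_det : (n_det,) operator focusing the detector geometry onto the same point
        focal_source_beam = D @ f_src            # focusing on the source side only
        focal_detector_beam = f_det.conj() @ D   # focusing on the detector side only
        image_value = f_det.conj() @ D @ f_src   # double focusing = migration output
        return focal_source_beam, focal_detector_beam, image_value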


SEG Technical Program Expanded Abstracts | 2008

From Simultaneous Shooting to Blended Acquisition

A.J. Guus Berkhout; Gerrit Blacquière; Eric Verschuur

Seismic acquisition surveys are designed such that the time interval between shots is sufficiently large to prevent the tail of the previous source response from interfering with the next one (zero overlap in time). To economize on survey time and processing effort, the current compromise is to keep the number of shots to some acceptable minimum. The result is that the source domain is poorly sampled. In this paper it is proposed to abandon the condition of non-overlapping shot records. Instead, a plea is made to move to densely sampled and wide-azimuth source distributions with relatively small time intervals between shots (‘blended acquisition’). The underlying rationale is that interpolating missing shot records, i.e., generating data that have not been recorded (aliasing problem), is much harder than separating the data of overlapping shot records (interference problem). In this paper we summarize the principle of blended acquisition and show how to process blended data. Two processing routes can be followed: reconstructing the unblended data (‘deblending’) followed by conventional processing, or directly processing the blended measurements. Both approaches will be described and illustrated with numerical examples. A theoretical framework is presented that enables the design of blended 3D seismic surveys.
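
The blending step itself amounts to summing shot records at their, possibly overlapping, firing times. The Python sketch below illustrates this forward blending process; the array shapes and variable names are assumptions made for the illustration and do not correspond to the authors' code.

    import numpy as np

    def blend_shots(shots, firing_times, dt, rec_length):
        # shots        : (n_shots, n_receivers, n_samples) unblended shot records
        # firing_times : firing time of each shot in seconds (overlap allowed)
        # dt           : sample interval in seconds
        # rec_length   : length of the continuous blended record in seconds
        n_shots, n_rec, n_samp = shots.shape
        n_total = int(round(rec_length / dt))
        blended = np.zeros((n_rec, n_total))
        for s in range(n_shots):
            i0 = int(round(firing_times[s] / dt))
            i1 = min(i0 + n_samp, n_total)
            if i1 <= i0:
                continue  # firing time falls outside the recording window
            # records fired closely in time simply add up in the continuous record
            blended[:, i0:i1] += shots[s, :, : i1 - i0]
        return blended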


Geophysics | 2008

Acquisition geometry analysis in complex 3D media

E. J. van Veldhuizen; Gerrit Blacquière; A. J. Berkhout

Increasingly, we must deal with complex subsurface structures in seismic exploration, often resulting in poor illumination and, therefore, poor image quality. Consequently, it is desirable to take into consideration the effects of wave propagation in the subsurface structure when designing an acquisition geometry. We developed a new, model-based implementation of the previously introduced focal-beam analysis method. The method's objective is to provide quantitative insight into the combined influence of acquisition geometry, overburden structure, and migration operators on image resolution and angle-dependent amplitude accuracy. This is achieved by simulation of migrated grid-point responses using focal beams. Note that the seismic response of any subsurface can be expressed as a linear sum of grid-point responses. The focal beams have been chosen because any migration process represents double focusing. In addition, the focal source beam and focal detector beam relate migration quality to illumination properties of the source geometry and sensing properties of the detector geometry, respectively. Wave-equation modeling ensures that frequency-dependent effects in the seismic-frequency range are incorporated. We tested our method by application to a 3D salt model in the Gulf of Mexico. Investigation of well-sampled, all-azimuth, long-offset acquisition geometries revealed fundamental illumination and sensing limitations. Further results exposed the shortcomings of narrow-azimuth data acquisition. The method also demonstrates how acquisition-related amplitude errors affect seismic inversion results.


Geophysics | 2001

Comprehensive assessment of seismic acquisition geometries by focal beams—Part II: Practical aspects and examples

A. W. F. Volker; Gerrit Blacquière; A. J. Berkhout; L. Ongkiehong

The acquisition geometry of a 3-D seismic survey should be designed in such a way that it allows high‐quality images and satisfies economic constraints. Focal source beams and focal detector beams are used to analyze and design field geometries. An attractive property is that the focal source and detector beams can be computed and evaluated separately. Seismic quality attributes such as resolution, noise suppression rate, angle‐averaged amplitude, and angle‐dependent amplitude information can be obtained efficiently from these beams. Examples confirm the theory that source and detector distributions may complement each other to achieve a high resolution. This important property can be used to design cost‐effective geometries. The acquisition design criteria for reliable amplitude versus ray parameter information turn out to be more stringent and may require a compromise in resolution.


Geophysics | 2009

The concept of double blending: Combining incoherent shooting with incoherent sensing

A. J. Berkhout; Gerrit Blacquière; D. J. Verschuur

Seismic surveys are designed so that the time interval between shots is sufficiently large to avoid temporal overlap between records. To economize on survey time, the current compromise is to keep the number of shots to an acceptable minimum. The result is a poorly sampled source domain. We propose to abandon the condition of nonoverlapping shot records to allow densely sampled, wide-azimuth source distributions (source blending). The rationale is that interpolation is much harder than separation. Source blending has significant implications for quality (source density) and economics (survey time). In addition to source blending, detector blending is introduced, by which every channel records a superposition of detected signals, each with its own particular code. With detector blending, many more detectors can be used for the same number of recording channels. This is particularly beneficial when the number of detectors is very large (mass sensing) or the number of channels is limited (wireless recording). The concept of double blending is defined as the case in which both source blending and detector blending are applied. Double blending allows a significant trace-compression factor during acquisition.
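
In a per-frequency matrix view, double blending can be written as a compression of the data matrix from both sides. The sketch below assumes monochromatic data stored as a detectors-by-sources matrix and blending codes supplied as matrices; it follows the general simultaneous-source matrix formulation rather than the paper's exact notation.

    import numpy as np

    def double_blend(D, gamma_det, gamma_src):
        # D         : (n_det, n_src) complex data matrix for one frequency
        # gamma_det : (n_det, n_channels) detector blending codes; each channel
        #             records a coded superposition of several detectors
        # gamma_src : (n_src, n_experiments) source blending codes, e.g. time
        #             delays expressed as linear phase factors
        # The trace count drops from n_det * n_src to n_channels * n_experiments.
        return gamma_det.T @ D @ gamma_src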


International Ultrasonics Symposium | 2004

A 20-40 MHz ultrasound transducer for intravascular harmonic imaging

Hendrik J. Vos; M.E. Frijlink; E. Droog; David E. Goertz; Gerrit Blacquière; Andries Gisolf; N. de Jong; A.F.W. van der Steen

Recent studies have suggested the feasibility of tissue harmonic imaging (THI) with intravascular ultrasound (IVUS). This paper describes the design, fabrication, and characterization of a piezoelectric transducer optimized for tissue harmonic IVUS. Ideally, such a transducer should efficiently transmit a short acoustic pulse at the fundamental transmission frequency and should be sensitive to its second harmonic echo, for which we have chosen 20 MHz and 40 MHz, respectively. The intravascular application limits the transducer dimensions to 0.75 mm by 1 mm. The transducer is based on a single piezoelectric layer design with additional passive layers for tuning and efficiency improvement, and the Krimholtz-Leedom-Matthaei (KLM) model was used to iteratively find the optimal material properties of the different layers. Based on the optimized design, a prototype of the transducer was built. The transducer was characterized by water-tank hydrophone measurements and pulse-echo measurements. These measurements showed the transducer to have two frequency bands around 20 MHz and 40 MHz with -6 dB fractional bandwidths of 30% and 25%, and round-trip insertion losses of -19 dB and -34 dB, respectively.


Geophysical Prospecting | 2016

Deghosting by echo-deblending

A. J. Berkhout; Gerrit Blacquière

A marine source generates both a direct wavefield and a ghost wavefield. This is caused by the strong surface reflectivity, resulting in a blended source array, the blending process being natural. The two unblended response wavefields correspond to the real source at the actual location below the water level and to the ghost source at the mirrored location above the water level. As a consequence, deghosting becomes deblending (‘echo-deblending’) and can be carried out with a deblending algorithm. In this paper we present source deghosting by an iterative deblending algorithm that properly includes the angle dependence of the ghost: it represents a closed-loop, non-causal solution. The proposed echo-deblending algorithm is also applied to the detector deghosting problem. The detector cable may be slanted, and shot records may be generated by blended source arrays, the blending being created by simultaneous sources. Similar to surface-related multiple elimination, the method is independent of the complexity of the subsurface; only what happens at and near the surface is relevant. This means that the actual sea state may cause the reflection coefficient to become frequency dependent, and the water velocity may not be constant due to temporal and lateral variations in the pressure, temperature, and salinity. As a consequence, we propose that estimation of the actual ghost model should be part of the echo-deblending algorithm. This is particularly true for source deghosting, where interaction of the source wavefield with the surface may be far from linear. The echo-deblending theory also shows how multi-level source acquisition and multi-level streamer acquisition can be numerically simulated from standard acquisition data. The simulated multi-level measurements increase the performance of the echo-deblending process. The output of the echo-deblending algorithm on the source side consists of two ghost-free records: one generated by the real source at the actual location below the water level and one generated by the ghost source at the mirrored location above the water level. If we apply our algorithm at the detector side as well, we end up with four ghost-free shot records. All these records are input to migration. Finally, we demonstrate that the proposed echo-deblending algorithm is robust in the presence of background noise.
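
For reference, the sketch below gives the standard angle-dependent free-surface ghost model in the frequency-wavenumber domain, i.e., the direct arrival plus a delayed, scaled mirror copy. It is the simple textbook parameterization (constant water velocity, scalar reflection coefficient) that, as argued above, should in practice be replaced by a ghost model estimated within the echo-deblending algorithm itself.

    import numpy as np

    def ghost_operator(freqs, kx, depth, c=1500.0, r=-1.0):
        # freqs : temporal frequencies [Hz]
        # kx    : horizontal wavenumbers [cycles/m]
        # depth : source (or detector) depth below the free surface [m]
        # c     : water velocity [m/s]
        # r     : free-surface reflection coefficient (about -1 for a calm sea)
        f, k = np.meshgrid(freqs, kx, indexing="ij")
        kz2 = (2.0 * np.pi * f / c) ** 2 - (2.0 * np.pi * k) ** 2
        kz = np.sqrt(np.maximum(kz2, 0.0))  # keep the propagating part only
        # direct wave plus the surface echo: a delayed, scaled mirror copy
        return 1.0 + r * np.exp(-2j * kz * depth)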


Geophysical Prospecting | 2016

3-D surface-wave estimation and separation using a closed-loop approach

Tomohide Ishiyama; Gerrit Blacquière; Eric Verschuur; Wim A. Mulder

Surface waves in seismic data are often dominant in a land or shallow-water environment. Separating them from primaries is of great importance either for removing them as noise for reservoir imaging and characterization or for extracting them as signal for near-surface characterization. However, their complex properties make the surface-wave separation significantly challenging in seismic processing. To address the challenges, we propose a method of three-dimensional surface-wave estimation and separation using an iterative closed-loop approach. The closed loop contains a relatively simple forward model of surface waves and adaptive subtraction of the forward-modelled surface waves from the observed surface waves, making it possible to evaluate the residual between them. In this approach, the surface-wave model is parameterized by the frequency-dependent slowness and source properties for each surface-wave mode. The optimal parameters are estimated in such a way that the residual is minimized and, consequently, this approach solves the inverse problem. Through real data examples, we demonstrate that the proposed method successfully estimates the surface waves and separates them out from the seismic data. In addition, it is demonstrated that our method can also be applied to undersampled, irregularly sampled, and blended seismic data.
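
A much-simplified, single-mode version of the closed-loop idea can be sketched as follows: for each frequency, trial slownesses are scanned, the best-fitting amplitude (the 'source property') is obtained by least squares, and the slowness that minimizes the residual is kept. The function name, array layout, and single-line 2-D geometry below are assumptions made for the illustration; the actual method handles multiple modes and 3-D, irregularly sampled, or blended data.

    import numpy as np

    def estimate_slowness(data_fx, offsets, freqs, p_trial):
        # data_fx : (n_freq, n_offsets) complex spectra of one surface-wave mode
        # offsets : source-receiver offsets [m]
        # freqs   : frequencies [Hz]
        # p_trial : candidate slownesses [s/m]
        offsets = np.asarray(offsets, dtype=float)
        p_best = np.zeros(len(freqs))
        model = np.zeros_like(data_fx)
        for i, f in enumerate(freqs):
            best_res = np.inf
            for p in p_trial:
                phase = np.exp(-2j * np.pi * f * p * offsets)   # propagation term
                # least-squares amplitude ('source property') for this slowness
                a = np.vdot(phase, data_fx[i]) / np.vdot(phase, phase)
                res = np.linalg.norm(data_fx[i] - a * phase)    # residual to minimize
                if res < best_res:
                    best_res, p_best[i], model[i] = res, p, a * phase
        # the modelled mode in 'model' can now be adaptively subtracted from the data
        return p_best, model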


SEG Technical Program Expanded Abstracts | 2009

Survey design for blended acquisition

Gerrit Blacquière; A. J. Berkhout; D. J. Verschuur

When designing an acquisition geometry for the case of source blending, it is important that the blended source arrays are capable of transmitting a wavefield with a large spatial and temporal bandwidth: the wavefield must be incoherent. We call this incoherent shooting. Furthermore, the array must illuminate the target from a wide range of angles. In this paper two quantitative quality measures are introduced: a measure for incoherency, based on the autocorrelation of the blended source array at the surface, and a measure for illumination, based on the properties of the focal source beam in the subsurface. The two measures have their equivalents at the detector side: a measure for incoherency, based on the autocorrelation of the blended detector array at the surface, and a measure for sensing, based on the properties of the focal detector beam in the subsurface.
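
The exact quality measures are not reproduced here, but one plausible autocorrelation-based incoherency metric is sketched below, assuming the blended source array is represented as a space-time firing pattern: a delta-like autocorrelation (value near 1) indicates an incoherent array, while strong sidelobes lower the value.

    import numpy as np
    from scipy.signal import fftconvolve

    def incoherency_measure(firing_pattern):
        # firing_pattern : (n_source_positions, n_time_samples) array holding a
        #                  spike (or source signature) at each source's firing time
        # space-time autocorrelation of the blended source array
        acorr = np.abs(fftconvolve(firing_pattern, firing_pattern[::-1, ::-1],
                                   mode="full"))
        peak = acorr.max()               # zero-lag energy
        sidelobes = acorr.sum() - peak   # everything outside the central peak
        # 1.0 for a delta-like autocorrelation, smaller when sidelobes are strong
        return peak / (peak + sidelobes)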

Collaboration


Dive into Gerrit Blacquière's collaborations.

Top Co-Authors

A. J. Berkhout (Delft University of Technology)
D. J. Verschuur (Delft University of Technology)
Eric Verschuur (Delft University of Technology)
Tomohide Ishiyama (Delft University of Technology)
A. W. F. Volker (Delft University of Technology)
Amarjeet Kumar (Delft University of Technology)
Panagiotis Doulgeris (Delft University of Technology)
Guus Berkhout (Delft University of Technology)
Araz Mahdad (Delft University of Technology)
E. J. van Veldhuizen (Delft University of Technology)