Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Felix J. Herrmann is active.

Publication


Featured research published by Felix J. Herrmann.


Geophysics | 2008

Simply denoise: Wavefield reconstruction via jittered undersampling

Gilles Hennenfent; Felix J. Herrmann

We present a new, discrete undersampling scheme designed to favor wavefield reconstruction by sparsity-promoting inversion with transform elements localized in the Fourier domain. The work is motivated by empirical observations in the seismic community, corroborated by results from compressive sampling, that indicate favorable (wavefield) reconstructions from random rather than regular undersampling. Indeed, random undersampling renders coherent aliases into harmless incoherent random noise, effectively turning the interpolation problem into a much simpler denoising problem. A practical requirement of wavefield reconstruction with localized sparsifying transforms is the control on the maximum gap size. Unfortunately, random undersampling does not provide such a control. Thus, we introduce a sampling scheme, termed jittered undersampling, that shares the benefits of random sampling and controls the maximum gap size. The contribution of jittered sub-Nyquist sampling is key in formulating a versatile wavefi...
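The jittered scheme is straightforward to implement. Below is a minimal NumPy sketch (not the authors' code; the bin-based jitter is a simplification of the paper's parameterized jitter) that generates the three undersampling masks compared above:

```python
# Minimal sketch of regular, random, and jittered undersampling masks.
# `gamma` is the undersampling factor: keep roughly 1 of every gamma points.
import numpy as np

def undersampling_mask(n, gamma, scheme="jittered", seed=None):
    """Boolean mask selecting ~n/gamma of n regular grid points."""
    rng = np.random.default_rng(seed)
    mask = np.zeros(n, dtype=bool)
    if scheme == "regular":
        mask[::gamma] = True          # coherent aliasing in the Fourier domain
    elif scheme == "random":
        mask[rng.choice(n, size=n // gamma, replace=False)] = True  # gaps uncontrolled
    elif scheme == "jittered":
        # one sample per bin of width gamma: as incoherent as random
        # sampling, but the largest gap stays below 2*gamma samples
        for start in range(0, n, gamma):
            mask[rng.integers(start, min(start + gamma, n))] = True
    return mask

mask = undersampling_mask(128, gamma=4, scheme="jittered", seed=0)
print(mask.sum(), "of 128 samples kept")
```

Regular decimation produces coherent aliases, fully random selection leaves the maximum gap uncontrolled, and the jittered mask keeps random sampling's incoherence while bounding the largest gap, which is exactly the control a localized sparsifying transform needs.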


Computing in Science and Engineering | 2006

Seismic denoising with nonuniformly sampled curvelets

Gilles Hennenfent; Felix J. Herrmann

The authors present an extension of the fast discrete curvelet transform (FDCT) to nonuniformly sampled data. This extension not only restores curvelet compression rates for nonuniformly sampled data but also removes noise and maps the data to a regular grid.


Geophysics | 2008

Curvelet-based seismic data processing: A multiscale and nonlinear approach

Felix J. Herrmann; Deli Wang; Gilles Hennenfent; Peyman P. Moghaddam

Mitigating missing data, multiples, and erroneous migration amplitudes are key factors that determine image quality. Curvelets, little “plane waves,” complete with oscillations in one direction and smoothness in the other directions, sparsify seismic data, a property we leverage explicitly with sparsity promotion. With this principle, we recover seismic data with high fidelity from a small subset (20%) of randomly selected traces. Similarly, sparsity leads to a natural decorrelation and hence to a robust curvelet-domain primary-multiple separation for North Sea data. Finally, sparsity helps to recover migration amplitudes from noisy data. With these examples, we show that exploiting the curvelets' ability to sparsify wavefrontlike features is powerful, and our results are a clear indication of the broad applicability of this transform to exploration seismology.
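A toy version of the recover-from-20%-of-traces experiment fits in a few lines. The sketch below is illustrative only: an orthonormal DCT stands in for the curvelet transform, the 1-D signal is sparse in that domain by construction, and recovery uses iterative soft thresholding (ISTA), one standard sparsity-promoting solver:

```python
# Sparsity-promoting recovery of a signal from ~20% of its samples.
# The DCT is a stand-in for the curvelet transform of the paper.
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
n = 256
x_true = np.zeros(n)
x_true[[12, 31, 70]] = [1.0, -0.7, 0.4]      # sparse transform coefficients
signal = idct(x_true, norm="ortho")

mask = rng.random(n) < 0.2                   # keep a random ~20% of the samples
b = mask * signal                            # zero-filled observations

# ISTA on min_x 0.5*||R S'x - b||^2 + lam*||x||_1  (R = mask, S = DCT)
x = np.zeros(n)
lam = 0.01
for _ in range(300):
    x = x + dct(mask * (b - mask * idct(x, norm="ortho")), norm="ortho")
    x = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)   # soft threshold

err = np.linalg.norm(idct(x, norm="ortho") - signal) / np.linalg.norm(signal)
print(f"relative recovery error: {err:.3f}")
```

A unit step size is safe here because a restriction mask composed with an orthonormal transform has spectral norm at most one.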


Geophysics | 2010

Randomized sampling and sparsity: Getting more information from fewer samples

Felix J. Herrmann

Many seismic exploration techniques rely on the collection of massive data volumes that are subsequently mined for information during processing. Although this approach has been extremely successful in the past, current efforts toward higher-resolution images in increasingly complicated regions of the earth continue to reveal fundamental shortcomings in our workflows. Chief among these is the so-called “curse of dimensionality” exemplified by Nyquist’s sampling criterion, which disproportionately strains current acquisition and processing systems as the size and desired resolution of our survey areas continue to increase. We offer an alternative sampling method leveraging recent insights from compressive sensing toward seismic acquisition and processing for data that are traditionally considered to be undersampled. The main outcome of this approach is a new technology where acquisition and processing related costs are no longer determined by overly stringent sampling criteria, such as Nyquist. At the hea...


Geophysics | 2009

Compressive simultaneous full-waveform simulation

Felix J. Herrmann; Yogi A. Erlangga; Tim T.Y. Lin

The fact that the computational complexity of wavefield simulation is proportional to the size of the discretized model and acquisition geometry and not to the complexity of the simulated wavefield is a major impediment within seismic imaging. By turning simulation into a compressive sensing problem, where simulated data are recovered from a relatively small number of independent simultaneous sources, we remove this impediment by showing that compressively sampling a simulation is equivalent to compressively sampling the sources, followed by solving a reduced system. As in compressive sensing, this reduces sampling rate and hence simulation costs. We demonstrate this principle for the time-harmonic Helmholtz solver. The solution is computed by inverting the reduced system, followed by recovering the full wavefield with a program that promotes sparsity. Depending on the wavefield's sparsity, this approach can lead to significant cost reductions, particularly when combined with the implicit preconditioned Helmholtz solver, which is known to converge even for decreasing mesh sizes and increasing angular frequencies. These properties make our scheme a viable alternative to explicit time-domain finite differences.
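The central identity, that compressively sampling a simulation is equivalent to compressively sampling the sources and solving a reduced system, is easy to check numerically. In this sketch a small well-conditioned random matrix stands in for the discretized Helmholtz operator, purely to keep the example self-contained:

```python
# Simultaneous-source simulation: solving for a few random source
# mixtures equals randomly mixing the individual-source solutions.
import numpy as np

rng = np.random.default_rng(0)
n, n_src, n_sim = 200, 64, 8                 # grid points, sources, sim. shots

H = 5.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))  # stand-in for Helmholtz
Q = rng.standard_normal((n, n_src))          # one column per sequential source
W = rng.standard_normal((n_src, n_sim)) / np.sqrt(n_sim)  # random source mixing

U_full = np.linalg.solve(H, Q)               # n_src expensive solves
U_sim = np.linalg.solve(H, Q @ W)            # only n_sim solves

print(np.allclose(U_sim, U_full @ W))        # True: H^-1 (Q W) = (H^-1 Q) W
```

Only `n_sim` solves are required instead of `n_src`; in the paper, the information lost in the mixing is then restored by sparsity-promoting recovery of the full wavefield.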


SIAM Journal on Optimization | 2012

An Effective Method for Parameter Estimation with PDE Constraints with Multiple Right-Hand Sides

Eldad Haber; Matthias Chung; Felix J. Herrmann

Often, parameter estimation problems of parameter-dependent PDEs involve multiple right-hand sides. The computational cost and memory requirements of such problems increase linearly with the number...
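Although the abstract is truncated here, a common device in this line of work for taming many right-hand sides is randomized aggregation: the full least-squares misfit over all sources equals, in expectation, the misfit over a few random combinations of them. Below is a minimal sketch of that unbiasedness, with a random well-conditioned matrix assumed as a stand-in for the PDE operator:

```python
# Unbiased estimation of a multi-right-hand-side misfit from a few
# random source combinations (Rademacher weights).
import numpy as np

rng = np.random.default_rng(1)
n, n_rhs = 100, 50
A = 4.0 * np.eye(n) + 0.05 * rng.standard_normal((n, n))  # stand-in PDE operator
Q = rng.standard_normal((n, n_rhs))                        # all right-hand sides
D = np.linalg.solve(A, Q) + 0.01 * rng.standard_normal((n, n_rhs))  # noisy data

def full_misfit():                       # n_rhs solves
    return np.linalg.norm(np.linalg.solve(A, Q) - D) ** 2

def sampled_misfit(k):                   # only k solves
    W = rng.choice([-1.0, 1.0], size=(n_rhs, k))
    return np.linalg.norm(np.linalg.solve(A, Q @ W) - D @ W) ** 2 / k

print(full_misfit())
print(np.mean([sampled_misfit(4) for _ in range(50)]))   # close on average
```

Because the expectation of W Wᵀ is k times the identity for independent ±1 weights, the sampled misfit is an unbiased estimator of the full one, so each optimization step needs only k solves rather than one per right-hand side.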


Geophysics | 2007

Compressed wavefield extrapolation

Tim T.Y. Lin; Felix J. Herrmann

An explicit algorithm for the extrapolation of one-way wavefields is proposed that combines recent developments in information theory and theoretical signal processing with the physics of wave propagation. Because of excessive memory requirements, explicit formulations for wave propagation have proven to be a challenge in 3D. By using ideas from compressed sensing, we are able to formulate the (inverse) wavefield extrapolation problem on small subsets of the data volume, thereby reducing the size of the operators. Compressed sensing entails a new paradigm for signal recovery that provides conditions under which signals can be recovered from incomplete samplings by nonlinear recovery methods that promote sparsity of the to-be-recovered signal. According to this theory, signals can be successfully recovered when the measurement basis is incoherent with the representation in which the wavefield is sparse. In this new approach, the eigenfunctions of the Helmholtz operator are recognized as a basis that is incoherent with curvelets that are known to compress seismic wavefields. By casting the wavefield extrapolation problem in this framework, wavefields can be successfully extrapolated in the modal domain, despite evanescent wave modes. The degree to which the wavefield can be recovered depends on the number of missing (evanescent) wavemodes and on the complexity of the wavefield. A proof of principle for the compressed sensing method is given for inverse wavefield extrapolation in 2D, together with a pathway to 3D in which the multiscale and multiangular properties of curvelets, in relation to the Helmholtz operator, are exploited. The results show that our method is stable, has reduced dip limitations, and handles evanescent waves in inverse extrapolation.
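The incoherence argument invoked here is the generic compressed-sensing recovery guarantee, and it can be demonstrated on a small substitute problem: random Fourier measurements (maximally incoherent with spikes) standing in for the paper's Helmholtz eigenfunctions and curvelets:

```python
# Generic CS demonstration: recover a spike-sparse signal from a
# random quarter of its Fourier measurements with complex ISTA.
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 256, 5, 64                          # length, spikes, measurements
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

rows = rng.choice(n, m, replace=False)        # random subset of Fourier modes
F = np.fft.fft(np.eye(n), axis=0)[rows] / np.sqrt(n)  # partial unitary DFT
b = F @ x_true

x = np.zeros(n, dtype=complex)
lam = 1e-3
for _ in range(500):
    x = x + F.conj().T @ (b - F @ x)          # step 1 is safe: ||F||_2 <= 1
    x = np.exp(1j * np.angle(x)) * np.maximum(np.abs(x) - lam, 0.0)

print("max recovery error:", np.abs(x - x_true).max())
```

The same mechanics carry over when the measurement basis is a set of Helmholtz eigenfunctions and the sparsifying system is a curvelet frame, as the paper argues.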


Geophysics | 2010

Nonequispaced curvelet transform for seismic data reconstruction: A sparsity-promoting approach

Gilles Hennenfent; Lloyd Fenelon; Felix J. Herrmann

We extend our earlier work on the nonequispaced fast discrete curvelet transform (NFDCT) and introduce a second generation of the transform. This new generation differs from the previous one by the approach taken to compute accurate curvelet coefficients from irregularly sampled data. The first generation relies on accurate Fourier coefficients obtained by an l2-regularized inversion of the nonequispaced fast Fourier transform (FFT) whereas the second is based on a direct l1-regularized inversion of the operator that links curvelet coefficients to irregular data. Also, by construction the second generation NFDCT is lossless unlike the first generation NFDCT. This property is particularly attractive for processing irregularly sampled seismic data in the curvelet domain and bringing them back to their irregular recording locations with high fidelity. Secondly, we combine the second generation NFDCT with the standard fast discrete curvelet transform (FDCT) to form a new curvelet-based method, coined noneq...


IEEE Signal Processing Magazine | 2012

Fighting the Curse of Dimensionality: Compressive Sensing in Exploration Seismology

Felix J. Herrmann; Michael P. Friedlander; Ozgur Yilmaz

Many seismic exploration techniques rely on the collection of massive data volumes that are mined for information during processing. This approach has been extremely successful, but current efforts toward higher resolution images in increasingly complicated regions of Earth continue to reveal fundamental shortcomings in our typical workflows. The “curse” of dimensionality is the main roadblock and is exemplified by Nyquist's sampling criterion, which disproportionately strains current acquisition and processing systems as the size and desired resolution of our survey areas continue to increase.


Geophysics | 2008

New insights into one-norm solvers from the Pareto curve

Gilles Hennenfent; Ewout van den Berg; Michael P. Friedlander; Felix J. Herrmann

Geophysical inverse problems typically involve a trade-off between data misfit and some prior model. Pareto curves trace the optimal trade-off between these two competing aims. These curves are used commonly in problems with two-norm priors in which they are plotted on a log-log scale and are known as L-curves. For other priors, such as the sparsity-promoting one-norm prior, Pareto curves remain relatively unexplored. We show how these curves lead to new insights into one-norm regularization. First, we confirm theoretical properties of smoothness and convexity of these curves from a stylized and a geophysical example. Second, we exploit these crucial properties to approximate the Pareto curve for a large-scale problem. Third, we show how Pareto curves provide an objective criterion to gauge how different one-norm solvers advance toward the solution.
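To make the trade-off concrete, the short sketch below coarsely traces a Pareto curve for a small randomized one-norm problem by sweeping the regularization weight in the lasso functional and recording the resulting (one-norm, misfit) pairs; this is an ISTA illustration, not one of the large-scale solvers the paper gauges:

```python
# Trace (||x||_1, ||Ax - b||_2) trade-off pairs along the Pareto curve.
import numpy as np

rng = np.random.default_rng(0)
m, n = 40, 100
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(m)

L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of gradient

def ista(lam, iters=500):                     # min 0.5||Ax-b||^2 + lam||x||_1
    x = np.zeros(n)
    for _ in range(iters):
        x = x - A.T @ (A @ x - b) / L         # gradient step
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    return x

for lam in [0.5, 0.2, 0.1, 0.05, 0.02, 0.01]:
    x = ista(lam)
    print(f"lam={lam:5.2f}  one-norm={np.abs(x).sum():7.3f}  "
          f"misfit={np.linalg.norm(A @ x - b):.4f}")
```

As the weight decreases, the one-norm grows and the misfit falls; the printed pairs march down the convex, decreasing curve whose properties the paper exploits.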

Collaboration


Dive into Felix J. Herrmann's collaborations.

Top Co-Authors

Rajiv Kumar, University of British Columbia
Tim T.Y. Lin, University of British Columbia
Peyman P. Moghaddam, University of British Columbia
Haneet Wason, University of British Columbia
Ernie Esser, University of British Columbia
Mathias Louboutin, University of British Columbia
Ning Tu, University of British Columbia