Marco Selig
Max Planck Society
Publications
Featured research published by Marco Selig.
Physical Review E | 2013
Tiago Ramalho; Marco Selig; Ulrich Gerland; T. A. Enßlin
The simulation of complex stochastic network dynamics arising, for instance, from models of coupled biomolecular processes remains computationally challenging. Often, the necessity to scan a model's dynamics over a large parameter space renders full-fledged stochastic simulations impractical, motivating approximation schemes. Here we propose an approximation scheme which improves upon the standard linear noise approximation while retaining similar computational complexity. The underlying idea is to minimize, at each time step, the Kullback-Leibler divergence between the true time-evolved probability distribution and a Gaussian approximation (entropic matching). This condition leads to ordinary differential equations for the mean and the covariance matrix of the Gaussian. For cases of weak nonlinearity, the method is more accurate than the linear noise approximation when both are compared to stochastic simulations.
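As a toy illustration of closing stochastic dynamics with a Gaussian, the sketch below integrates mean and variance ODEs of a linear-noise-style closure for a hypothetical 1D logistic birth-death model. This is the standard baseline the paper improves upon, not the entropic-matching scheme itself (whose ODEs carry additional correction terms); all parameters are made up.

```python
import numpy as np

def gaussian_moment_odes(f, df, D, m0, v0, t_grid):
    """Integrate the mean/variance ODEs of a Gaussian closure for the
    1D SDE dx = f(x) dt + sqrt(D(x)) dW, with explicit Euler steps:
        dm/dt = f(m),   dv/dt = 2 f'(m) v + D(m).
    """
    m, v = m0, v0
    ms, vs = [m0], [v0]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        dt = t1 - t0
        m, v = m + f(m) * dt, v + (2.0 * df(m) * v + D(m)) * dt
        ms.append(m)
        vs.append(v)
    return np.array(ms), np.array(vs)

# Toy model: logistic growth with demographic noise (hypothetical parameters)
r, K = 1.0, 100.0
f  = lambda x: r * x * (1.0 - x / K)      # drift
df = lambda x: r * (1.0 - 2.0 * x / K)    # drift derivative
D  = lambda x: x                          # noise strength grows with population

t = np.linspace(0.0, 10.0, 2001)
mean, var = gaussian_moment_odes(f, df, D, m0=5.0, v0=0.1, t_grid=t)
```

At late times the mean settles at the carrying capacity K and the variance at the fluctuation-dissipation value D(K) / (2 |f'(K)|) = 50, which is a quick sanity check on the integration.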
Astronomy and Astrophysics | 2013
Marco Selig; M. R. Bell; H. Junklewitz; Niels Oppermann; M. Reinecke; Maksim Greiner; Carlos Pachajoa; T. A. Enßlin
NIFTy, “Numerical Information Field Theory”, is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTy offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically, without burdening the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTy permits its user to rapidly prototype algorithms in 1D, and then apply the developed code in higher-dimensional settings of real-world problems. The set of spaces on which NIFTy operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and diversity of the package are demonstrated by a Wiener filter code example that successfully runs without modification regardless of the space on which the inference problem is defined.
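A minimal numpy sketch of a Wiener filter of the kind mentioned in the abstract, written without NIFTy so as not to guess its API; the grid size, power spectrum, response, and noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                   # 1D regular grid (toy resolution)

# Assumed prior: stationary signal covariance with a steeply falling spectrum
k = np.fft.fftfreq(n) * n
power = 100.0 / (1.0 + k**2) ** 2
F = np.fft.fft(np.eye(n)) / np.sqrt(n)   # unitary DFT matrix
S = ((F.conj().T * power) @ F).real      # S = F^dagger diag(P) F (circulant)
S = 0.5 * (S + S.T)                      # symmetrize against round-off

R = np.eye(n)                            # response: direct observation (toy)
N = 0.1 * np.eye(n)                      # white Gaussian noise covariance

s = rng.multivariate_normal(np.zeros(n), S)           # signal realization
d = R @ s + rng.multivariate_normal(np.zeros(n), N)   # data realization

# Wiener filter: m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d
N_inv = np.linalg.inv(N)
D = np.linalg.inv(np.linalg.inv(S) + R.T @ N_inv @ R)
m = D @ (R.T @ N_inv @ d)
```

NIFTy's point is that the same filter, written against its space/field/operator classes, runs unchanged on spherical or higher-dimensional grids; the explicit matrices here are only viable for tiny problems.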
Physical Review E | 2012
Marco Selig; Niels Oppermann; T. A. Enßlin
Estimating the diagonal entries of a matrix that is not directly accessible, but only available as a linear operator in the form of a computer routine, is a common necessity in many computational applications, especially in image reconstruction and statistical inference. Here, methods of statistical inference are used to improve the accuracy or reduce the computational cost of matrix probing methods that estimate matrix diagonals. In particular, the generalized Wiener filter methodology, as developed within information field theory, is shown to significantly improve estimates based on only a few sampling probes, in cases in which some form of continuity of the solution can be assumed. The strength, length scale, and precise functional form of the exploited autocorrelation function of the matrix diagonal are determined from the probes themselves. The developed algorithm is successfully applied to mock and real-world problems. These performance tests show that, in situations where a matrix diagonal has to be calculated from only a small number of computationally expensive probes, a speedup by a factor of 2 to 10 is possible with the proposed method.
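The probing identity such methods build on, E[z * (Az)] = diag(A) for Rademacher vectors with entries ±1, can be sketched as follows. This is only the raw probing baseline that the paper improves upon with Wiener-filter smoothing; the matrix and probe count are toy assumptions.

```python
import numpy as np

def probe_diagonal(matvec, n, n_probes=50, rng=None):
    """Estimate diag(A) from matrix-vector products only, using
    Rademacher probes: E[z * (A z)] = diag(A) for z_i = +/-1."""
    if rng is None:
        rng = np.random.default_rng(0)
    acc = np.zeros(n)
    for _ in range(n_probes):
        z = rng.choice([-1.0, 1.0], size=n)
        acc += z * matvec(z)           # z * (A z) is unbiased for diag(A)
    return acc / n_probes

# Toy operator: a matrix we pretend to access only through its action
rng = np.random.default_rng(1)
A = np.diag(np.linspace(1.0, 2.0, 100)) + 0.01 * rng.standard_normal((100, 100))
est = probe_diagonal(lambda v: A @ v, n=100, n_probes=200)
```

The per-entry error comes from the off-diagonal elements and shrinks like one over the square root of the probe count; the paper's contribution is to exploit smoothness of the diagonal so far fewer probes suffice.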
Astronomy and Astrophysics | 2016
H. Junklewitz; M. R. Bell; Marco Selig; T. A. Enßlin
We present resolve, a new algorithm for radio aperture synthesis imaging of extended and diffuse emission in total intensity. The algorithm is derived using Bayesian statistical inference techniques, estimating the surface brightness in the sky assuming a priori log-normal statistics. resolve estimates the measured sky brightness in total intensity along with the spatial correlation structure in the sky, which is used to guide the algorithm to an optimal reconstruction of extended and diffuse sources. During this process, the algorithm succeeds in deconvolving the effects of the radio interferometric point spread function. Additionally, resolve provides a map with an uncertainty estimate of the reconstructed surface brightness. Furthermore, with resolve we introduce a new, optimal visibility weighting scheme that can be viewed as an extension to robust weighting. In tests using simulated observations, the algorithm shows improved performance compared with two standard imaging approaches for extended sources, Multiscale-CLEAN and the Maximum Entropy Method.
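Not resolve itself, but its core modeling idea, a MAP estimate under a log-normal brightness prior, can be sketched in a 1D toy setting. The exponential prior covariance, direct observation (instead of an interferometric response), and all numbers below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 128
x = np.arange(n)

# Assumed prior on the log-brightness: exponential correlation (toy choice)
S = 0.3 * np.exp(-np.abs(x[:, None] - x[None, :]) / 8.0)
S_inv = np.linalg.inv(S)

phi_true = rng.multivariate_normal(np.zeros(n), S)
I_true = np.exp(phi_true)               # log-normal => strictly positive sky
noise_var = 0.1
d = I_true + np.sqrt(noise_var) * rng.standard_normal(n)  # "observed" map

def neg_log_posterior(phi):
    """Gaussian likelihood in brightness I = exp(phi), Gaussian prior on phi."""
    I = np.exp(phi)
    return 0.5 * np.sum((d - I) ** 2) / noise_var + 0.5 * phi @ S_inv @ phi

def gradient(phi):
    I = np.exp(phi)
    return -I * (d - I) / noise_var + S_inv @ phi

res = minimize(neg_log_posterior, np.zeros(n), jac=gradient, method="L-BFGS-B")
I_map = np.exp(res.x)                   # MAP reconstruction, positive by design
```

Parametrizing the brightness as an exponential enforces positivity and lets the field fluctuate over orders of magnitude, which is exactly why the log-normal prior suits extended radio emission.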
Physical Review E | 2013
Niels Oppermann; Marco Selig; M. R. Bell; T. A. Enßlin
We develop a method to infer log-normal random fields from measurement data affected by Gaussian noise. The log-normal model is well suited to describe strictly positive signals with fluctuations whose amplitude varies over several orders of magnitude. We use the formalism of minimum Gibbs free energy to derive an algorithm that uses the signal's correlation structure to regularize the reconstruction. The correlation structure, described by the signal's power spectrum, is thereby reconstructed from the same data set. We show that the minimization of the Gibbs free energy, corresponding to a Gaussian approximation to the posterior marginalized over the power spectrum, is equivalent to the empirical Bayes ansatz, in which the power spectrum is fixed to its maximum a posteriori value. We further introduce a prior for the power spectrum that enforces spectral smoothness. The appropriateness of this prior in different scenarios is discussed and its effects on the reconstruction results are demonstrated. We validate the performance of our reconstruction algorithm in a series of one- and two-dimensional test cases with varying degrees of non-linearity and different noise levels.
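The empirical Bayes fixed point mentioned above (power spectrum fixed to its maximum a posteriori value) can be sketched per Fourier mode in a simplified Gaussian setting. A running mean over spectral bins crudely stands in for the paper's spectral-smoothness prior, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
nbins, ndof = 100, 20            # spectral bins; modes per bin (as in 2D shells)
k = np.arange(1, nbins + 1)
P_true = 100.0 / k**2            # assumed true signal power spectrum
N = 1.0                          # white noise: flat spectrum

# One data realization: ndof independent modes per spectral bin
s = rng.standard_normal((ndof, nbins)) * np.sqrt(P_true)
d = s + rng.standard_normal((ndof, nbins)) * np.sqrt(N)

# Alternate per-mode Wiener filtering with the current spectrum and
# re-estimation of the spectrum from the reconstruction
P = np.full(nbins, 10.0)                  # initial spectrum guess
for _ in range(50):
    W = P / (P + N)                       # Wiener filter per mode
    m = W * d                             # posterior mean
    P_raw = (m**2).mean(axis=0) + W * N   # mean power + posterior variance
    P = np.convolve(P_raw, np.ones(5) / 5.0, mode="same")  # crude smoothing
```

Including the posterior variance W * N in the spectrum update is essential: the power of the posterior mean alone would systematically underestimate noise-dominated modes.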
Astronomy and Astrophysics | 2015
Marco Selig; Valentina Vacca; Niels Oppermann; T. A. Enßlin
We analyze the 6.5-year all-sky data from the Fermi Large Area Telescope that are restricted to γ-ray photons with energies between 0.6 and 307.2 GeV. Raw count maps show a superposition of diffuse and point-like emission structures and are subject to shot noise and instrumental artifacts. Using the D³PO inference algorithm, we modeled the observed photon counts as the sum of a diffuse and a point-like photon flux, convolved with the instrumental beam and subject to Poissonian shot noise. The D³PO algorithm performs a Bayesian inference without the use of spatial or spectral templates; that is, it removes the shot noise, deconvolves the instrumental response, and yields separate estimates for the two flux components. The non-parametric reconstruction uncovers the morphology of the diffuse photon flux up to several hundred GeV. We present an all-sky spectral index map for the diffuse component. We show that the diffuse γ-ray flux can be described phenomenologically by only two distinct components: a soft component, presumably dominated by hadronic processes, tracing the dense, cold interstellar medium, and a hard component, presumably dominated by leptonic interactions, following the hot and dilute medium and outflows such as the Fermi bubbles. A comparison of the soft component with the Galactic dust emission indicates that the dust-to-soft-gamma ratio in the interstellar medium decreases with latitude. The spectrally hard component exists in a thick Galactic disk and tends to flow out of the Galaxy at some locations. Furthermore, we find the angular power spectrum of the diffuse flux to roughly follow a power law with an index of 2.47 on large scales, independent of energy. Our first catalog of source candidates includes 3106 candidates, of which we associate 1381 (1897) with known sources from the second (third) Fermi source catalog. We observe γ-ray emission in the direction of a few galaxy clusters hosting known radio halos.
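D³PO's denoising/deconvolution step addresses the classical Poisson inverse problem. As a point of reference only, here is the standard Richardson-Lucy (EM) deconvolution baseline on toy 1D count data; D³PO itself goes well beyond this by adding priors and diffuse/point-like component separation. Beam, fluxes, and source positions below are made up.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 64
flux = np.full(n, 2.0)                     # flat diffuse background
flux[20] += 50.0                           # bright point source
flux[45] += 30.0                           # fainter point source

psf = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # toy instrumental beam (sums to 1)
lam = np.convolve(flux, psf, mode="same")  # expected counts
d = rng.poisson(lam)                       # observed Poisson counts

# Richardson-Lucy: the EM algorithm for Poisson deconvolution
est = np.full(n, d.mean())                 # positive initial guess
for _ in range(200):
    pred = np.convolve(est, psf, mode="same")
    ratio = d / np.maximum(pred, 1e-12)    # data / model prediction
    est *= np.convolve(ratio, psf[::-1], mode="same")  # adjoint of the beam
```

The multiplicative update keeps the flux nonnegative and approximately conserves total counts, which is why EM-type schemes are a natural baseline for photon-count imaging.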
Astronomy and Astrophysics | 2016
Valentina Vacca; Niels Oppermann; T. A. Enßlin; Jens Jasche; Marco Selig; Maksim Greiner; H. Junklewitz; M. Reinecke; M. Brüggen; E. Carretti; L. Feretti; C. Ferrari; Christopher A. Hales; Cathy Horellou; Shinsuke Ideguchi; M. Johnston-Hollitt; R. Pizzo; H. J. A. Röttgering; T. W. Shimwell; Keitaro Takahashi
Determining magnetic field properties in different environments of the cosmic large-scale structure, as well as their evolution over redshift, is a fundamental step toward uncovering the origin of cosmic magnetic fields. Radio observations permit the study of extragalactic magnetic fields via measurements of the Faraday depth of extragalactic radio sources. Our aim is to investigate how much different extragalactic environments contribute to the Faraday depth variance of these sources. We develop a Bayesian algorithm to statistically distinguish Faraday depth variance contributions intrinsic to the source from those due to the medium between the source and the observer. In our algorithm, the Galactic foreground and measurement noise are taken into account as the uncertainty correlations of the Galactic model. Additionally, our algorithm allows for the investigation of possible redshift evolution of the extragalactic contribution. This work presents the derivation of the algorithm and tests performed on mock observations. Because cosmic magnetism is one of the key science projects of the new generation of radio interferometers, we have predicted the performance of our algorithm on mock data collected with these instruments. According to our tests, high-quality catalogs of a few thousand sources should already enable us to investigate magnetic fields in the cosmic structure.
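A drastically simplified sketch of the variance-decomposition idea: maximum-likelihood estimation of an extragalactic Faraday variance, with the Galactic and noise variances treated as known and uncorrelated. The paper instead models correlated Galactic residuals and redshift dependence; all numbers below are made up.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
nsrc = 2000
sigma_gal = 8.0                     # assumed Galactic scatter (rad m^-2)
sigma_ex_true = 5.0                 # extragalactic scatter to recover
sigma_noise = rng.uniform(1.0, 4.0, nsrc)   # per-source measurement noise

# Mock rotation measures: sum of three independent Gaussian contributions
rm = (rng.normal(0.0, sigma_gal, nsrc)
      + rng.normal(0.0, sigma_ex_true, nsrc)
      + sigma_noise * rng.standard_normal(nsrc))

def neg_log_like(var_ex):
    """Gaussian likelihood; Galactic and noise variances treated as known."""
    var_tot = sigma_gal**2 + var_ex + sigma_noise**2
    return 0.5 * np.sum(np.log(var_tot) + rm**2 / var_tot)

res = minimize_scalar(neg_log_like, bounds=(0.0, 100.0), method="bounded")
sigma_ex_est = float(np.sqrt(res.x))
```

Even in this toy form, a catalog of a few thousand sources pins down an extragalactic scatter that is smaller than the Galactic one, which is the qualitative point of the abstract's sample-size estimate.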
Physical Review D | 2013
Sebastian Dorn; Niels Oppermann; Rishi Khatri; Marco Selig; T. A. Enßlin
We present an approximate calculation of the full Bayesian posterior probability distribution for the local non-Gaussianity parameter f_{\text{nl}} from observations of cosmic microwave background anisotropies within the framework of information field theory. The approximation that we introduce allows us to dispense with numerically expensive sampling techniques. We use a novel posterior validation method (DIP test) in cosmology to test the precision of our method. It transfers inaccuracies of the calculated posterior into deviations from a uniform distribution for a specially constructed test quantity. For this procedure we study toy cases that use one- and two-dimensional flat skies, as well as the full spherical sky. We find that we are able to calculate the posterior precisely under a flat-sky approximation, albeit not in the spherical case. We argue that this is most likely due to an insufficient precision of the used numerical implementation of the spherical harmonic transform, which might affect other non-Gaussianity estimators as well. Furthermore, we present how a nonlinear reconstruction of the primordial gravitational potential on the full spherical sky can be obtained in principle. Using the flat-sky approximation, we find deviations for the posterior of
Physical Review E | 2015
Sebastian Dorn; T. A. Enßlin; Maksim Greiner; Marco Selig; Vanessa Boehm
Physical Review E | 2014
T. A. Enßlin; H. Junklewitz; Lars Winderling; Maksim Greiner; Marco Selig
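The DIP-style posterior validation described in the Dorn et al. abstract can be illustrated by a probability-integral-transform check on a conjugate Gaussian toy model: if the posterior is computed correctly, the test quantity u is uniform on [0, 1], and systematic deviations from uniformity flag an inaccurate posterior. All numbers here are illustrative.

```python
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(5)
ntrials = 5000
sig_s, sig_n = 1.0, 0.5                # prior and noise std (illustrative)

s = rng.normal(0.0, sig_s, ntrials)    # true signals drawn from the prior
d = s + rng.normal(0.0, sig_n, ntrials)

# Exact Gaussian posterior for this conjugate toy model
w = sig_s**2 / (sig_s**2 + sig_n**2)   # Wiener weight
mu_post = w * d
sig_post = np.sqrt(w) * sig_n

# Posterior CDF evaluated at the truth: uniform iff the posterior is correct
u = norm.cdf((s - mu_post) / sig_post)
pvalue = kstest(u, "uniform").pvalue
```

Replacing the exact posterior by an approximate one (say, a wrong sig_post) would pile u up near 0.5 or near the edges, which is the kind of deviation the DIP test is designed to expose.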