Martin D. Buhmann
University of Giessen
Publications
Featured research published by Martin D. Buhmann.
Acta Numerica | 2003
Martin D. Buhmann
Radial basis function methods are modern ways to approximate multivariate functions, especially in the absence of grid data. They have been known, tested and analysed for several years now and many positive properties have been identified. This paper gives a selective but up-to-date survey of several recent developments that explains their usefulness from the theoretical point of view and contributes useful new classes of radial basis function. We consider particularly the new results on convergence rates of interpolation with radial basis functions, as well as some of the various achievements on approximation on spheres, and the efficient numerical computation of interpolants for very large sets of data. Several examples of useful applications are stated at the end of the paper.
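For readers unfamiliar with the basic construction surveyed here, a minimal sketch of multiquadric interpolation in Python with NumPy may help; the function name and the shape parameter c are our own illustrative choices, not taken from the paper.

```python
import numpy as np

def multiquadric_interpolant(centres, values, c=1.0):
    """Build an interpolant s(x) = sum_j lam_j * sqrt(|x - x_j|^2 + c^2)
    through scattered data (x_j, values_j); by Micchelli's theorem the
    interpolation matrix is nonsingular for pairwise distinct centres."""
    centres = np.atleast_2d(centres)
    # Interpolation matrix A_ij = phi(|x_i - x_j|) for the multiquadric phi
    d = np.linalg.norm(centres[:, None, :] - centres[None, :, :], axis=-1)
    A = np.sqrt(d**2 + c**2)
    lam = np.linalg.solve(A, values)
    def s(x):
        dx = np.linalg.norm(np.atleast_2d(x)[:, None, :] - centres[None, :, :], axis=-1)
        return np.sqrt(dx**2 + c**2) @ lam
    return s
```

At the centres themselves the interpolant reproduces the given data exactly; evaluation elsewhere gives the smooth multiquadric fit.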
Mathematics of Computation | 2001
Martin D. Buhmann
Radial basis functions are well-known and successful tools for the interpolation of data in many dimensions. Several radial basis functions of compact support that give rise to nonsingular interpolation problems have been proposed, and in this paper we study a new, larger class of smooth radial functions of compact support which contains other compactly supported ones that were proposed earlier in the literature.
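The paper's new class of radial functions is not reproduced here; as an illustration of the general idea, the sketch below uses Wendland's C² function, one of the compactly supported radial functions proposed earlier in the literature, assuming NumPy. The helper names are our own.

```python
import numpy as np

def wendland_c2(r):
    """Wendland's C^2 compactly supported radial function on [0, 1]:
    phi(r) = (1 - r)^4 (4r + 1) for r <= 1, and 0 otherwise."""
    r = np.asarray(r, dtype=float)
    return np.where(r <= 1.0, (1.0 - r)**4 * (4.0*r + 1.0), 0.0)

def cs_interpolant(centres, values, delta=0.5):
    """Interpolation with a compactly supported basis function: entries with
    |x_i - x_j| > delta vanish, so the interpolation matrix is sparse."""
    d = np.abs(centres[:, None] - centres[None, :])
    A = wendland_c2(d / delta)
    lam = np.linalg.solve(A, values)
    return lambda x: wendland_c2(np.abs(np.asarray(x)[:, None] - centres[None, :]) / delta) @ lam
```

Compact support is what makes the interpolation problem sparse, and positive definiteness of the function is what keeps it nonsingular.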
Stochastic Environmental Research and Risk Assessment | 2013
Emilio Porcu; Daryl J. Daley; Martin D. Buhmann; Moreno Bevilacqua
Matrix-valued radially symmetric covariance functions (also called radial basis functions in the numerical analysis literature) are crucial for the analysis, inference and prediction of Gaussian vector-valued random fields. This paper provides different methodologies for the construction of matrix-valued mappings that are positive definite and compactly supported over the sphere of a d-dimensional space, of a given radius. In particular, we offer a representation based on scaled mixtures of Askey functions; we also suggest a method of construction based on B-splines. Finally, we show that the very appealing convolution arguments are indeed effective when working in one dimension, prohibitive in two and feasible, but substantially useless, when working in three dimensions. We exhibit the statistical performance of the proposed models through a simulation study and then discuss the computational gains that come from our constructions when the parameters are estimated via maximum likelihood. We finally apply our constructions to a North American Pacific Northwest temperature dataset.
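A univariate sketch of the scalar building block may clarify the representation; the matrix-valued construction in the paper requires additional cross-covariance structure not shown here, and the function names are illustrative. Assuming NumPy:

```python
import numpy as np

def askey(r, nu):
    """Askey's truncated power function (1 - r)_+^nu: compactly supported on
    [0, 1] and positive definite on R^d whenever nu >= (d + 1)/2."""
    r = np.asarray(r, dtype=float)
    return np.where(r < 1.0, (1.0 - r)**nu, 0.0)

def scale_mixture(r, scales, weights, nu):
    """A nonnegative mixture of rescaled Askey functions is again positive
    definite and compactly supported (support radius = largest scale)."""
    return sum(w * askey(r / a, nu) for a, w in zip(scales, weights))
```

Scaled mixtures of this kind are the scalar template behind the paper's first methodology.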
Journal of Approximation Theory | 2015
Martin D. Buhmann; Feng Dai
We consider radial basis function approximations using at first a localization of the basis functions known as quasi-interpolation (to be contrasted with plain linear combinations of shifts of radial basis functions or, for instance, cardinal interpolation). Using these quasi-interpolants we derive various pointwise error estimates in L^p for p ∈ [1, ∞).
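To illustrate what localization by quasi-interpolation means in the simplest univariate case (a classical example, not the setting of this paper): second central differences of φ(x) = |x| produce a compactly supported hat function, and the resulting quasi-interpolant reproduces linear polynomials without solving any linear system. A sketch assuming NumPy:

```python
import numpy as np

def psi(x):
    """Localised function built from the radial function phi(x) = |x|:
    the second central difference (|x-1| - 2|x| + |x+1|)/2 is the hat
    function, compactly supported although |x| itself is not."""
    x = np.asarray(x, dtype=float)
    return 0.5 * (np.abs(x - 1.0) - 2.0*np.abs(x) + np.abs(x + 1.0))

def quasi_interpolant(f, x, centres=range(-10, 11)):
    """Q f(x) = sum_j f(j) psi(x - j): no interpolation system is solved,
    and Q reproduces linear polynomials on the interior of the centre range."""
    return sum(f(j) * psi(np.asarray(x, dtype=float) - j) for j in centres)
```

The point of the localisation is that each data value influences the approximant only locally, which is what makes pointwise error estimates tractable.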
Journal of Approximation Theory | 2001
Martin D. Buhmann; Oleg Davydov; Tim N. T. Goodman
The purpose of this paper is the construction of bi- and trivariate prewavelets from box-spline spaces, i.e., piecewise polynomials of fixed degree on a uniform mesh. They have especially small support and form Riesz bases of the wavelet spaces, so they are stable. In particular, the supports achieved are smaller than those of the prewavelets due to Riemenschneider and Shen in a recent, similar construction.
Foundations of Computational Mathematics | 2003
Martin D. Buhmann; Oleg Davydov; Tim N. T. Goodman
Dedicated to Professor M. J. D. Powell on the occasion of his sixty-fifth birthday and his retirement. In this paper, we design differentiable, two-dimensional, piecewise polynomial cubic prewavelets of particularly small compact support. They are given in closed form, and provide stable, orthogonal decompositions of L₂(ℝ²). In particular, the splines we use in our prewavelet constructions give rise to stable bases of spline spaces that contain all cubic polynomials, whereas the more familiar box-spline constructions cannot reproduce all cubic polynomials unless one resorts to a box spline of higher polynomial degree.
Clinical Neurophysiology | 2016
Janin Jäger; Alexander Klein; Martin D. Buhmann; Wolfgang Skrandies
OBJECTIVE In this paper we introduce a new interpolation method for scalp potential interpolation. The predictive value of this new interpolation technique (the multiquadric method) is compared to commonly used interpolation techniques such as nearest-neighbour averaging and spherical splines. METHODS The method of comparison is cross-validation, where the data of one or two electrodes are predicted from the rest of the data. The difference between the predicted and the measured data is used to determine two error measures: one is the maximal error of an interpolation technique, the other the mean square error. The methods are tested on data stemming from 30-channel EEG of 10 healthy volunteers. RESULTS The multiquadric interpolation methods performed best with regard to both error measures and were easier to calculate than spherical splines. CONCLUSION Multiquadrics are a good alternative to commonly used EEG reconstruction methods. SIGNIFICANCE Multiquadrics have been widely used in reconstruction on sphere-like surfaces, but until now their advantages have not been investigated for EEG reconstruction.
Advances in Computational Mathematics | 2006
Martin D. Buhmann
This paper concerns interpolation with radial basis functions on half-spaces, where the centres are multi-integers restricted to half-spaces as well. The existence of suitable Lagrange functions is shown for the multiquadric and inverse multiquadric radial basis functions, along with the decay rate and summability of their coefficients. The main technique is a so-called Wiener–Hopf factorisation of the symbol of the radial basis function and a careful study of the smoothness of its 2π-periodic factors.
Biographical Memoirs of Fellows of the Royal Society | 2018
Martin D. Buhmann; Roger Fletcher; Arieh Iserles; Philippe L. Toint
Michael James David Powell was a British numerical analyst who was among the pioneers of computational mathematics. During a long and distinguished career, first at the Atomic Energy Research Establishment (AERE) Harwell and subsequently as the John Humphrey Plummer Professor of Applied Numerical Analysis in Cambridge, he contributed decisively towards establishing optimization theory as an effective tool of scientific enquiry, replete with highly effective methods and mathematical sophistication. He also made crucial contributions to approximation theory, in particular to the theory of spline functions and of radial basis functions. In a subject that roughly divides into practical designers of algorithms and theoreticians who seek to underpin algorithms with solid mathematical foundations, Mike Powell refused to follow this dichotomy. His achievements span the entire range from difficult and intricate convergence proofs to the design of algorithms and production of software. He was among the leaders of a subject area that is at the nexus of mathematical enquiry and applications throughout science and engineering.
Journal of Approximation Theory | 2017
Martin D. Buhmann; Oleg Davydov
While it was noted by R. Hardy, and proved in a famous paper by C. A. Micchelli, that radial basis function interpolants s(x) = Σⱼ λⱼ φ(‖x − xⱼ‖) exist uniquely for the multiquadric radial function φ(r) = √(r² + c²) as soon as the (at least two) centres are pairwise distinct, the error bounds for this interpolation problem always demanded a constant added to s. By using Pontryagin native spaces, we obtain error bounds that no longer require this additional constant expression.
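The classical formulation with the appended constant can be sketched as follows (illustrative Python with NumPy; the paper's contribution is the error analysis that removes the need for this constant, not this algorithm):

```python
import numpy as np

def mq_interp_with_constant(centres, values, c=1.0):
    """Classical setup: s(x) = sum_j lam_j sqrt(|x - x_j|^2 + c^2) + a,
    with the side condition sum_j lam_j = 0 that the appended constant
    requires; the bordered system is nonsingular for distinct centres."""
    n = len(centres)
    d = np.linalg.norm(centres[:, None, :] - centres[None, :, :], axis=-1)
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = np.sqrt(d**2 + c**2)
    M[:n, n] = 1.0   # column for the constant a
    M[n, :n] = 1.0   # row enforcing sum_j lam_j = 0
    sol = np.linalg.solve(M, np.append(values, 0.0))
    lam, a = sol[:n], sol[n]
    def s(x):
        dx = np.linalg.norm(np.atleast_2d(x)[:, None, :] - centres[None, :, :], axis=-1)
        return np.sqrt(dx**2 + c**2) @ lam + a
    return s
```

One payoff of the augmented formulation is exact reproduction of constant data everywhere, since then λ = 0 and a carries the constant.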