Ronald R. Coifman
Yale University
Publications
Featured research published by Ronald R. Coifman.
IEEE Transactions on Information Theory | 1992
Ronald R. Coifman; Mladen Victor Wickerhauser
Adapted waveform analysis uses a library of orthonormal bases and an efficiency functional to match a basis to a given signal or family of signals. It permits efficient compression of a variety of signals, such as sound and images. The predefined libraries of modulated waveforms include orthogonal wavelet-packets and localized trigonometric functions, and have reasonably well-controlled time-frequency localization properties. The idea is to build out of the library functions an orthonormal basis relative to which the given signal or collection of signals has the lowest information cost. The method relies heavily on the remarkable orthogonality properties of the new libraries: all expansions in a given library conserve energy and are thus comparable. Several cost functionals are useful; one of the most attractive is Shannon entropy, which has a geometric interpretation in this context.
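The entropy cost functional mentioned above can be illustrated with a short sketch (our own minimal code, not the authors'; the function name and example vectors are assumptions): treating the normalized squared coefficients of an expansion as a probability distribution, the Shannon entropy is lowest when the signal's energy is concentrated in few coefficients, which is exactly what the best-basis search prefers.

```python
import math

def entropy_cost(coeffs):
    # Shannon entropy cost of an expansion: -sum p_i * log(p_i),
    # where p_i = |c_i|^2 / total energy. Lower cost means the
    # energy is concentrated in fewer coefficients.
    energy = sum(c * c for c in coeffs)
    cost = 0.0
    for c in coeffs:
        p = (c * c) / energy
        if p > 0.0:
            cost -= p * math.log(p)
    return cost

# Two expansions: the concentrated one is cheaper, so the search
# would prefer the basis that produced it. (The cost depends only
# on the energy distribution, not the overall scale.)
concentrated = [2.0, 0.1, 0.1, 0.1]
flat = [1.0, 1.0, 1.0, 1.0]
print(entropy_cost(concentrated) < entropy_cost(flat))  # True
```

Because the libraries are orthonormal, every expansion of a given signal has the same total energy, which is what makes these entropy values directly comparable across bases.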
Archive | 1994
Ronald R. Coifman; Yves Meyer; Steven Quake; M. Victor Wickerhauser
Wavelet packets are a versatile collection of functions generalizing the compactly supported wavelets of Daubechies. They are used to analyze and manipulate signals such as sound and images. We describe a library of such waveforms and demonstrate a few of their analytic properties. We also describe an algorithm to choose a best basis subset, tailored to fit a specific signal or class of signals. We apply this algorithm to two signal processing tasks: acoustic signal compression and feature extraction in certain images.
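A toy version of the best-basis search can be sketched as follows (our own simplification, not the paper's algorithm or code: we use the two-tap Haar split and a simple nonzero-count cost, whereas real libraries use longer Daubechies filters and entropy costs). At each node of the wavelet-packet tree, the algorithm compares the cost of keeping the node's coefficients against the combined cost of its two children, and keeps whichever is cheaper.

```python
import math

def haar_split(x):
    # One Haar wavelet-packet split into orthogonal sums and
    # differences; energy is conserved between parent and children.
    s = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return s, d

def cost(coeffs):
    # Number of non-negligible coefficients: a simple additive
    # information cost (Shannon entropy is used the same way).
    return sum(1 for c in coeffs if abs(c) > 1e-8)

def best_basis(x, depth):
    # Recursively decide whether keeping this node beats splitting it.
    if depth == 0 or len(x) < 2:
        return x, cost(x)
    s, d = haar_split(x)
    bs, cs = best_basis(s, depth - 1)
    bd, cd = best_basis(d, depth - 1)
    if cs + cd < cost(x):
        return bs + bd, cs + cd
    return x, cost(x)

signal = [1.0, 1.0, 1.0, 1.0, -1.0, -1.0, -1.0, -1.0]
coeffs, c = best_basis(signal, 3)
print(c)  # 1: eight samples represented by a single nonzero coefficient
```

The recursion visits each tree node once, so the search over the whole library of bases runs in time proportional to the size of the full decomposition tree.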
Journal of Mathematical Imaging and Vision | 1995
Naoki Saito; Ronald R. Coifman
We describe an extension to the “best-basis” method to select an orthonormal basis suitable for signal/image classification problems from a large collection of orthonormal bases consisting of wavelet packets or local trigonometric bases. The original best-basis algorithm selects a basis minimizing entropy from such a “library of orthonormal bases,” whereas the proposed algorithm selects a basis maximizing a certain discriminant measure (e.g., relative entropy) among classes. Once such a basis is selected, a small number of the most significant coordinates (features) are fed into a traditional classifier such as Linear Discriminant Analysis (LDA) or Classification and Regression Trees (CART). The performance of these statistical methods is enhanced since the proposed methods reduce the dimensionality of the problem at hand without losing important information for that problem. Here, basis functions which are well localized in the time-frequency plane are used as feature extractors. We applied our method to two signal classification problems and an image texture classification problem. These experiments show the superiority of our method over the direct application of these classifiers on the input signals. As a further application, we also describe a method to extract signal components from data consisting of signal and textured background.
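The discriminant step can be illustrated with a minimal sketch (our own illustration, with our own function names: we score individual coordinates by a symmetrized relative entropy of per-class energy shares, whereas the actual algorithm scores whole subbands in the basis tree):

```python
import math

def top_discriminant_features(energy_a, energy_b, k, eps=1e-12):
    # Normalize each class's per-coordinate energies into a
    # distribution, then rank coordinates by symmetrized relative
    # entropy: coordinates where the classes differ most score highest.
    p = [e / sum(energy_a) for e in energy_a]
    q = [e / sum(energy_b) for e in energy_b]

    def score(pi, qi):
        return (pi * math.log((pi + eps) / (qi + eps)) +
                qi * math.log((qi + eps) / (pi + eps)))

    ranked = sorted(range(len(p)), key=lambda i: -score(p[i], q[i]))
    return ranked[:k]

# Class A concentrates energy in coordinate 0, class B in coordinate 1;
# coordinate 2 carries no discriminating information, so it is dropped.
features = top_discriminant_features([8.0, 1.0, 1.0], [1.0, 8.0, 1.0], 2)
print(sorted(features))  # [0, 1]
```

The selected coordinates would then be passed to LDA or CART in place of the raw signal, which is how the dimensionality reduction described above helps the downstream classifier.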
Optical Engineering | 1994
Ronald R. Coifman; Mladen Victor Wickerhauser
We describe the development of adapted waveform analysis (AWA) as a tool for fast processing of the various identification tasks involved in medical diagnostics and automatic target recognition. Such tasks consist of three steps: representing the signal as a superposition of component functions, choosing to retain some of the components and discard the others, then reconstructing a new, approximate signal from what was kept. AWA provides tools for each of these steps, accelerating the decomposition and reconstruction computations, providing new functions for analysis and modeling, and extracting new features for recognition and classification. AWA extends Fourier analysis by providing new libraries of standard waveforms with properties akin to windowed sines and cosines, and it extends principal component analysis and eigenfunction expansions by adapting the standard functions to individual operators. The cost of representing a function can be measured by how many components must be superposed to obtain a desired degree of approximation, and this cost can be minimized by a fast search through the library of representations. The analysis can be iterated to sift coherent signals from noise. We consider applications to signal and image compression, feature detection, and medical image denoising.
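The retain-and-reconstruct step can be sketched generically (our illustration, not the AWA code): in an orthonormal expansion, keeping the k largest-magnitude coefficients gives the best k-term approximation, and the squared reconstruction error equals the energy of the discarded coefficients.

```python
def keep_largest(coeffs, k):
    # Retain the k largest-magnitude components, zero out the rest.
    keep = set(sorted(range(len(coeffs)), key=lambda i: -abs(coeffs[i]))[:k])
    return [c if i in keep else 0.0 for i, c in enumerate(coeffs)]

coeffs = [3.0, 0.1, -2.0, 0.05]
approx = keep_largest(coeffs, 2)
print(approx)  # [3.0, 0.0, -2.0, 0.0]

# With orthonormal components, the squared error is the dropped energy:
# 0.1**2 + 0.05**2 = 0.0125.
error2 = sum((a - b) ** 2 for a, b in zip(coeffs, approx))
print(round(error2, 4))  # 0.0125
```

The point of the adapted-basis search is to make the retained coefficients as few and as large as possible, so that this truncation error stays small for a given compression ratio.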
Archive | 2011
Ronald R. Coifman; Matan Gavish
Digital databases can be represented by matrices, where rows (say) correspond to numerical sensor readings, or features, and columns correspond to data points. Recent data analysis methods describe the local geometry of the data points using a weighted affinity graph, whose vertices correspond to data points. We consider two geometries, or graphs, one on the rows and one on the columns, such that the data matrix is smooth with respect to the “tensor product” of the two geometries. This is achieved by an iterative procedure that constructs a multiscale partition tree on each graph. We use the recently introduced notion of Haar-like bases induced by the trees to obtain Tensor-Haar-like bases for the space of matrices, and show that an ℓᵖ entropy condition on the expansion coefficients of the database, viewed as a function on the product of the geometries, implies both smoothness and efficient reconstruction. We apply this methodology to analyze, de-noise and compress a term-document database. We use the same methodology to compress matrices of potential operators of unknown charge distribution geometries and to organize Laplacian eigenvectors, where the data matrix is the “expansion in Laplace eigenvectors” operator.
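The Haar-like construction can be illustrated on a toy binary partition tree (a minimal pure-Python sketch under our own conventions, with the tree given as nested lists of indices; not the paper's notation): each binary split contributes one zero-mean vector, positive on one child and negative on the other, and together with the constant vector these form an orthonormal basis adapted to the tree.

```python
import math

def leaves(node):
    # Flatten a nested partition tree to its list of indices.
    if isinstance(node, int):
        return [node]
    out = []
    for child in node:
        out += leaves(child)
    return out

def haar_like_basis(tree, n):
    # One constant vector plus one Haar-like vector per binary split.
    # Weights are chosen so each split vector has zero mean and unit norm.
    basis = [[1.0 / math.sqrt(n)] * n]

    def visit(node):
        if isinstance(node, int) or len(node) < 2:
            return
        a, b = leaves(node[0]), leaves(node[1])
        na, nb = len(a), len(b)
        wa = math.sqrt(nb / (na * (na + nb)))
        wb = -math.sqrt(na / (nb * (na + nb)))
        v = [0.0] * n
        for i in a:
            v[i] = wa
        for i in b:
            v[i] = wb
        basis.append(v)
        for child in node:
            visit(child)

    visit(tree)
    return basis

# Partition {0,1,2,3} first into {0,1} vs {2,3}, then split each pair:
B = haar_like_basis([[0, 1], [2, 3]], 4)
print(len(B))  # 4 orthonormal vectors for a 4-point geometry
```

A Tensor-Haar-like basis for matrices is then obtained by taking products of one such vector from the row tree with one from the column tree, which is how the two geometries are combined.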
Archive | 2004
Carey E. Priebe; David J. Marchette; Youngser Park; Edward J. Wegman; Jeffrey L. Solka; Diego A. Socolinsky; Damianos Karakos; Kenneth Ward Church; Roland Guglielmi; Ronald R. Coifman; Dekang Lin; Dennis M. Healy; Marc Q. Jacobs; Anna Tsao
We consider the problem of statistical pattern recognition in a heterogeneous, high-dimensional setting. In particular, we consider the search for meaningful cross-category associations in a heterogeneous text document corpus. Our approach involves “iterative denoising”: iteratively extracting (corpus-dependent) features and partitioning the document collection into sub-corpora. We present an anecdote wherein this methodology discovers a meaningful cross-category association in a heterogeneous collection of scientific documents.
Bulletin of the American Mathematical Society | 1977
Ronald R. Coifman; Guido Weiss
Signal processing Part I | 1990
Ronald R. Coifman
Studia Mathematica | 1974
Ronald R. Coifman
Archive | 1994
Naoki Saito; Ronald R. Coifman