Bryan Singer
Carnegie Mellon University
Publications
Featured research published by Bryan Singer.
IEEE International Conference on High Performance Computing Data and Analytics | 2004
Markus Püschel; José M. F. Moura; Bryan Singer; Jianxin Xiong; Jeremy R. Johnson; David A. Padua; Manuela M. Veloso; Robert W. Johnson
SPIRAL is a generator for libraries of fast software implementations of linear signal processing transforms. These libraries are adapted to the computing platform and can be re-optimized as the hardware is upgraded or replaced. This paper describes the main components of SPIRAL: the mathematical framework that concisely describes signal transforms and their fast algorithms; the formula generator that captures at the algorithmic level the degrees of freedom in expressing a particular signal processing transform; the formula translator that encapsulates the compilation degrees of freedom when translating a specific algorithm into an actual code implementation; and, finally, an intelligent search engine that finds within the large space of alternative formulas and implementations the “best” match to the given computing platform. We present empirical data that demonstrate the high performance of SPIRAL-generated code.
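The pipeline described above can be illustrated with a small, self-contained sketch (not SPIRAL's actual code, which generates and compiles real code for the target platform): candidate formulas for a toy Walsh-Hadamard transform are enumerated as binary split trees, each tree is translated into an executable routine, and a simple exhaustive search picks the fastest measured implementation. All function names below are illustrative.

```python
# Minimal sketch of the generate -> translate -> time -> search loop, using a
# toy Walsh-Hadamard transform (WHT) in place of SPIRAL's formula language.
# Function names are illustrative, not SPIRAL's API.
import time
import numpy as np

def formula_generator(k):
    """Enumerate 'formulas' for WHT_(2^k) as binary split trees."""
    if k == 1:
        yield ('leaf',)
        return
    for i in range(1, k):
        for left in formula_generator(i):
            for right in formula_generator(k - i):
                yield ('split', i, left, right)

def formula_translator(formula, k):
    """Translate a split tree into an executable, matrix-free transform."""
    if formula[0] == 'leaf':
        return lambda x: np.concatenate([x[:1] + x[1:], x[:1] - x[1:]])
    _, i, left, right = formula
    f_left = formula_translator(left, i)
    f_right = formula_translator(right, k - i)
    m, n = 2 ** i, 2 ** (k - i)
    def transform(x):
        # WHT_(2^k) = (WHT_(2^i) (x) I) (I (x) WHT_(2^(k-i)))
        y = x.reshape(m, n)
        y = np.stack([f_right(row) for row in y])            # I (x) WHT_(2^(k-i))
        y = np.stack([f_left(col) for col in y.T], axis=1)   # WHT_(2^i) (x) I
        return y.reshape(-1)
    return transform

def time_formula(transform, size, reps=200):
    x = np.random.rand(size)
    start = time.perf_counter()
    for _ in range(reps):
        transform(x)
    return time.perf_counter() - start

k = 6
candidates = list(formula_generator(k))
best = min(candidates,
           key=lambda f: time_formula(formula_translator(f, k), 2 ** k))
print(f"fastest of {len(candidates)} candidate formulas: {best}")
```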
International Conference on Computational Science | 2001
Markus Püschel; Bryan Singer; Manuela M. Veloso; José M. F. Moura
SPIRAL is a generator of optimized, platform-adapted libraries for digital signal processing algorithms. SPIRAL's strategy translates the implementation task into a search in an expanded space of alternatives. These result from the many degrees of freedom in the DSP algorithm itself and in the various coding choices. This paper describes the framework used to efficiently represent and generate these alternatives: the formula generator module in SPIRAL. We also address the search module, which works in tandem with the formula generator in a feedback loop to find optimal implementations. These modules are implemented using the computer algebra system GAP/AREP.
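As a rough illustration of how quickly the space of alternatives grows, and hence why a search module is needed rather than exhaustive timing, the sketch below counts the binary breakdown trees available for a Walsh-Hadamard transform of size 2^k under a single splitting rule. This is not the GAP/AREP formula generator; SPIRAL's rule set is far richer, so its actual space is larger still.

```python
# Illustrative only (not the GAP/AREP formula generator): count the alternative
# binary breakdown trees for WHT_(2^k) under the single splitting rule
#   WHT_(2^k) -> (WHT_(2^i) (x) I) (I (x) WHT_(2^(k-i))),  0 < i < k,
# to show how fast the space of mathematically equivalent formulas grows.
from functools import lru_cache

@lru_cache(maxsize=None)
def num_formulas(k):
    """Number of distinct binary split trees for a transform of size 2^k."""
    if k == 1:
        return 1
    return sum(num_formulas(i) * num_formulas(k - i) for i in range(1, k))

for k in (4, 8, 12, 16, 20):
    print(f"WHT_2^{k}: {num_formulas(k):,} alternative formulas")
```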
Conference on High Performance Computing (Supercomputing) | 2001
Bryan Singer; Manuela M. Veloso
This paper presents an evolutionary algorithm for finding optimal implementations of signal transforms and compares this approach against other search techniques. A single signal processing algorithm can be represented by a very large number of different but mathematically equivalent formulas. When these formulas are implemented in actual code, their running times unfortunately differ significantly. Signal processing algorithm optimization aims at finding the fastest formula. We present a new approach that successfully solves this problem, using an evolutionary stochastic search algorithm, STEER, to search through the very large space of formulas. We empirically compare STEER against other search methods, showing that it can find notably faster formulas while timing only a very small portion of the search space.
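The sketch below gives the flavor of an evolutionary search over formulas in the spirit of STEER, though it is not STEER itself: individuals are binary split trees, mutation regrows a random subtree, and truncation selection keeps the lowest-cost half of each generation. STEER also uses crossover and evaluates fitness by timing real generated code; here a synthetic cost function stands in so the example stays self-contained.

```python
# Evolutionary-search sketch in the spirit of STEER (not its implementation):
# individuals are binary split trees for a size-2^k transform, mutation regrows
# a random subtree, and selection keeps the lowest-cost half of the population.
# STEER times real generated code; the synthetic cost() below is a stand-in
# that merely prefers balanced splits, keeping the sketch self-contained.
import random

def random_tree(k):
    if k == 1:
        return ('leaf',)
    i = random.randint(1, k - 1)
    return ('split', i, random_tree(i), random_tree(k - i))

def mutate(tree, k):
    """Regrow one randomly chosen subtree."""
    if tree[0] == 'leaf' or random.random() < 0.3:
        return random_tree(k)
    _, i, left, right = tree
    if random.random() < 0.5:
        return ('split', i, mutate(left, i), right)
    return ('split', i, left, mutate(right, k - i))

def cost(tree, k):
    """Synthetic stand-in for measured runtime."""
    if tree[0] == 'leaf':
        return 1.0
    _, i, left, right = tree
    return abs(2 * i - k) + cost(left, i) + cost(right, k - i)

def steer_like_search(k, pop_size=30, generations=50):
    population = [random_tree(k) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda t: cost(t, k))
        survivors = population[: pop_size // 2]
        children = [mutate(random.choice(survivors), k) for _ in survivors]
        population = survivors + children
    return min(population, key=lambda t: cost(t, k))

best = steer_like_search(k=10)
print("best cost found:", cost(best, 10))
```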
IEEE Transactions on Signal Processing | 2002
Bryan Singer; Manuela M. Veloso
Fast implementations of discrete signal transforms, such as the discrete Fourier transform (DFT), the Walsh-Hadamard transform (WHT), and the discrete trigonometric transforms (DTTs), can be viewed as factorizations of their corresponding transformation matrices. A given signal transform can have many different factorizations, with each factorization represented by a unique but mathematically equivalent formula. When implemented in code, these formulas can have significantly different running times on the same processor, sometimes differing by an order of magnitude. Further, the optimal implementations on various processors are often different. Given this complexity, a crucial problem is automating the modeling and optimization of the performance of signal transform implementations. To enable computer modeling of signal processing performance, we have developed and analyzed more than 15 feature sets to describe formulas representing specific transforms. Using some of these features and a limited set of training data, we have successfully trained neural networks to learn to accurately predict performance of formulas with error rates less than 5%. In the direction of optimization, we have developed a new stochastic evolutionary algorithm known as STEER that finds fast implementations of a variety of signal transforms. STEER is able to optimize completely new transforms specified by a user. We present results that show that STEER can find discrete cosine transform formulas that are 10-20% faster than what a dynamic programming search finds.
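A small numpy check (illustrative, not code from the paper) of the factorization view described above: two different factorizations of WHT_8, corresponding to two different but mathematically equivalent formulas, reproduce the same transform matrix. The runtime differences the paper studies appear once such factorizations are implemented as recursive, matrix-free code rather than as dense matrix products.

```python
# Numerical check that different factorizations of the same transform matrix
# are mathematically equivalent (illustrative numpy code, not from the paper).
import numpy as np

WHT2 = np.array([[1.0, 1.0], [1.0, -1.0]])

def wht(k):
    """Direct definition: WHT_(2^k) as the k-fold Kronecker product of WHT_2."""
    m = WHT2
    for _ in range(k - 1):
        m = np.kron(m, WHT2)
    return m

I2, I4 = np.eye(2), np.eye(4)

# Two different factorizations (formulas) for WHT_8:
fact_a = np.kron(WHT2, I4) @ np.kron(I2, wht(2))   # (WHT_2 (x) I_4)(I_2 (x) WHT_4)
fact_b = np.kron(wht(2), I2) @ np.kron(I4, WHT2)   # (WHT_4 (x) I_2)(I_4 (x) WHT_2)

assert np.allclose(fact_a, wht(3))
assert np.allclose(fact_b, wht(3))
print("both factorizations reproduce WHT_8 exactly")
```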
Proceedings of the IEEE | 2005
Markus Püschel; José M. F. Moura; Jeremy R. Johnson; David A. Padua; Manuela M. Veloso; Bryan Singer; Jianxin Xiong; Franz Franchetti; Aca Gacic; Yevgen Voronenko; Kang Chen; Robert W. Johnson; Nicholas Rizzolo
International Conference on Machine Learning | 2000
Bryan Singer; Manuela M. Veloso
Journal of Machine Learning Research | 2003
Bryan Singer; Manuela M. Veloso
International Conference on Machine Learning | 2001
Bryan Singer; Manuela M. Veloso
National Conference on Artificial Intelligence | 1999
Bryan Singer; Manuela M. Veloso
Archive | 2000
Bryan Singer; Manuela M. Veloso