Publications


Featured research published by Michael S. Lewicki.


Neural Computation | 2000

Learning Overcomplete Representations

Michael S. Lewicki; Terrence J. Sejnowski

In an overcomplete basis, the number of basis vectors is greater than the dimensionality of the input, and the representation of an input is not a unique combination of basis vectors. Overcomplete representations have been advocated because they have greater robustness in the presence of noise, can be sparser, and can have greater flexibility in matching structure in the data. Overcomplete codes have also been proposed as a model of some of the response properties of neurons in primary visual cortex. Previous work has focused on finding the best representation of a signal using a fixed overcomplete basis (or dictionary). We present an algorithm for learning an overcomplete basis by viewing it as a probabilistic model of the observed data. We show that overcomplete bases can yield a better approximation of the underlying statistical distribution of the data and can thus lead to greater coding efficiency. This can be viewed as a generalization of the technique of independent component analysis and provides a method for Bayesian reconstruction of signals in the presence of noise and for blind source separation when there are more sources than mixtures.
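
As a concrete illustration of the two alternating steps the abstract describes, here is a minimal sketch in Python: infer sparse coefficients under a Laplacian-style prior, then take a gradient step on the basis. The ISTA inner solver, step sizes, and toy data are illustrative assumptions, not the authors' exact algorithm.

```python
# Sketch: learn an overcomplete basis A for the model x = A s + noise,
# with a sparse prior on s, by alternating inference and a basis update.
import numpy as np

rng = np.random.default_rng(0)

def ista(A, x, lam=0.1, n_iter=100):
    """Infer sparse coefficients s minimizing ||x - A s||^2 / 2 + lam ||s||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    s = np.zeros(A.shape[1])
    for _ in range(n_iter):
        s = s - A.T @ (A @ s - x) / L          # gradient step on the data term
        s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)  # soft threshold
    return s

def learn_basis(X, n_basis, lam=0.1, lr=0.05, n_epochs=20):
    """Alternate sparse inference and gradient steps on the basis A."""
    A = rng.standard_normal((X.shape[1], n_basis))
    A /= np.linalg.norm(A, axis=0)
    for _ in range(n_epochs):
        for x in X:
            s = ista(A, x, lam)
            A += lr * np.outer(x - A @ s, s)   # gradient of the log-likelihood
            A /= np.linalg.norm(A, axis=0)     # keep basis vectors unit norm
    return A

# Toy usage: 20 basis vectors for 10-dimensional data, i.e. 2x overcomplete.
X = rng.standard_normal((200, 10))
A = learn_basis(X, n_basis=20)
print(A.shape)  # (10, 20)
```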


Network: Computation in Neural Systems | 1998

A review of methods for spike sorting: the detection and classification of neural action potentials

Michael S. Lewicki

The detection of neural spike activity is a technical challenge that is a prerequisite for studying many types of brain function. Measuring the activity of individual neurons accurately can be difficult due to large amounts of background noise and the difficulty in distinguishing the action potentials of one neuron from those of others in the local area. This article reviews algorithms and methods for detecting and classifying action potentials, a problem commonly referred to as spike sorting. The article first discusses the challenges of measuring neural activity and the basic issues of signal detection and classification. It reviews and illustrates algorithms and techniques that have been applied to many of the problems in spike sorting and discusses the advantages and limitations of each and the applicability of these methods for different types of experimental demands. The article is written both for the physiologist wanting to use simple methods that will improve experimental yield and minimize the selection biases of traditional techniques and for those who want to apply or extend more sophisticated algorithms to meet new experimental challenges.
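
The pipeline the review covers can be illustrated with a minimal sketch: detect threshold crossings, extract waveform snippets, reduce their dimensionality, and cluster them into putative units. The robust noise estimate, window size, and PCA/k-means choices are common defaults, not a method the article singles out.

```python
# Sketch of a basic spike-sorting pipeline: detection, feature
# extraction, and clustering of extracellular waveform snippets.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def detect_spikes(signal, fs, thresh_sd=4.0, window_ms=2.0):
    """Return waveform snippets around negative threshold crossings."""
    sigma = np.median(np.abs(signal)) / 0.6745     # robust noise estimate
    half = int(window_ms * fs / 2000)              # half-window in samples
    crossings = np.where((signal[1:] < -thresh_sd * sigma) &
                         (signal[:-1] >= -thresh_sd * sigma))[0]
    return np.array([signal[i - half:i + half] for i in crossings
                     if half <= i < len(signal) - half])

def sort_spikes(snippets, n_units=3):
    """Project snippets onto 2 principal components and cluster into units."""
    feats = PCA(n_components=2).fit_transform(snippets)
    return KMeans(n_clusters=n_units, n_init=10).fit_predict(feats)

# Toy usage on synthetic noise (real input would be an electrode trace).
fs = 20000
trace = np.random.default_rng(1).standard_normal(fs)
snippets = detect_spikes(trace, fs)
if len(snippets) >= 3:                             # need enough events to cluster
    labels = sort_spikes(snippets)
```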


Nature Neuroscience | 2002

Efficient coding of natural sounds

Michael S. Lewicki

The auditory system encodes sound by decomposing the amplitude signal arriving at the ear into multiple frequency bands whose center frequencies and bandwidths are approximately exponential functions of the distance from the stapes. This organization is thought to result from the adaptation of cochlear mechanisms to the animal's auditory environment. Here we report that several basic auditory nerve fiber tuning properties can be accounted for by adapting a population of filter shapes to encode natural sounds efficiently. The form of the code depends on sound class, resembling a Fourier transformation when optimized for animal vocalizations and a wavelet transformation when optimized for non-biological environmental sounds. Only for the combined set does the optimal code follow scaling characteristics of physiological data. These results suggest that auditory nerve fibers encode a broad set of natural sounds in a manner consistent with information theoretic principles.
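
A minimal sketch of the adaptation idea, assuming ICA as the optimizer: learn a set of linear filters that encode short sound segments efficiently. FastICA and the windowing parameters stand in for the paper's procedure; only with natural sounds, rather than the toy noise here, do the learned filters take on cochlear-like shapes.

```python
# Sketch: adapt a population of linear filters to encode short sound
# segments efficiently, using ICA as the optimization.
import numpy as np
from sklearn.decomposition import FastICA

def learn_sound_filters(audio, win=128, n_filters=32, n_segments=5000):
    """Fit ICA to random sound segments; unmixing rows act as temporal filters."""
    rng = np.random.default_rng(0)
    starts = rng.integers(0, len(audio) - win, n_segments)
    X = np.stack([audio[s:s + win] for s in starts])
    ica = FastICA(n_components=n_filters, whiten="unit-variance", max_iter=500)
    ica.fit(X)
    return ica.components_        # each row is a learned filter

# Toy usage on noise; trained on natural sounds, the filters become
# localized and bandpass, like cochlear filters.
audio = np.random.default_rng(2).standard_normal(200000)
filters = learn_sound_filters(audio)
print(filters.shape)  # (32, 128)
```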


IEEE Signal Processing Letters | 1999

Blind source separation of more sources than mixtures using overcomplete representations

Te-Won Lee; Michael S. Lewicki; Mark A. Girolami; Terrence J. Sejnowski

Empirical results were obtained for the blind source separation of more sources than mixtures using a previously proposed framework for learning overcomplete representations. This technique assumes a linear mixing model with additive noise and involves two steps: (1) learning an overcomplete representation for the observed data and (2) inferring sources given a sparse prior on the coefficients. We demonstrate that three speech signals can be separated with good fidelity given only two mixtures of the three signals. Similar results were obtained with mixtures of two speech signals and one music signal.
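
Given the learned (here, assumed known) mixing matrix, step (2) with a Laplacian prior reduces to minimum-L1 inference per sample. A minimal sketch using a standard linear-programming formulation, not the paper's inference code:

```python
# Sketch: recover 3 sparse sources from 2 mixtures, with the mixing
# matrix A assumed known, by minimum-L1 (MAP under a Laplacian prior).
import numpy as np
from scipy.optimize import linprog

def infer_sparse_sources(A, x):
    """Solve min ||s||_1 subject to A s = x, writing s = u - v with u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                     # objective: sum(u) + sum(v) = ||s||_1
    A_eq = np.hstack([A, -A])              # A (u - v) = x
    res = linprog(c, A_eq=A_eq, b_eq=x, bounds=[(0, None)] * (2 * n))
    return res.x[:n] - res.x[n:]

# Toy usage: 2 mixtures of 3 sources at one time point.
rng = np.random.default_rng(3)
A = rng.standard_normal((2, 3))
s_true = np.array([0.0, 1.5, 0.0])         # sparse source sample
x = A @ s_true
print(infer_sparse_sources(A, x))          # ~ s_true when L1 recovery holds
```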


Nature | 2006

Efficient auditory coding

Evan Smith; Michael S. Lewicki

The auditory neural code must serve a wide range of auditory tasks that require great sensitivity in time and frequency and be effective over the diverse array of sounds present in natural acoustic environments. It has been suggested that sensory systems might have evolved highly efficient coding strategies to maximize the information conveyed to the brain while minimizing the required energy and neural resources. Here we show that, for natural sounds, the complete acoustic waveform can be represented efficiently with a nonlinear model based on a population spike code. In this model, idealized spikes encode the precise temporal positions and magnitudes of underlying acoustic features. We find that when the features are optimized for coding either natural sounds or speech, they show striking similarities to time-domain cochlear filter estimates, have a frequency-bandwidth dependence similar to that of auditory nerve fibres, and yield significantly greater coding efficiency than conventional signal representations. These results indicate that the auditory code might approach an information theoretic optimum and that the acoustic structure of speech might be adapted to the coding capacity of the mammalian auditory system.
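
The population spike code can be sketched with matching pursuit: greedily place kernels at precise times and magnitudes until the waveform is explained. The decaying-sinusoid kernels are an illustrative stand-in for the optimized features the paper reports.

```python
# Sketch: encode a waveform as a sum of kernels placed at precise times
# with magnitudes, via greedy matching pursuit.
import numpy as np

def make_kernels(n_kernels=16, length=64, fs=16000):
    """Decaying sinusoids at log-spaced frequencies, normalized to unit L2 norm."""
    t = np.arange(length) / fs
    freqs = np.geomspace(100, 4000, n_kernels)
    K = np.array([np.exp(-80 * t) * np.sin(2 * np.pi * f * t) for f in freqs])
    return K / np.linalg.norm(K, axis=1, keepdims=True)

def matching_pursuit(signal, K, n_spikes=100):
    """Return (kernel index, time, magnitude) 'spikes' and the residual."""
    resid = signal.astype(float).copy()
    spikes = []
    for _ in range(n_spikes):
        # cross-correlate every kernel with the current residual
        corr = np.array([np.correlate(resid, k, mode="valid") for k in K])
        ki, ti = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
        a = corr[ki, ti]
        resid[ti:ti + K.shape[1]] -= a * K[ki]   # subtract the best-matching kernel
        spikes.append((ki, ti, a))
    return spikes, resid

# Toy usage: encode a short noisy signal with 100 'spikes'.
sig = np.random.default_rng(4).standard_normal(2000)
spikes, resid = matching_pursuit(sig, make_kernels())
```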


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1999

Probabilistic framework for the adaptation and comparison of image codes

Michael S. Lewicki; Bruno A. Olshausen

We apply a Bayesian method for inferring an optimal basis to the problem of finding efficient image codes for natural scenes. The basis functions learned by the algorithm are oriented and localized in both space and frequency, bearing a resemblance to two-dimensional Gabor functions, and increasing the number of basis functions results in a greater sampling density in position, orientation, and scale. These properties also resemble the spatial receptive fields of neurons in the primary visual cortex of mammals, suggesting that the receptive-field structure of these neurons can be accounted for by a general efficient coding principle. The probabilistic framework provides a method for comparing the coding efficiency of different bases objectively by calculating their probability given the observed data or by measuring the entropy of the basis function coefficients. The learned bases are shown to have better coding efficiency than traditional Fourier and wavelet bases. This framework also provides a Bayesian solution to the problems of image denoising and filling in of missing pixels. We demonstrate that the results obtained by applying the learned bases to these problems are improved over those obtained with traditional techniques.
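
One comparison the framework supports can be sketched directly: estimate the entropy of quantized basis-function coefficients on the same data and prefer the lower-entropy code. The bin width and histogram estimator are assumptions, not the paper's exact procedure.

```python
# Sketch: compare two bases by the entropy (bits per coefficient) of
# their quantized coefficients on the same data; lower is more efficient.
import numpy as np

def coefficient_entropy(basis, X, bin_width=0.1):
    """Mean entropy of quantized coefficients, estimated by histogram."""
    coeffs = X @ np.linalg.pinv(basis).T        # infer coefficients per patch
    bits = 0.0
    for c in coeffs.T:                          # one entropy per basis function
        _, counts = np.unique(np.round(c / bin_width), return_counts=True)
        p = counts / counts.sum()
        bits -= (p * np.log2(p)).sum()
    return bits / coeffs.shape[1]

# Toy usage: pixel basis vs. a DCT-like basis on random 'patches'.
rng = np.random.default_rng(5)
X = rng.standard_normal((500, 16))
pixel_basis = np.eye(16)
dct_basis = np.array([[np.cos(np.pi * k * (2 * n + 1) / 32)
                       for n in range(16)] for k in range(16)]).T
print(coefficient_entropy(pixel_basis, X), coefficient_entropy(dct_basis, X))
```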


Neural Computation | 1994

Bayesian modeling and classification of neural signals

Michael S. Lewicki

Identifying and classifying action potential shapes in extracellular neural waveforms have long been the subject of research, and although several algorithms for this purpose have been successfully applied, their use has been limited by some outstanding problems. The first is how to determine the shapes of the action potentials in the waveform; the second is how to decide how many distinct shapes there are. A harder problem is that action potentials frequently overlap, making both the determination of the shapes and the classification of the spikes difficult. In this report, a solution to each of these problems is obtained by applying Bayesian probability theory. Defining a probabilistic model of the waveform makes it possible to quantify the probability of both the form and the number of spike shapes. In addition, this framework is used to obtain an efficient algorithm for the decomposition of arbitrarily complex overlap sequences. This algorithm can extract many times more information than previous methods and facilitates the extracellular investigation of neuronal classes and of interactions within neuronal circuits.
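
The Bayesian scoring idea can be sketched as follows: under a Gaussian noise model, the log probability of a candidate explanation is a penalized squared error, so single spikes and overlap hypotheses are compared on the same footing. The toy templates and time-aligned overlaps simplify the full model, which also searches over relative spike times.

```python
# Sketch: under Gaussian noise, score candidate explanations (single
# spikes or sums of spikes) by log-likelihood and pick the best.
import numpy as np
from itertools import combinations

def log_likelihood(waveform, prediction, sigma=1.0):
    """Gaussian log-likelihood of the waveform given a predicted shape."""
    r = waveform - prediction
    return -0.5 * np.sum(r ** 2) / sigma ** 2

def best_explanation(waveform, templates, sigma=1.0):
    """Compare single templates and all pairwise (time-aligned) overlaps."""
    hypotheses = {(i,): templates[i] for i in range(len(templates))}
    for i, j in combinations(range(len(templates)), 2):
        hypotheses[(i, j)] = templates[i] + templates[j]
    return max(hypotheses,
               key=lambda h: log_likelihood(waveform, hypotheses[h], sigma))

# Toy usage: a noisy overlap of templates 0 and 2 is identified.
rng = np.random.default_rng(6)
T = rng.standard_normal((3, 32))
obs = T[0] + T[2] + 0.1 * rng.standard_normal(32)
print(best_explanation(obs, T))   # (0, 2)
```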


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2000

ICA mixture models for unsupervised classification of non-Gaussian classes and automatic context switching in blind signal separation

Te-Won Lee; Michael S. Lewicki; Terrence J. Sejnowski

An unsupervised classification algorithm is derived by modeling observed data as a mixture of several mutually exclusive classes that are each described by linear combinations of independent, non-Gaussian densities. The algorithm estimates the density of each class and is able to model class distributions with non-Gaussian structure. The new algorithm can improve classification accuracy compared with standard Gaussian mixture models. When applied to blind source separation in nonstationary environments, the method can switch automatically between classes, which correspond to contexts with different mixing properties. The algorithm can learn efficient codes for images containing both natural scenes and text. This method shows promise for modeling non-Gaussian structure in high-dimensional data and has many potential applications.
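
The class-conditional structure can be sketched with the classification step alone: each class has its own ICA unmixing matrix with non-Gaussian (here Laplacian) source densities, and an observation is assigned to the class with the highest posterior. The EM-style learning updates are omitted and the matrices are toy values.

```python
# Sketch: each class has its own ICA unmixing matrix with Laplacian
# source densities; classify by the largest posterior.
import numpy as np

def ica_log_likelihood(x, W):
    """log p(x | class) for square ICA with unit-scale Laplacian sources."""
    s = W @ x
    return np.linalg.slogdet(W)[1] - np.abs(s).sum() - len(s) * np.log(2)

def classify(x, unmixing_matrices, log_priors):
    """Pick the class maximizing log prior + class-conditional log-likelihood."""
    scores = [lp + ica_log_likelihood(x, W)
              for W, lp in zip(unmixing_matrices, log_priors)]
    return int(np.argmax(scores))

# Toy usage with two random 4x4 unmixing matrices and equal priors.
rng = np.random.default_rng(7)
Ws = [rng.standard_normal((4, 4)) for _ in range(2)]
x = rng.standard_normal(4)
print(classify(x, Ws, np.log([0.5, 0.5])))
```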


Nature | 2009

Emergence of complex cell properties by learning to generalize in natural scenes

Yan Karklin; Michael S. Lewicki

A fundamental function of the visual system is to encode the building blocks of natural scenes—edges, textures and shapes—that subserve visual tasks such as object recognition and scene understanding. Essential to this process is the formation of abstract representations that generalize from specific instances of visual input. A common view holds that neurons in the early visual system signal conjunctions of image features, but how these produce invariant representations is poorly understood. Here we propose that to generalize over similar images, higher-level visual neurons encode statistical variations that characterize local image regions. We present a model in which neural activity encodes the probability distribution most consistent with a given image. Trained on natural images, the model generalizes by learning a compact set of dictionary elements for image distributions typically encountered in natural scenes. Model neurons show a diverse range of properties observed in cortical cells. These results provide a new functional explanation for nonlinear effects in complex cells and offer insight into coding strategies in primary visual cortex (V1) and higher visual areas.
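
One way to make the encoding concrete is a small sketch in which the model response to an image patch is the set of latent activations whose implied distribution best matches the patch; the zero-mean Gaussian with matrix-exponential covariance used here is an assumed parameterization, and the dictionary is random rather than trained.

```python
# Sketch: the model 'response' to a patch x is the latent vector y whose
# implied distribution, here N(0, C(y)) with C(y) = expm(sum_j y_j B_j),
# best explains x. The dictionary matrices B_j are random placeholders.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

rng = np.random.default_rng(8)
d, n_comp = 6, 3
B = [S + S.T for S in rng.standard_normal((n_comp, d, d)) * 0.1]  # symmetric

def neg_log_likelihood(y, x):
    """Negative Gaussian log-likelihood of patch x under covariance C(y)."""
    C = expm(sum(yj * Bj for yj, Bj in zip(y, B)))
    _, logdet = np.linalg.slogdet(C)
    return 0.5 * (logdet + x @ np.linalg.solve(C, x))

def infer_response(x):
    """Find the y that makes the implied distribution most consistent with x."""
    return minimize(neg_log_likelihood, np.zeros(n_comp), args=(x,)).x

x = rng.standard_normal(d)
print(infer_response(x))
```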


Neural Computation | 2005

A Hierarchical Bayesian Model for Learning Nonlinear Statistical Regularities in Nonstationary Natural Signals

Yan Karklin; Michael S. Lewicki

Capturing statistical regularities in complex, high-dimensional data is an important problem in machine learning and signal processing. Models such as principal component analysis (PCA) and independent component analysis (ICA) make few assumptions about the structure in the data and have good scaling properties, but they are limited to representing linear statistical regularities and assume that the distribution of the data is stationary. For many natural, complex signals, the latent variables often exhibit residual dependencies as well as nonstationary statistics. Here we present a hierarchical Bayesian model that is able to capture higher-order nonlinear structure and represent nonstationary data distributions. The model is a generalization of ICA in which the basis function coefficients are no longer assumed to be independent; instead, the dependencies in their magnitudes are captured by a set of density components. Each density component describes a common pattern of deviation from the marginal density of the pattern ensemble; in different combinations, they can describe nonstationary distributions. Adapting the model to image or audio data yields a nonlinear, distributed code for higher-order statistical regularities that reflect more abstract, invariant properties of the signal.
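
A minimal generative sketch of the hierarchy: higher-order variables set the log variances of the basis-function coefficients through a set of density components, so coefficient magnitudes co-vary. All dimensions and distributions here are illustrative assumptions.

```python
# Sketch: sample from the hierarchy. Higher-order causes y set the log
# variances of the coefficients s through density components B, so
# coefficient magnitudes co-vary instead of being independent.
import numpy as np

rng = np.random.default_rng(9)
d, n_basis, n_density = 16, 16, 4
A = rng.standard_normal((d, n_basis))           # first-layer basis, as in ICA
B = rng.standard_normal((n_basis, n_density))   # density components

def sample_patch():
    """Draw y, map it to coefficient variances, then generate one data vector."""
    y = rng.laplace(size=n_density)              # sparse higher-order causes
    log_var = B @ y                              # shared pattern of variances
    s = rng.laplace(size=n_basis) * np.exp(log_var / 2)
    return A @ s

# Data drawn this way exhibit the nonstationary, higher-order structure
# the model is designed to capture.
X = np.stack([sample_patch() for _ in range(1000)])
print(X.shape)  # (1000, 16)
```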

Collaboration


Michael S. Lewicki's top co-authors:

Terrence J. Sejnowski (Salk Institute for Biological Studies)
Yan Karklin (Carnegie Mellon University)
Eizaburo Doi (Carnegie Mellon University)
Evan Smith (Carnegie Mellon University)
Phil Sallee (University of California)
Sofia Cavaco (Carnegie Mellon University)