Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Cengiz Pehlevan is active.

Publication


Featured research published by Cengiz Pehlevan.


Neuron | 2013

The Basal Ganglia Is Necessary for Learning Spectral, but Not Temporal, Features of Birdsong

Farhan Ali; Timothy M. Otchy; Cengiz Pehlevan; Antoniu L. Fantana; Yoram Burak; Bence P. Ölveczky

Executing a motor skill requires the brain to control which muscles to activate at what times. How these aspects of control (motor implementation and timing) are acquired, and whether the learning processes underlying them differ, is not well understood. To address this, we used a reinforcement learning paradigm to independently manipulate both spectral and temporal features of birdsong, a complex learned motor sequence, while recording and perturbing activity in underlying circuits. Our results uncovered a striking dissociation in how neural circuits underlie learning in the two domains. The basal ganglia was required for modifying spectral, but not temporal, structure. This functional dissociation extended to the descending motor pathway, where recordings from a premotor cortex analog nucleus reflected changes to temporal, but not spectral, structure. Our results reveal a strategy in which the nervous system employs different and largely independent circuits to learn distinct aspects of a motor skill.


Nuclear Physics B | 2009

Complex Langevin equations and Schwinger–Dyson equations

Gerald S. Guralnik; Cengiz Pehlevan

Stationary distributions of complex Langevin equations are shown to be the complexified path integral solutions of the Schwinger–Dyson equations of the associated quantum field theory. Specific examples in zero dimensions and on a lattice are given. The relevance to the study of quantum field theory solution space is discussed.
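To make the stated correspondence concrete, here is a zero-dimensional sketch in standard textbook form; the generic action S(φ) and the moment relations below are illustrative, not the paper's specific examples:

```latex
% Zero-dimensional complex Langevin dynamics in fictitious time \tau:
\frac{d\phi}{d\tau} = -\frac{\partial S(\phi)}{\partial \phi} + \eta(\tau),
\qquad
\langle \eta(\tau)\,\eta(\tau') \rangle = 2\,\delta(\tau - \tau').
% If the process reaches a stationary distribution, its expectation values
% satisfy the Schwinger--Dyson equations of the associated theory,
% obtained from \int d\phi \, \partial_\phi \!\left( \phi^{n} e^{-S} \right) = 0:
\left\langle \phi^{n}\,\frac{\partial S}{\partial \phi} \right\rangle
  = n \left\langle \phi^{\,n-1} \right\rangle,
\qquad n = 0, 1, 2, \ldots
```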


Neural Computation | 2015

A Hebbian/Anti-Hebbian neural network for linear subspace learning: A derivation from multidimensional scaling of streaming data

Cengiz Pehlevan; Tao Hu; Dmitri B. Chklovskii

Neural network models of early sensory processing typically reduce the dimensionality of streaming input data. Such networks learn the principal subspace, in the sense of principal component analysis, by adjusting synaptic weights according to activity-dependent learning rules. When derived from a principled cost function, these rules are nonlocal and hence biologically implausible. At the same time, biologically plausible local rules have been postulated rather than derived from a principled cost function. Here, to bridge this gap, we derive a biologically plausible network for subspace learning on streaming data by minimizing a principled cost function. In a departure from previous work, where cost was quantified by the representation, or reconstruction, error, we adopt a multidimensional scaling cost function for streaming data. The resulting algorithm relies only on biologically plausible Hebbian and anti-Hebbian local learning rules. In a stochastic setting, synaptic weights converge to a stationary state, which projects the input data onto the principal subspace. If the data are generated by a nonstationary distribution, the network can track the principal subspace. Thus, our result takes a step toward an algorithmic theory of neural computation.
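To give a concrete flavor of such an algorithm, here is a minimal sketch in Python. The fixed learning rate eta, the initialization, and the exact form of the updates are simplifying assumptions for illustration; the paper derives its rules from the multidimensional scaling cost, and they differ in detail.

```python
import numpy as np

def similarity_matching_sketch(X, k, eta=0.01):
    """Hebbian/anti-Hebbian online subspace learning (simplified sketch).

    X: (n_samples, d) stream of centered inputs; k: output dimension.
    W: feedforward weights (Hebbian); M: lateral weights (anti-Hebbian).
    """
    d = X.shape[1]
    rng = np.random.default_rng(0)
    W = rng.standard_normal((k, d)) / np.sqrt(d)
    M = np.eye(k)
    for x in X:
        # Recurrent dynamics settle at the fixed point y = M^{-1} W x.
        y = np.linalg.solve(M, W @ x)
        # Hebbian feedforward update, anti-Hebbian lateral update.
        W += eta * (np.outer(y, x) - W)
        M += eta * (np.outer(y, y) - M)
    return W, M

# Usage sketch: data with a dominant 2-D subspace.
rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 10)) @ np.diag([5.0, 4.0] + [0.1] * 8)
W, M = similarity_matching_sketch(X, k=2)
print(np.linalg.solve(M, W).round(2))  # effective filters span the top subspace
```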


PLOS ONE | 2014

Selectivity and Sparseness in Randomly Connected Balanced Networks

Cengiz Pehlevan; Haim Sompolinsky

Neurons in sensory cortex show stimulus selectivity and sparse population response, even in cases where no strong functionally specific structure in connectivity can be detected. This raises the question whether selectivity and sparseness can be generated and maintained in randomly connected networks. We consider a recurrent network of excitatory and inhibitory spiking neurons with random connectivity, driven by random projections from an input layer of stimulus selective neurons. In this architecture, the stimulus-to-stimulus and neuron-to-neuron modulation of total synaptic input is weak compared to the mean input. Surprisingly, we show that in the balanced state the network can still support high stimulus selectivity and sparse population response. In the balanced state, strong synapses amplify the variation in synaptic input and recurrent inhibition cancels the mean. Functional specificity in connectivity emerges due to the inhomogeneity caused by the generative statistical rule used to build the network. We further elucidate the underlying mechanism and evaluate the effects of model parameters on population sparseness and stimulus selectivity. Network response to mixtures of stimuli is investigated. It is shown that a balanced state with unselective inhibition can be achieved with densely connected input to the inhibitory population. Balanced networks exhibit the “paradoxical” effect: an increase in excitatory drive to inhibition leads to a decreased inhibitory population firing rate. We compare and contrast selectivity and sparseness generated by the balanced network to randomly connected unbalanced networks. Finally, we discuss our results in light of experiments.
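The "paradoxical" effect can be seen already in a two-population linear rate model, a standard reduction rather than the paper's spiking network; the weight values below are made up for illustration:

```python
import numpy as np

# Linear E-I rate model with steady state r = W r + h, where
# W = [[w_ee, -w_ei], [w_ie, -w_ii]].  When recurrent excitation is strong
# (w_ee > 1, the "inhibition-stabilized" regime), increasing the excitatory
# drive h_i to the inhibitory population *lowers* the inhibitory rate r_i.
W = np.array([[1.5, -1.0],
              [2.0, -0.5]])

def steady_state(h):
    # Solve (I - W) r = h for the fixed point of dr/dt = -r + W r + h.
    return np.linalg.solve(np.eye(2) - W, h)

r_low  = steady_state(np.array([1.0, 1.0]))
r_high = steady_state(np.array([1.0, 1.5]))  # more drive to inhibition
print("r_I before:", r_low[1], " after:", r_high[1])  # 1.2 -> 1.0: it drops
```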


Nuclear Physics B | 2009

Effective potential for complex Langevin equations

Gerald S. Guralnik; Cengiz Pehlevan

We construct an effective potential for the complex Langevin equation on a lattice. We show that the minimum of this effective potential gives the space–time and Langevin time average of the complex Langevin field. The loop expansion of the effective potential is matched with the derivative expansion of the associated Schwinger–Dyson equation to predict the stationary distribution to which the complex Langevin equation converges.


Asilomar Conference on Signals, Systems and Computers | 2014

A Hebbian/Anti-Hebbian network for online sparse dictionary learning derived from symmetric matrix factorization

Tao Hu; Cengiz Pehlevan; Dmitri B. Chklovskii

Olshausen and Field (OF) proposed that neural computations in the primary visual cortex (V1) can be partially modelled by sparse dictionary learning. By minimizing the regularized representation error they derived an online algorithm, which learns Gabor-filter receptive fields from a natural image ensemble in agreement with physiological experiments. Whereas the OF algorithm can be mapped onto the dynamics and synaptic plasticity in a single-layer neural network, the derived learning rule is nonlocal (the synaptic weight update depends on the activity of neurons other than just the pre- and postsynaptic ones) and hence biologically implausible. Here, to overcome this problem, we derive sparse dictionary learning from a novel cost function: a regularized error of the symmetric factorization of the input similarity matrix. Our algorithm maps onto a neural network of the same architecture as OF but using only biologically plausible local learning rules. When trained on natural images, our network learns Gabor-filter receptive fields and reproduces the correlation among synaptic weights hard-wired in the OF network. Therefore, online symmetric matrix factorization may serve as an algorithmic theory of neural computation.
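For orientation, here is a compact sketch of the classical Olshausen-Field-style objective referenced above, min ||x - D s||^2 + lambda ||s||_1, solved by alternating a sparse coding step (ISTA) with a dictionary gradient step. This illustrates the OF setup the paper starts from, not its symmetric-matrix-factorization network; all parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def of_style_dictionary_learning(X, k, lam=0.1, eta=0.05, n_epochs=10):
    """Sketch of sparse dictionary learning: min_{D,s} ||x - D s||^2 + lam*||s||_1."""
    d = X.shape[1]
    rng = np.random.default_rng(0)
    D = rng.standard_normal((d, k))
    D /= np.linalg.norm(D, axis=0) + 1e-12     # unit-norm dictionary columns
    for _ in range(n_epochs):
        for x in X:
            # Sparse coding step: a few ISTA iterations for the code s.
            s = np.zeros(k)
            L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the gradient
            for _ in range(20):
                s = soft_threshold(s + D.T @ (x - D @ s) / L, lam / L)
            # Dictionary step: gradient of the reconstruction error, renormalize.
            D += eta * np.outer(x - D @ s, s)
            D /= np.linalg.norm(D, axis=0) + 1e-12
    return D
```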


Asilomar Conference on Signals, Systems and Computers | 2014

A Hebbian/Anti-Hebbian network derived from online non-negative matrix factorization can cluster and discover sparse features

Cengiz Pehlevan; Dmitri B. Chklovskii

Despite our extensive knowledge of biophysical properties of neurons, there is no commonly accepted algorithmic theory of neuronal function. Here we explore the hypothesis that single-layer neuronal networks perform online symmetric nonnegative matrix factorization (SNMF) of the similarity matrix of the streamed data. By starting with the SNMF cost function we derive an online algorithm, which can be implemented by a biologically plausible network with local learning rules. We demonstrate that such a network performs soft clustering of the data as well as sparse feature discovery. The derived algorithm replicates many known aspects of sensory anatomy and biophysical properties of neurons, including the unipolar nature of neuronal activity and synaptic weights, local synaptic plasticity rules, and the dependence of learning rate on cumulative neuronal activity. Thus, we take a step towards an algorithmic theory of neuronal function, which should facilitate large-scale neural circuit simulations and biologically inspired artificial intelligence.
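A toy offline counterpart of the cost in question can be written in a few lines. The damped multiplicative update below is a common heuristic for symmetric NMF, assumed here for simplicity; the paper instead derives an online network with local learning rules.

```python
import numpy as np

def snmf_sketch(X, k, n_iter=500):
    """Symmetric NMF sketch: min_{Y >= 0} ||X^T X - Y^T Y||_F^2.

    X: (d, n) data matrix; Y: (k, n) nonnegative outputs.  Offline and
    batch, unlike the paper's online network; for illustration only.
    """
    n = X.shape[1]
    S = X.T @ X                       # input similarity matrix (n, n)
    rng = np.random.default_rng(0)
    Y = np.abs(rng.standard_normal((k, n)))
    for _ in range(n_iter):
        # Damped multiplicative update; keeps Y nonnegative by construction.
        Y *= 0.5 + 0.5 * (Y @ S) / (Y @ Y.T @ Y + 1e-12)
    return Y

# Usage sketch: two well-separated blobs activate different rows of Y,
# i.e. the factorization behaves like soft clustering.
rng = np.random.default_rng(1)
blob1 = rng.normal([1.0, 0.0], 0.1, size=(20, 2)).T
blob2 = rng.normal([0.0, 1.0], 0.1, size=(20, 2)).T
Y = snmf_sketch(np.hstack([blob1, blob2]), k=2)
print(Y.round(2))
```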


Asilomar Conference on Signals, Systems and Computers | 2013

A neuron as a signal processing device

Tao Hu; Zaid J. Towfic; Cengiz Pehlevan; Alex V. Genkin; Dmitri B. Chklovskii

A neuron is a basic physiological and computational unit of the brain. While much is known about the physiological properties of a neuron, its computational role is poorly understood. Here we propose to view a neuron as a signal processing device that represents the incoming streaming data matrix as a sparse vector of synaptic weights scaled by an outgoing sparse activity vector. Formally, a neuron minimizes a cost function comprising a cumulative squared representation error and regularization terms. We derive an online algorithm that minimizes such a cost function by alternating between minimization with respect to activity and minimization with respect to synaptic weights. The steps of this algorithm reproduce well-known physiological properties of a neuron, such as weighted summation and leaky integration of synaptic inputs, as well as an Oja-like, but parameter-free, synaptic learning rule. Our theoretical framework makes several predictions, some of which can be verified from existing data, while others require further experiments. Such a framework should allow modeling the function of neuronal circuits without necessarily measuring all the microscopic biophysical parameters, as well as facilitate the design of neuromorphic electronics.
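A minimal sketch of this picture for a single linear neuron: the weight update is the classical Oja rule, and the "parameter-free" aspect is captured by setting the learning rate from cumulative squared activity. The initialization and the omission of leaky integration dynamics are simplifications.

```python
import numpy as np

def online_neuron_sketch(X):
    """Single linear neuron with an Oja-like, parameter-free update.

    Alternates between computing activity y = w . x and updating the
    weights, with the learning rate set by cumulative squared activity.
    """
    d = X.shape[1]
    w = np.random.default_rng(0).standard_normal(d) / np.sqrt(d)
    y2_cum = 1e-8                        # cumulative activity (avoids divide-by-zero)
    for x in X:
        y = w @ x                        # weighted summation of inputs
        y2_cum += y * y
        w += (y / y2_cum) * (x - y * w)  # Oja-like rule, rate = 1 / cumulative activity
    return w

# Usage sketch: w converges toward the top principal direction of the stream.
rng = np.random.default_rng(1)
X = rng.standard_normal((5000, 5)) * np.array([3.0, 1.0, 1.0, 1.0, 1.0])
print(online_neuron_sketch(X).round(2))
```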


Allerton Conference on Communication, Control, and Computing | 2015

Optimization theory of Hebbian/anti-Hebbian networks for PCA and whitening

Cengiz Pehlevan; Dmitri B. Chklovskii

In analyzing information streamed by sensory organs, our brains face challenges similar to those solved in statistical signal processing. This suggests that biologically plausible implementations of online signal processing algorithms may model neural computation. Here, we focus on such workhorses of signal processing as Principal Component Analysis (PCA) and whitening, which maximize information transmission in the presence of noise. We adopt the similarity matching framework, recently developed for principal subspace extraction, but modify the existing objective functions by adding a decorrelating term. From the modified objective functions, we derive online PCA and whitening algorithms which are implementable by neural networks with local learning rules, i.e., synaptic weight updates that depend on the activity of only pre- and postsynaptic neurons. Our theory offers a principled model of neural computations and makes testable predictions such as the dropout of underutilized neurons.
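As a reference point for what the whitening network is meant to compute, here is an offline eigendecomposition-based sketch of the target computation (not the online algorithm or its neural implementation):

```python
import numpy as np

def whiten(X, k):
    """Project centered data onto the top-k principal components and whiten.

    Returns Y with identity covariance in the retained subspace, i.e.
    Y Y^T / n ~ I_k: the decorrelated output the network aims for.
    """
    Xc = X - X.mean(axis=1, keepdims=True)      # center; X is (d, n)
    C = Xc @ Xc.T / Xc.shape[1]                 # covariance (d, d)
    evals, evecs = np.linalg.eigh(C)            # ascending eigenvalues
    top = np.argsort(evals)[::-1][:k]
    U, lam = evecs[:, top], evals[top]
    return (U / np.sqrt(lam)).T @ Xc            # (k, n), whitened

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 2000)) * np.array([[4.0], [2.0], [1.0], [1.0], [1.0]])
Y = whiten(X, k=2)
print((Y @ Y.T / Y.shape[1]).round(2))          # ~ identity matrix
```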


Nature Communications | 2018

Flexibility in motor timing constrains the topology and dynamics of pattern generator circuits

Cengiz Pehlevan; Farhan Ali; Bence P. Ölveczky

Temporally precise movement patterns underlie many motor skills and innate actions, yet the flexibility with which the timing of such stereotyped behaviors can be modified is poorly understood. To probe this, we induce adaptive changes to the temporal structure of birdsong. We find that the duration of specific song segments can be modified without affecting the timing in other parts of the song. We derive formal prescriptions for how neural networks can implement such flexible motor timing. We find that randomly connected recurrent networks, a common approximation for how neocortex is wired, do not generally conform to these, though certain implementations can approximate them. We show that feedforward networks, by virtue of their one-to-one mapping between network activity and time, are better suited. Our study provides general prescriptions for pattern generator networks that implement flexible motor timing, an important aspect of many motor skills, including birdsong and human speech.

Human speech and bird song require the generation of precisely timed motor patterns. The authors show that zebra finches can learn to independently modify the duration of individual song segments and find that synfire chain networks are ideally suited to implement such flexible motor timing.
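The feedforward intuition (a one-to-one mapping between which group is active and elapsed time) can be caricatured in a few lines; the per-link delays below are a stand-in for biophysical time constants and are not part of the paper's model:

```python
# Toy synfire-chain-style timer: activity hops down a chain of groups, and
# the time spent on each link sets the duration of the corresponding song
# segment.  Because each link has its own delay, one segment's duration can
# be changed without touching any other's, which is the flexibility probed above.
delays_ms = [10.0] * 30                    # 30 links, 10 ms each: a 300 ms "song"

def total_duration(delays):
    return sum(delays)

def stretch_segment(delays, start, stop, factor):
    """Scale only the links spanning one segment (indices [start, stop))."""
    return [d * factor if start <= i < stop else d
            for i, d in enumerate(delays)]

print(total_duration(delays_ms))                  # 300.0
longer = stretch_segment(delays_ms, 10, 20, 1.5)  # slow down one segment
print(total_duration(longer))                     # 350.0; other segments unchanged
```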

Collaboration


Dive into Cengiz Pehlevan's collaborations.

Top Co-Authors

Dmitri B. Chklovskii

Cold Spring Harbor Laboratory

Tao Hu

University of Minnesota
