Publication


Featured research published by Misha Tsodyks.


Nature | 2003

Spontaneously emerging cortical representations of visual attributes

Tal Kenet; Dmitri Bibitchkov; Misha Tsodyks; Amiram Grinvald; Amos Arieli

Spontaneous cortical activity—ongoing activity in the absence of intentional sensory input—has been studied extensively, using methods ranging from EEG (electroencephalography), through voltage-sensitive dye imaging, down to recordings from single neurons. Ongoing cortical activity has been shown to play a critical role in development, and must also be essential for processing sensory information, because it modulates stimulus-evoked activity and is correlated with behaviour. Yet its role in the processing of external information and its relationship to internal representations of sensory attributes remain unknown. Using voltage-sensitive dye imaging, we previously established a close link between ongoing activity in the visual cortex of anaesthetized cats and the spontaneous firing of a single neuron. Here we report that such activity encompasses a set of dynamically switching cortical states, many of which correspond closely to orientation maps. When such an orientation state emerged spontaneously, it spanned several hypercolumns and was often followed by a state corresponding to a proximal orientation. We suggest that dynamically switching cortical states could represent the brain's internal context, and therefore reflect or influence memory, perception and behaviour.


Neural Computation | 1998

Neural networks with dynamic synapses

Misha Tsodyks; Klaus Pawelzik; Henry Markram

Transmission across neocortical synapses depends on the frequency of presynaptic activity (Thomson & Deuchars, 1994). Interpyramidal synapses in layer V exhibit fast depression of synaptic transmission, while other types of synapses exhibit facilitation of transmission. To study the role of dynamic synapses in network computation, we propose a unified phenomenological model that allows computation of the postsynaptic current generated by both types of synapses when driven by an arbitrary pattern of action potential (AP) activity in a presynaptic population. Using this formalism, we analyze different regimes of synaptic transmission and demonstrate that dynamic synapses transmit different aspects of the presynaptic activity depending on the average presynaptic frequency. The model also allows for derivation of mean-field equations, which govern the activity of large, interconnected networks. We show that the dynamics of synaptic transmission results in complex sets of regular and irregular regimes of network activity.
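The model's standard reduced form tracks two variables per synapse: a utilization factor u (facilitation) and a fraction of available resources x (depression), with the postsynaptic response at each spike proportional to u·x. Below is a minimal Python sketch of the per-spike iteration; the parameter values and the exact update conventions are illustrative assumptions, not taken from the paper.

import numpy as np

def tm_synapse(spike_times, U=0.5, tau_rec=0.8, tau_facil=0.01, A=1.0):
    """Per-spike iteration of a Tsodyks-Markram-style dynamic synapse.

    u: utilization of synaptic efficacy (facilitation variable)
    x: fraction of available synaptic resources (depression variable)
    Returns the response amplitude A*u*x generated by each spike.
    """
    u, x = U, 1.0
    amplitudes = []
    last_t = None
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            # Between spikes: u relaxes back to its baseline U,
            # x recovers toward 1 with time constant tau_rec
            u = U + (u - U) * np.exp(-dt / tau_facil)
            x = 1.0 + (x - 1.0) * np.exp(-dt / tau_rec)
        # At a spike: facilitation increments u, then release consumes
        # a fraction u of the available resources x
        u = u + U * (1.0 - u)
        amplitudes.append(A * u * x)
        x = x * (1.0 - u)
        last_t = t
    return np.array(amplitudes)

# A regular 20 Hz train in two parameter regimes
train = np.arange(10) * 0.05   # spike times in seconds
depressing   = tm_synapse(train, U=0.5, tau_rec=0.8, tau_facil=0.01)
facilitating = tm_synapse(train, U=0.1, tau_rec=0.1, tau_facil=1.0)
print(depressing / depressing[0])     # successive amplitudes shrink
print(facilitating / facilitating[0]) # successive amplitudes grow

With high initial utilization and slow recovery the synapse depresses, transmitting mainly transients; with low utilization and a long facilitation time constant it facilitates, favouring sustained high-frequency input, which is the frequency-dependent filtering the abstract describes.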


Science | 2008

Synaptic Theory of Working Memory

Gianluigi Mongillo; Omri Barak; Misha Tsodyks

It is usually assumed that enhanced spiking activity in the form of persistent reverberation for several seconds is the neural correlate of working memory. Here, we propose that working memory is sustained by calcium-mediated synaptic facilitation in the recurrent connections of neocortical networks. In this account, the presynaptic residual calcium is used as a buffer that is loaded, refreshed, and read out by spiking activity. Because of the long time constants of calcium kinetics, the refresh rate can be low, resulting in a mechanism that is metabolically efficient and robust. The duration and stability of working memory can be regulated by modulating the spontaneous activity in the network.
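The buffer idea can be illustrated with the same two synaptic variables in a facilitation-dominated regime, where the facilitation time constant (of order one second, standing in for residual-calcium kinetics) greatly exceeds the depression recovery time. The following Python sketch loads the trace with a brief burst and probes it after silent delays; all parameter values are illustrative assumptions.

import numpy as np

# Facilitation-dominated regime (illustrative parameters)
U, tau_rec, tau_facil = 0.2, 0.2, 1.5   # time constants in seconds

def probe_response(u, x):
    """Synaptic response to a single probe spike given the current state."""
    u_plus = u + U * (1.0 - u)
    return u_plus * x

# Load the memory with a brief 50 Hz burst of 10 spikes
u, x = U, 1.0
for _ in range(10):
    u = u + U * (1.0 - u)        # facilitation increment at the spike
    x = x * (1.0 - u)            # resource depletion at the spike
    dt = 0.02                    # 20 ms inter-spike interval
    u = U + (u - U) * np.exp(-dt / tau_facil)
    x = 1.0 + (x - 1.0) * np.exp(-dt / tau_rec)

# Silent delay: x recovers quickly, but u stays elevated for ~tau_facil,
# so a weak nonspecific probe still reads out an enhanced response
for delay in (0.2, 0.5, 1.0, 2.0):
    u_d = U + (u - U) * np.exp(-delay / tau_facil)
    x_d = 1.0 + (x - 1.0) * np.exp(-delay / tau_rec)
    print(f"delay {delay:.1f} s: probe response {probe_response(u_d, x_d):.3f} "
          f"(baseline {probe_response(U, 1.0):.3f})")

During the silent period no enhanced spiking is needed: the memory is held in the synaptic state u, and occasional low-rate activity suffices to refresh it, consistent with the metabolic-efficiency argument above.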


Network: Computation In Neural Systems | 1991

Quantitative study of attractor neural network retrieving at low spike rates: I. Substrate—spikes, rates and neuronal gain

Daniel J. Amit; Misha Tsodyks

We discuss the conversion of the description of the dynamics of a neural network from a temporal variation of synaptic currents driven by point spikes and modulated by a synaptic structure to a description of the current dynamics driven by spike rates. The conditions for the validity of such a conversion are discussed in detail and are shown to be quite realistic in cortical conditions. This is done in preparation for a discussion of a scenario of an attractor neural network, based on the interaction of synaptic currents and neural spike rates. The spike rates are then expressed in terms of the currents themselves to provide a closed set of dynamical equations for the currents. The current-rate relation is expressed as a neuronal gain function, converting currents into spike rates. It describes an integrate-and-fire element with noisy inputs, under explicit quantitative conditions which we argue to be plausible in a cortical situation. In particular, it is shown that the gain of the current to rate transduct...
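The gain function in question converts a noisy input current into an output spike rate for an integrate-and-fire element. A brute-force Monte Carlo estimate of such a transduction curve, in dimensionless units with illustrative parameters (not the paper's analytical treatment), might look like this:

import numpy as np

def lif_rate(mu, sigma, tau_m=0.02, v_th=1.0, v_reset=0.0,
             dt=1e-4, t_sim=20.0, seed=0):
    """Estimate the firing rate (Hz) of a leaky integrate-and-fire neuron
    driven by Gaussian white noise with mean mu and amplitude sigma.

    Euler-Maruyama scheme for: tau_m dV = (mu - V) dt + sigma sqrt(tau_m) dW
    """
    rng = np.random.default_rng(seed)
    v, spikes = v_reset, 0
    for _ in range(int(t_sim / dt)):
        v += (dt / tau_m) * (mu - v) \
             + sigma * np.sqrt(dt / tau_m) * rng.standard_normal()
        if v >= v_th:           # threshold crossing: emit spike and reset
            v = v_reset
            spikes += 1
    return spikes / t_sim

# With noise the neuron fires even for subthreshold mean input (mu < v_th),
# smoothing the deterministic hard threshold into a graded gain curve
for mu in (0.6, 0.8, 1.0, 1.2):
    print(f"mu = {mu:.1f}: {lif_rate(mu, sigma=0.3):.1f} Hz")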


Neural Computation | 2001

An Algorithm for Modifying Neurotransmitter Release Probability Based on Pre- and Postsynaptic Spike Timing

Walter Senn; Henry Markram; Misha Tsodyks

The precise times of occurrence of individual pre- and postsynaptic action potentials are known to play a key role in the modification of synaptic efficacy. Based on stimulation protocols of two synaptically connected neurons, we infer an algorithm that reproduces the experimental data by modifying the probability of vesicle discharge as a function of the relative timing of spikes in the pre- and postsynaptic neurons. The primary feature of this algorithm is an asymmetry with respect to the direction of synaptic modification depending on whether the presynaptic spikes precede or follow the postsynaptic spike. Specifically, if the presynaptic spike occurs up to 50 ms before the postsynaptic spike, the probability of vesicle discharge is upregulated, while the probability of vesicle discharge is downregulated if the presynaptic spike occurs up to 50 ms after the postsynaptic spike. When neurons fire irregularly with Poisson spike trains at constant mean firing rates, the probability of vesicle discharge converges toward a characteristic value determined by the pre- and postsynaptic firing rates. On the other hand, if the mean rates of the Poisson spike trains slowly change with time, our algorithm predicts modifications in the probability of release that generalize Hebbian and Bienenstock-Cooper-Munro rules. We conclude that the proposed spike-based synaptic learning algorithm provides a general framework for regulating neurotransmitter release probability.
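A minimal sketch of such a rule: the ±50 ms windows and the sign asymmetry follow the abstract, while the exponential weighting, the learning rate, and the soft bounds keeping p in [0, 1] are assumptions made for illustration.

import numpy as np

def update_release_prob(p, dt_spike, window=0.05, lr=0.1, tau=0.02):
    """Asymmetric timing-dependent update of the vesicle release probability p.

    dt_spike = t_post - t_pre (seconds). Pre-before-post within the window
    upregulates p; post-before-pre within the window downregulates it.
    """
    if 0.0 < dt_spike <= window:
        dp = lr * (1.0 - p) * np.exp(-dt_spike / tau)    # potentiation
    elif -window <= dt_spike < 0.0:
        dp = -lr * p * np.exp(dt_spike / tau)            # depression
    else:
        dp = 0.0                                         # outside the window
    return float(np.clip(p + dp, 0.0, 1.0))

p = 0.3
print(update_release_prob(p, +0.010))  # pre leads post by 10 ms: p increases
print(update_release_prob(p, -0.010))  # post leads pre by 10 ms: p decreases

The soft bounds (a factor 1 - p for potentiation, p for depression) make updates shrink as p saturates, one simple way to obtain the convergence toward a rate-dependent equilibrium value that the abstract describes for Poisson spike trains.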


PLOS Computational Biology | 2005

The Emergence of Up and Down States in Cortical Networks

David Holcman; Misha Tsodyks

The cerebral cortex is continuously active in the absence of external stimuli. An example of this spontaneous activity is the voltage transition between an Up and a Down state, observed simultaneously at individual neurons. Since this phenomenon could be of critical importance for working memory and attention, its explanation could reveal some fundamental properties of cortical organization. To identify a possible scenario for the dynamics of Up–Down states, we analyze a reduced stochastic dynamical system that models an interconnected network of excitatory neurons with activity-dependent synaptic depression. The model reveals that when the total synaptic connection strength exceeds a certain threshold, the phase space of the dynamical system contains two attractors, interpreted as Up and Down states. In that case, synaptic noise causes transitions between the states. Moreover, an external stimulation producing a depolarization increases the time spent in the Up state, as observed experimentally. We therefore propose that the existence of Up–Down states is a fundamental and inherent property of a noisy neural ensemble with sufficiently strong synaptic connections.
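The proposed scenario can be reproduced qualitatively in a one-population rate model combining recurrent excitation, activity-dependent synaptic depression, and synaptic noise. The Python sketch below uses an illustrative sigmoid gain and hand-picked parameters placed in the bistable regime; it is a caricature of the paper's system, not its exact equations.

import numpy as np

rng = np.random.default_rng(1)

# One excitatory population with depressing recurrent synapses (illustrative)
J, U = 6.0, 0.5              # total synaptic strength, utilization per spike
tau, tau_rec = 0.01, 0.5     # activity and recovery time constants (s)
sigma = 0.5                  # synaptic noise amplitude
gain = lambda h: 1.0 / (1.0 + np.exp(-(h - 1.2) / 0.2))  # sigmoid f-I curve

dt, T = 1e-3, 60.0
h, x = 0.0, 1.0
trace = []
for _ in range(int(T / dt)):
    r = gain(h)
    # Recurrent drive J*U*x*r is weakened by depression; noise kicks the
    # system between the two attractors (Down: r ~ 0, Up: r ~ 1)
    h += (dt / tau) * (-h + J * U * x * r) \
         + sigma * np.sqrt(dt / tau) * rng.standard_normal()
    x += dt * ((1.0 - x) / tau_rec - U * x * r)
    trace.append(r)

up = np.array(trace) > 0.5
print("fraction of time in Up state:", round(up.mean(), 2))
print("number of transitions:", int(np.abs(np.diff(up.astype(int))).sum()))

Lowering J below the bistability threshold leaves a single (Down) fixed point and the transitions disappear, mirroring the abstract's claim that sufficiently strong connections are required for Up–Down dynamics.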


Network: Computation In Neural Systems | 1995

Rapid state switching in balanced cortical network models

Misha Tsodyks; Terrence J. Sejnowski

We have explored a network model of cortical microcircuits based on integrate-and-fire neurons in a regime where the reset following a spike is small, recurrent excitation is balanced by feedback inhibition, and the activity is highly irregular. This regime cannot be described by a mean-field theory based on average activity levels because essential features of the model depend on fluctuations from the average. We propose a new way of scaling the strength of synaptic interaction with the size of the network: rather than scale the amplitude of the synapse, we scale the neurotransmitter release probabilities with the number of inputs to keep the average input constant. This is consistent with the low transmitter release probability observed in a majority of hippocampal synapses. Another prominent feature of this regime is the ability of the network to switch rapidly between different states, as demonstrated in a model based on orientation columns in the mammalian visual cortex. Both network and intrinsic ...
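The proposed scaling is easy to check numerically. In the sketch below (with illustrative numbers), the classical choice of dividing synaptic amplitudes by the number of inputs N makes input fluctuations vanish as N grows, whereas scaling the release probability as 1/N keeps both the mean and the fluctuations of the total input finite:

import numpy as np

rng = np.random.default_rng(2)

def input_stats(n, scheme, trials=5000, a=1.0, c=10.0, r=0.2):
    """Mean and std of the summed synaptic input from n presynaptic neurons,
    each spiking independently with probability r per time bin."""
    k = rng.binomial(n, r, size=trials)          # number of inputs that spike
    if scheme == "amplitude":
        total = (a * c / n) * k                  # amplitude scaled as 1/n
    else:
        total = a * rng.binomial(k, c / n)       # release probability ~ 1/n
    return total.mean(), total.std()

for n in (100, 1000, 10000):
    m1, s1 = input_stats(n, "amplitude")
    m2, s2 = input_stats(n, "release")
    print(f"N={n:5d}  amplitude: mean={m1:.2f}, std={s1:.2f}   "
          f"release-prob: mean={m2:.2f}, std={s2:.2f}")

Both schemes keep the mean input constant, but only the release-probability scaling preserves fluctuations of order one, the ingredient the balanced regime above depends on.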


Network: Computation In Neural Systems | 1991

Quantitative study of attractor neural networks retrieving at low spike rates: II. Low-rate retrieval in symmetric networks

Daniel J. Amit; Misha Tsodyks

A network of current-rate dynamics with a symmetric synaptic matrix is analysed and simulated for its low-rate attractor structure. The dynamics is deterministic, with the noise included in a realistic current-rate transduction function (discussed in part I). The analysis is carried out in mean-field theory. It is shown that at low loading the network retrieves without errors, with uniform low rates, that there are no simple spurious states, and that the low-rate attractors, retrieving single patterns, are stable to the admixture of additional patterns. The analysis of the attractors in a network with an extensive number of patterns is carried out in the replica-symmetric approximation. The results for the dependence of the retrieval rates on loading level, for the distribution of rates among neurons, and for the storage capacity are compared with simulations. Simulations also show that retrieval performance is very robust to random elimination of synapses. Moreover, errors in the stimulus, relative ...
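A toy version of the retrieval simulations: sparse binary patterns are stored in a symmetric matrix with a covariance learning rule (a standard choice for low-rate attractor networks), and the network is cued with a degraded pattern. Network size, coding level, and threshold are illustrative choices, not the paper's.

import numpy as np

rng = np.random.default_rng(3)
N, P, f = 1000, 20, 0.05    # neurons, stored patterns, coding level

# Sparse binary patterns and a symmetric covariance-rule synaptic matrix
xi = (rng.random((P, N)) < f).astype(float)
J = (xi - f).T @ (xi - f) / (N * f * (1 - f))
np.fill_diagonal(J, 0.0)

theta = 0.6                 # firing threshold, chosen by hand
s = xi[0].copy()
s[rng.random(N) < 0.1] = 0  # degrade the cue: silence a random 10% of units

for _ in range(20):         # parallel threshold dynamics to a fixed point
    s = (J @ s > theta).astype(float)

overlap = s @ xi[0] / xi[0].sum()
print("mean activity:", s.mean(), " overlap with stored pattern:", round(overlap, 3))

At low loading the network settles into the stored low-rate pattern with near-perfect overlap, and, as the abstract notes for the full model, retrieval is robust to errors in the cue.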


Nature | 2002

Context-enabled learning in the human visual system

Yael Adini; Dov Sagi; Misha Tsodyks

Training was found to improve the performance of humans on a variety of visual perceptual tasks. However, the ability to detect small changes in the contrast of simple visual stimuli could not be improved by repetition. Here we show that the performance of this basic task could be modified after the discrimination of the stimulus contrast was practised in the presence of similar laterally placed stimuli, suggesting a change in the local neuronal circuit involved in the task. On the basis of a combination of Hebbian and anti-Hebbian synaptic learning rules compatible with our results, we propose a mechanism of plasticity in the visual cortex that is enabled by a change in the context.


Nature | 2004

Neural networks and perceptual learning

Misha Tsodyks; Charles D. Gilbert

Sensory perception is a learned trait. The brain strategies we use to perceive the world are constantly modified by experience. With practice, we subconsciously become better at identifying familiar objects or distinguishing fine details in our environment. Current theoretical models simulate some properties of perceptual learning, but neglect the underlying cortical circuits. Future neural network models must incorporate the top-down alteration of cortical function by expectation or perceptual tasks. These newly found dynamic processes are challenging earlier views of static and feedforward processing of sensory information.

Collaboration


Dive into Misha Tsodyks's collaborations.

Top Co-Authors

Henry Markram, École Polytechnique Fédérale de Lausanne
Dov Sagi, Weizmann Institute of Science
Mikhail Katkov, Weizmann Institute of Science
Sandro Romani, Howard Hughes Medical Institute
Terrence J. Sejnowski, Salk Institute for Biological Studies
Amos Arieli, Weizmann Institute of Science
Omri Barak, Technion – Israel Institute of Technology
Yael Adini, Weizmann Institute of Science
Daniel J. Amit, Hebrew University of Jerusalem