Publication


Featured research published by Davide Badoni.


Neural Computation | 2000

Spike-Driven Synaptic Plasticity: Theory, Simulation, VLSI Implementation

Stefano Fusi; Mario Annunziato; Davide Badoni; Andrea Salamon; Daniel J. Amit

We present a model for spike-driven dynamics of a plastic synapse, suited for a VLSI implementation. The synaptic device behaves as a capacitor on short timescales and preserves the memory of two stable states (efficacies) on long timescales. The transitions (LTP/LTD) are stochastic because both the number and the distribution of neural spikes in any finite (stimulation) interval fluctuate, even at fixed pre- and postsynaptic spike rates. The dynamics of the single synapse is studied analytically by extending the solution to a classic problem in queuing theory (the Takács process). The model of the synapse is implemented in VLSI and consists of only 18 transistors. It is also directly simulated. The simulations indicate that LTP/LTD probabilities versus rates are robust to fluctuations of the electronic parameters over a wide range of rates. The solutions for these probabilities are in very good agreement with both the simulations and measurements. Moreover, the probabilities are readily manipulable by variations of the chip's parameters, even in ranges where they are very small. The tests of the electronic device cover the range from spontaneous activity (3–4 Hz) to stimulus-driven rates (50 Hz). Low transition probabilities can be maintained in all ranges, even though the intrinsic time constants of the device are short (~100 ms). Synaptic transitions are triggered by elevated presynaptic rates: for low presynaptic rates, there are essentially no transitions. The synaptic device can preserve its memory for years in the absence of stimulation. Stochasticity of learning is a result of the variability of interspike intervals; noise is a feature of the distributed dynamics of the network. The fact that the synapse is binary on long timescales solves the stability problem of synaptic efficacies in the absence of stimulation. Yet stochastic learning theory ensures that it does not affect the collective behavior of the network, if the transition probabilities are low and LTP is balanced against LTD.
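As a rough illustration of the mechanism this abstract describes, the sketch below simulates a bistable synapse whose internal variable jumps up on a presynaptic spike that closely follows a postsynaptic spike, jumps down otherwise, and between spikes drifts toward the nearest of its two stable states. All constants (jump size, drift, coincidence window) are invented for illustration and are not the paper's 18-transistor circuit values.

```python
import random

def ltp_probability(pre_rate, post_rate, trials=400, t_stim=0.5, dt=0.001, seed=1):
    """Estimate the probability that a stimulation interval of length t_stim
    potentiates a bistable synapse, given Poisson pre- and postsynaptic
    spike trains (rates in Hz)."""
    rng = random.Random(seed)
    theta = 0.5          # bistability threshold on the internal variable x
    jump = 0.15          # size of a spike-driven up/down jump
    drift = 0.4          # refresh drift (per second) toward 0 or 1
    window = 0.02        # coincidence window for a "recent" post spike (s)
    transitions = 0
    for _ in range(trials):
        x, last_post = 0.0, -1.0          # start in the depressed stable state
        for k in range(int(t_stim / dt)):
            t = k * dt
            if rng.random() < post_rate * dt:   # postsynaptic Poisson spike
                last_post = t
            if rng.random() < pre_rate * dt:    # presynaptic Poisson spike
                # Hebbian rule: jump up if a post spike occurred recently
                x += jump if (t - last_post) < window else -jump
            # deterministic refresh toward the nearest stable state
            x += drift * dt * (1.0 if x > theta else -1.0)
            x = min(max(x, 0.0), 1.0)
        if x > theta:
            transitions += 1
    return transitions / trials
```

Qualitatively, high pre- and postsynaptic rates yield a sizable transition probability, while low presynaptic rates yield essentially none, matching the behavior reported above.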


IEEE Transactions on Neural Networks | 2003

A VLSI recurrent network of integrate-and-fire neurons connected by plastic synapses with long-term memory

Elisabetta Chicca; Davide Badoni; V. Dante; M. D'Andreagiovanni; G. Salina; L. Carota; Stefano Fusi; P. Del Giudice

Electronic neuromorphic devices with on-chip, on-line learning should be able to modify quickly the synaptic couplings to acquire information about new patterns to be stored (synaptic plasticity) and, at the same time, preserve this information on very long time scales (synaptic stability). Here, we illustrate the electronic implementation of a simple solution to this stability-plasticity problem, recently proposed and studied in various contexts. It is based on the observation that reducing the analog depth of the synapses to the extreme (bistable synapses) does not necessarily disrupt the performance of the device as an associative memory, provided that 1) the number of neurons is large enough; 2) the transitions between stable synaptic states are stochastic; and 3) learning is slow. The drastic reduction of the analog depth of the synaptic variable also makes this solution appealing from the point of view of electronic implementation and offers a simple methodological alternative to the technological solution based on floating gates. We describe the full custom analog very large-scale integration (VLSI) realization of a small network of integrate-and-fire neurons connected by bistable deterministic plastic synapses which can implement the idea of stochastic learning. In the absence of stimuli, the memory is preserved indefinitely. During the stimulation the synapse undergoes quick temporary changes through the activities of the pre- and postsynaptic neurons; those changes stochastically result in a long-term modification of the synaptic efficacy. The intentionally disordered pattern of connectivity allows the system to generate a randomness suited to drive the stochastic selection mechanism. We check by a suitable stimulation protocol that the stochastic synaptic plasticity produces the expected pattern of potentiation and depression in the electronic network.
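The three conditions listed above (many neurons, stochastic transitions, slow learning) can be illustrated with a toy palimpsest simulation: random binary patterns are imprinted one after another into a pool of binary synapses, and each mismatched synapse switches to the desired state only with probability q. The pool size and q below are illustrative, not taken from the paper.

```python
import random

def sequential_overlaps(n_syn=4000, n_patterns=40, q=0.1, seed=0):
    """Imprint random binary patterns sequentially into binary synapses.
    Returns the overlap (fraction of matching synapses) between the final
    synaptic state and each stored pattern; recent patterns overlap more
    while old ones fade gradually (palimpsest behavior)."""
    rng = random.Random(seed)
    syn = [rng.choice([0, 1]) for _ in range(n_syn)]
    patterns = [[rng.choice([0, 1]) for _ in range(n_syn)]
                for _ in range(n_patterns)]
    for p in patterns:
        for i in range(n_syn):
            if syn[i] != p[i] and rng.random() < q:
                syn[i] = p[i]      # stochastic, slow transition
    return [sum(s == b for s, b in zip(syn, p)) / n_syn for p in patterns]
```

Smaller q makes each imprint weaker but lets older memories survive longer, which is why slow stochastic learning preserves associative-memory performance.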


international symposium on circuits and systems | 2006

An aVLSI recurrent network of spiking neurons with reconfigurable and plastic synapses

Davide Badoni; Massimiliano Giulioni; V. Dante; P. Del Giudice

We illustrate key features of an analog VLSI (aVLSI) chip implementing a network composed of 32 integrate-and-fire (IF) neurons with firing-rate adaptation (AHP current), endowed with both recurrent synaptic connectivity and AER-based connectivity with external, AER-compliant devices. Synaptic connectivity can be reconfigured at will, both in the presence/absence of each synaptic contact and in the excitatory/inhibitory nature of each synapse. Excitatory synapses are plastic through a spike-driven, stochastic Hebbian mechanism and possess a self-limiting mechanism aimed at an optimal use of synaptic resources for Hebbian learning.
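A minimal discrete-time sketch of the neuron model named above, an integrate-and-fire neuron with an after-hyperpolarization (AHP) adaptation current; all constants are illustrative, not the chip's parameters.

```python
def if_neuron_spikes(i_in=50.0, n_steps=2000, dt=1e-4):
    """Integrate-and-fire neuron with firing-rate adaptation: each spike
    increments an AHP conductance g_ahp, which is subtracted from the
    input and decays exponentially, so interspike intervals lengthen."""
    v, g_ahp = 0.0, 0.0
    v_th, v_reset = 1.0, 0.0       # threshold and reset (arbitrary units)
    tau_ahp, d_ahp = 0.1, 0.5      # AHP decay constant (s) and per-spike jump
    spikes = []
    for k in range(n_steps):
        v += dt * (i_in - g_ahp)   # integrate input minus adaptation current
        g_ahp -= dt * g_ahp / tau_ahp
        if v >= v_th:
            spikes.append(k * dt)
            v = v_reset
            g_ahp += d_ahp         # strengthen adaptation after each spike
    return spikes
```

Under constant input, the interspike intervals grow from the first spike to the last, the signature of firing-rate adaptation.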


Network: Computation In Neural Systems | 1995

Electronic implementation of an analogue attractor neural network with stochastic learning

Davide Badoni; Stefano Bertazzoni; Stefano Buglioni; Gaetano Salina; Daniel J. Amit; Stefano Fusi

We describe and discuss an electronic implementation of an attractor neural network with plastic synapses. The network undergoes double dynamics, for the neurons as well as the synapses. Both dynamical processes are unsupervised. The synaptic dynamics is autonomous, in that it is driven exclusively and perpetually by neural activities. The latter follow the network activity via the developing synapses and the influence of external stimuli. Such a network self-organizes and is a device which converts the gross statistical characteristics of the stimulus input stream into a set of attractors (reverberations). To maintain the acquired memory for a long time, the analog synaptic efficacies are discretized by a stochastic refresh mechanism. The discretized synaptic memory has an indefinitely long lifetime in the absence of activity in the network. It is modified only by the arrival of new stimuli. The stochastic refresh mechanism produces transitions at low probability, which ensures that transient stimuli do not create significant modifications and that the system has a large palimpsestic memory.


international conference on electronics, circuits, and systems | 2008

A VLSI network of spiking neurons with plastic fully configurable “stop-learning” synapses

Massimiliano Giulioni; Patrick Camilleri; V. Dante; Davide Badoni; Giacomo Indiveri; Jochen Braun; P. del Giudice

We describe and demonstrate a neuromorphic, analog VLSI chip (termed F-LANN) hosting 128 integrate-and-fire (IF) neurons with spike-frequency adaptation, and 16,384 plastic bistable synapses implementing a self-regulated form of Hebbian, spike-driven, stochastic plasticity. The chip is designed to offer a high degree of reconfigurability: each synapse may be individually configured at any time to be either excitatory or inhibitory and to receive either recurrent input from an on-chip neuron or AER-based input from an off-chip neuron. The initial state of each synapse can be set as potentiated or depressed, and the state of each synapse can be read and stored on a computer.
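The per-synapse configurability described above (excitatory vs. inhibitory, recurrent vs. AER input, initial bistable state) amounts to a few bits of state per synapse. A hypothetical encoding, not the chip's actual register layout, might look like:

```python
from dataclasses import dataclass

@dataclass
class SynapseConfig:
    """Three configuration bits per synapse (field names are illustrative)."""
    excitatory: bool    # excitatory (True) or inhibitory (False)
    recurrent: bool     # recurrent on-chip input (True) or AER off-chip (False)
    potentiated: bool   # initial bistable state: potentiated or depressed

def pack(cfg: SynapseConfig) -> int:
    """Pack a configuration into a 3-bit word, e.g. for serial upload."""
    return (int(cfg.excitatory) << 2) | (int(cfg.recurrent) << 1) | int(cfg.potentiated)

def unpack(bits: int) -> SynapseConfig:
    """Recover a configuration read back from the chip."""
    return SynapseConfig(bool(bits & 4), bool(bits & 2), bool(bits & 1))
```

At three bits per synapse, configuring all 16,384 synapses costs only a few kilobytes of state, which is what makes reading and storing the full synaptic matrix on a computer practical.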


Neural Computation | 2009

Classification of correlated patterns with a configurable analog VLSI neural network of spiking neurons and self-regulating plastic synapses

Massimiliano Giulioni; Mario Pannunzi; Davide Badoni; Vittorio Dante; Paolo Del Giudice

We describe the implementation and illustrate the learning performance of an analog VLSI network of 32 integrate-and-fire neurons with spike-frequency adaptation and 2016 Hebbian bistable spike-driven stochastic synapses, endowed with a self-regulating plasticity mechanism, which avoids unnecessary synaptic changes. The synaptic matrix can be flexibly configured and provides both recurrent and external connectivity with address-event representation compliant devices. We demonstrate a marked improvement in the efficiency of the network in classifying correlated patterns, owing to the self-regulating mechanism.
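A minimal sketch of a self-regulating ("stop-learning") gate of the kind the abstract alludes to: candidate synaptic updates take effect only while a slow postsynaptic activity trace lies in an intermediate band, so updates stop once the neuron's response is already decisively high or low. The thresholds are illustrative, not the chip's values.

```python
def plasticity_enabled(post_trace, low=0.25, high=0.75):
    """Gate a candidate synaptic update on a slow postsynaptic activity
    trace: updates are allowed only in the intermediate band, so a neuron
    that already responds very strongly (or not at all) to a pattern
    stops changing its synapses, avoiding unnecessary changes."""
    return low < post_trace < high

def accepted_updates(traces, low=0.25, high=0.75):
    """Count how many candidate updates pass the stop-learning gate."""
    return sum(plasticity_enabled(c, low, high) for c in traces)
```

This gating is what limits interference when patterns are correlated: synapses of neurons that already classify a pattern correctly are left untouched.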


international conference hybrid intelligent systems | 2007

A Neuromorphic aVLSI network chip with configurable plastic synapses

Patrick Camilleri; Massimiliano Giulioni; V. Dante; Davide Badoni; Giacomo Indiveri; B. Michaelis; Jochen Braun; P. del Giudice

We describe and demonstrate the key features of a neuromorphic, analog VLSI chip (termed F-LANN) hosting 128 integrate-and-fire (IF) neurons with spike-frequency adaptation, and 16,384 plastic bistable synapses implementing a self-regulated form of Hebbian, spike-driven, stochastic plasticity. We successfully tested and verified the basic operation of the chip as well as its main new feature, namely the synaptic configurability. This configurability enables us to configure each individual synapse as either excitatory or inhibitory and to receive either recurrent input from an on-chip neuron or AER (address-event representation) based input from an off-chip neuron. It is also possible to set the initial state of each synapse as potentiated or depressed, and the state of each synapse can be read and stored on a computer. The main aim of this chip is to efficiently perform associative learning experiments on a large number of synapses. In the future we would like to connect multiple F-LANN chips to perform associative learning of natural stimulus sets.


Archive | 1998

Analog VLSI implementation of a spike driven stochastic dynamical synapse

Mario Annunziato; Davide Badoni; Stefano Fusi; Andrea Salamon

We have undertaken to implement in analog electronics a neural network device which autonomously learns from its experience in real time. Implementing a large neural network with this capability implies analog VLSI technology and on-chip learning. This means designing a plastic synaptic connection that 1) is simple (a low transistor count and reduced silicon area), 2) has low power consumption, and 3) preserves memory on long time scales while, at the same time, remaining modifiable in short time intervals during stimulation.


SPIE's 1995 Symposium on OE/Aerospace Sensing and Dual Use Photonics | 1995

LANN27: an electronic implementation of an analog attractor neural network with stochastic learning

Davide Badoni; Stefano Bertazzoni; Stefano Buglioni; Gaetano Salina; Stefano Fusi; Daniel J. Amit

We describe and discuss an electronic implementation of an attractor neural network with plastic synapses. The synaptic dynamics are unsupervised and autonomous, in that they are driven exclusively and perpetually by neural activities. The latter follow the network activity via the developing synapses and the influence of external stimuli. Such a network self-organizes and is a device which converts the gross statistical characteristics of the stimulus input stream into a set of attractors (reverberations). To maintain the acquired memory for a long time, the analog synaptic efficacies are discretized by a stochastic refresh mechanism. The discretized synaptic memory has an indefinitely long lifetime in the absence of activity in the network. It is modified only by the arrival of new stimuli. The stochastic refresh mechanism produces transitions at low probability, which ensures that transient stimuli do not create significant modifications and that the system has a large palimpsestic memory. The electronic implementation is completely analog, stochastic and asynchronous. The circuitry of the first prototype is discussed in some detail, as well as the tests performed on it. In carrying out the implementation we have been guided by biological considerations and by electronic constraints.
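A toy model of one stochastic refresh step of the kind described above (all parameters invented for illustration): the analog efficacy is compared to the midpoint with additive noise, then nudged toward the selected discrete level, so small stimulus-induced deviations are usually erased and only occasionally flip the synapse.

```python
import random

def refresh_step(j, noise=0.1, gain=0.3, rng=random):
    """One stochastic refresh step on an analog efficacy j in [0, 1]:
    a noisy comparison against the midpoint selects a discrete target
    level (0 or 1), and j is nudged toward it. Deviations small relative
    to the noise are erased; larger ones flip the synapse, but only
    with low probability per refresh."""
    target = 1.0 if j + rng.gauss(0.0, noise) > 0.5 else 0.0
    return j + gain * (target - j)
```

Iterating refresh_step from j = 0.1 pulls the efficacy back to the lower level almost surely, while a stimulus that pushes j closer to the midpoint makes an occasional transition to the upper level possible, which is the low-probability transition behavior the abstract describes.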


Nuclear Instruments & Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment | 2010

Gamma–gamma tagging system for KLOE2 experiment

Flavio Archilli; Danilo Babusci; Davide Badoni; Matteo Beretta; Francesco Gonnella; Lorenzo Iafolla; Roberto Messi; Dario Moricciani; Lina Quintieri

Collaboration


Dive into Davide Badoni's collaboration.

Top Co-Authors

Roberto Messi (University of Rome Tor Vergata)
V. Dante (Istituto Superiore di Sanità)
Lorenzo Iafolla (University of Rome Tor Vergata)
Massimiliano Giulioni (Istituto Superiore di Sanità)
Daniel J. Amit (Hebrew University of Jerusalem)
Andrea Salamon (Sapienza University of Rome)
Gaetano Salina (Istituto Nazionale di Fisica Nucleare)
Stefano Fusi (Istituto Superiore di Sanità)