Conductance-based dendrites perform reliability-weighted opinion pooling
Jakob Jordan, João Sacramento, Mihai A. Petrovici, Walter Senn
Jakob Jordan, Mihai A. Petrovici, Walter Senn
jordan, petrovici, senn@…
University of Bern, Bern, Switzerland

João Sacramento
sacramento@ini.uzh…
ETH Zürich, Zürich, Switzerland
ABSTRACT
Cue integration, the combination of different sources of information to reduce uncertainty, is a fundamental computational principle of brain function. Starting from a normative model, we show that the dynamics of multi-compartment neurons with conductance-based dendrites naturally implement the required probabilistic computations. The associated error-driven plasticity rule allows neurons to learn the relative reliability of different pathways from data samples, approximating Bayes-optimal observers in multisensory integration tasks. Additionally, the model provides a functional interpretation of neural recordings from multisensory integration experiments and makes specific predictions for the membrane potential and conductance dynamics of individual neurons.
CCS CONCEPTS
• Computer systems organization → Neural networks; • Computing methodologies → Learning paradigms.

KEYWORDS
Bayesian cue combination, multisensory integration, neural networks, conductance-based coupling, synaptic plasticity
ACM Reference Format:
Jakob Jordan, Mihai A. Petrovici, Walter Senn and João Sacramento. 2020. Conductance-based dendrites perform reliability-weighted opinion pooling. In
Neuro-inspired Computational Elements Workshop (NICE ’20), March 17–20, 2020, Heidelberg, Germany.
ACM, New York, NY, USA, 3 pages. https://doi.org/10.1145/3381755.3381767
Animals need to operate successfully in their environment based on sensory information and prior expectations that are both incomplete and uncertain. To overcome the limitations of individual information sources it is useful to combine them, for example sensory inputs with prior expectations, sensory inputs from different modalities, or the information from different receptive fields. A probabilistic model of cue integration shows that combining multiple sources of information indeed reduces uncertainty. However, doing so successfully requires knowledge about the reliability of each
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
NICE ’20, March 17–20, 2020, Heidelberg, Germany
© 2020 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-7718-8/20/03.
https://doi.org/10.1145/3381755.3381767

source: the maximum-a-posteriori (MAP) estimate is a linear combination of the individual cues, each weighted with its relative reliability [9]. Behavioral evidence [4, 5, 12] demonstrates that humans and non-human animals are indeed able to optimally integrate multisensory stimuli to improve their performance compared to unisensory testing conditions. What kind of neural circuitry enables these probabilistic computations? We propose that multi-compartment neuron models with conductance-based dendrites are naturally equipped to learn the reliability of different sensory streams and to use this information to perform approximately optimal cue integration. Neuron and synapse dynamics are jointly derived from an energy-minimization principle. The resulting neuron dynamics coincide with standard leaky integrators with multiple dendritic compartments. The associated plasticity rule is reminiscent of error-driven learning rules [20, 21], but contains an additional term to learn the relative reliabilities of different pathways. To illustrate the model, we train it on a multisensory integration task and demonstrate that it can approximate Bayes-optimal inference. Furthermore, the dynamics of the trained model are in good agreement with experimental findings and allow us to make specific predictions on membrane potentials and conductances in multisensory integration experiments. Our model connects a normative approach to cue integration with circuit-level implementations, bridging the scales from behaviour to individual neurons.
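The reliability-weighted MAP estimate described above can be illustrated with a minimal numerical sketch. For two independent Gaussian cues, the posterior mean is the precision-weighted average of the cue means and the posterior precision is the sum of the cue precisions; the cue values and precisions below are illustrative, not taken from the paper.

```python
# Two independent Gaussian cues about the same latent feature:
# means ("evidence") and precisions ("reliability" = 1 / variance).
mu_v, prec_v = 1.0, 4.0   # e.g. a visual cue, variance 0.25
mu_t, prec_t = 2.0, 1.0   # e.g. a tactile cue, variance 1.0

# MAP estimate: linear combination weighted by relative reliability.
prec_c = prec_v + prec_t
mu_c = (prec_v * mu_v + prec_t * mu_t) / prec_c

print(mu_c)          # 1.2: pulled toward the more reliable cue
print(1.0 / prec_c)  # 0.2: combined variance below either single cue
```

Note that the combined variance (0.2) is smaller than that of even the better single cue (0.25), which is exactly the uncertainty reduction that cue integration provides.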
We consider a probabilistic description of membrane potentials in multi-compartment models. Individual dendritic compartments represent Gaussian densities via their membrane potential (mean) and total membrane conductance (precision) (Fig. 1a, green and blue). The soma computes a product-of-experts model [7] of the dendritic distributions (Fig. 1a, red). In the present cue integration context we refer to the mean as "evidence" and the precision as "reliability". Membrane potentials and conductances are decoupled, i.e., one can vary the membrane potential independently of the membrane conductance and vice versa, by considering parallel projections of each afferent via direct excitation and feedforward inhibition [8]. Under the assumption of small dendritic capacitances and strong coupling of the dendritic compartments to the soma, this leads to the following somatic membrane potential distribution for a given weight matrix W and presynaptic activity r:

p(u | W, r) = Z \, e^{-\frac{1}{2} g_s(W, r) \left( u - \bar{u}_s(W, r) \right)^2} .    (1)

Here, \bar{u}_s is a convex combination of the leak potential and the dendritic membrane potentials, weighted with their respective conductances, and g_s is a sum of the leak and dendritic conductances. From eq. (1) we obtain neuron dynamics by requiring that the somatic potential minimizes the energy E(u, W, r) := -\log p(u | W, r) via gradient descent [15, 17, 18]:

c_m \dot{u}_s = g_L (E_L - u_s) + \sum_{y=1}^{D} g^d_y \left( \bar{u}^d_y - u_s \right) .    (2)

Figure 1: Probabilistic multisensory cue integration via conductance-based dendrites. (a) Proposed neuronal implementation. Each dendritic compartment encodes a Gaussian density via the respective membrane potential u (evidence) and conductance g (reliability). While the membrane potential encodes the amount of evidence for the neuron's preferred feature, the conductance encodes the reliability of the evidence. The somatic compartment represents a product model of the dendritic distributions. (b) Experimental setup [cf. 12]. Using visual and/or tactile information, a rat estimates the orientation of a grating and classifies it as either vertical or horizontal. (c) Network model. From a ground-truth orientation θ*, visual and tactile stimuli are sampled with modality-specific noise amplitudes and presented to two populations of von-Mises feature detectors. All feature detectors project to two multisensory cells which are trained to respond with high/low firing rates to their preferred/anti-preferred orientations. (d) Trial-averaged loss of a Bayes-optimal MAP estimate (dark gray), an unweighted estimate combining visual and tactile orientations equally (light gray), the trained model with bimodal cues (red), only visual (blue), and only tactile cues (green). Light colored bars indicate the loss before training; the light gray line denotes chance level. (e) Somatic membrane potential dynamics generated by the subsequent presentation of three different orientations, from anti-preferred, over non-preferred, to preferred orientation. Fluctuations in the membrane potential reflect the reliability of the combined estimate encoded in the somatic conductance g_s.
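As a numerical sanity check on eq. (2), the following forward-Euler sketch (with made-up parameter values, not taken from the paper) confirms that the somatic potential relaxes to the conductance-weighted average of the leak and dendritic potentials:

```python
import numpy as np

# Illustrative parameters for the somatic dynamics of eq. (2):
# c_m du_s/dt = g_L (E_L - u_s) + sum_y g_y^d (u_y^d - u_s)
c_m, g_L, E_L = 1.0, 0.1, -70.0
g_d = np.array([0.5, 2.0])      # dendritic conductances (reliabilities)
u_d = np.array([-60.0, -50.0])  # dendritic potentials (evidence)

# Forward-Euler integration until the dynamics have settled.
u_s, dt = E_L, 0.01
for _ in range(10_000):
    du = (g_L * (E_L - u_s) + np.sum(g_d * (u_d - u_s))) / c_m
    u_s += dt * du

# Stationary state: conductance-weighted average of leak and dendrites.
u_star = (g_L * E_L + np.sum(g_d * u_d)) / (g_L + np.sum(g_d))
print(u_s, u_star)  # both ≈ -52.69
```

The more conductive (reliable) dendrite dominates the fixed point: the somatic potential settles close to -50 mV rather than to the midpoint of the two dendritic potentials.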
In the stationary state, the somatic potential is equal to a weighted combination of the dendritic potentials, similar to the MAP estimate in cue integration scenarios [9]. The biophysics of multi-compartment neuron models with conductance-based synapses hence naturally implement an important probabilistic computation. This makes our model particularly fitting for mixed-signal neuromorphic systems featuring conductance-based interactions [19]. A plasticity rule is obtained by requiring that the Kullback-Leibler divergence D_{KL} between the distribution of somatic potentials and some target distribution is minimized:

\Delta w^{E/I}_{ij} = \eta \left[ (u_{t,i} - \bar{u}_{s,i}) (E^{E/I} - \bar{u}_{s,i}) - \left( (u_{t,i} - \bar{u}_{s,i})^2 - \frac{1}{g_{s,i}} \right) \right] r_j ,    (3)

where u_{t,i} represents a cell-specific sample from the target distribution that can be provided externally [20]. While the first part is a standard error-correcting term [20, 21], the second term arises due to our probabilistic ansatz and performs reliability assignment to the respective projections.

To illustrate our model we apply it to a probabilistic multisensory integration task: the orientation (horizontal or vertical) of a grating is to be estimated from noisy visual (V) and tactile (T) information (Fig. 1b; [12]). For simplicity, each modality is represented by a homogeneous population of von-Mises feature detectors [6] projecting to two multisensory cells. From a ground-truth orientation θ*, two modality-specific orientations θ_V, θ_T are sampled with modality-specific noise amplitudes (σ_V ≪ σ_T) and presented to the respective feature detectors (Fig. 1c). The two outputs are trained to signal whether the orientation is larger or smaller than 45°, respectively.
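The encoding stage of this task can be sketched as follows. Population size, tuning width κ, and the noise amplitudes are illustrative assumptions, not values from the paper; the factor 2 inside the cosine gives the tuning curves the period π appropriate for orientation.

```python
import numpy as np

rng = np.random.default_rng(0)

# A population of von-Mises orientation detectors (illustrative stand-in
# for the feature detectors described above).
n_cells = 16
prefs = np.linspace(0.0, np.pi, n_cells, endpoint=False)
kappa = 4.0  # tuning width (larger kappa -> sharper tuning)

def rates(theta):
    # Peak-normalized von-Mises tuning; maximal response (1.0) when the
    # stimulus orientation matches a cell's preferred orientation.
    return np.exp(kappa * (np.cos(2.0 * (theta - prefs)) - 1.0))

theta_true = np.pi / 2                        # vertical grating
theta_v = theta_true + rng.normal(0.0, 0.05)  # low-noise "visual" sample
theta_t = theta_true + rng.normal(0.0, 0.3)   # high-noise "tactile" sample

r_v, r_t = rates(theta_v), rates(theta_t)
# The most active detector is the one whose preferred orientation lies
# closest to the sampled orientation.
print(prefs[np.argmax(r_v)])
```

Because σ_V ≪ σ_T, the visual population response is, on average, peaked closer to the true orientation than the tactile one, which is precisely the reliability difference the plasticity rule (3) has to discover.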
We compare the performance of the Bayes-optimal maximum-a-posteriori estimate, the naive estimate that equally weights visual and tactile stimuli, and the estimates obtained from the multisensory cells when providing only visual, only tactile, or both visual and tactile input (Fig. 1d). While the naive and single-modality estimates perform significantly worse than the MAP estimate, the multisensory estimate achieves similar error levels, demonstrating that the network has successfully learned the relative reliability of the two information streams and makes use of them to integrate visual and tactile stimuli.

Despite being trained explicitly only for this task, many aspects of experimental observations are reproduced naturally by the resulting network, e.g., tight coupling and stimulus-specific tuning of excitation and inhibition [8], stimulus-specific target potentials [3, 16], the stimulus-driven quenching of variability [2], sub-, supra-, and linear multisensory neural responses [14], the principle of inverse effectiveness [10], and reliability-dependent multisensory tuning [11].

Our model provides a parsimonious implementation of Bayes-optimal cue integration in single neurons by relying on the natural dynamics of multi-compartment neurons with conductance-based dendrites. The associated plasticity rule allows circuits to learn, in a supervised setting, not just to reduce output errors, but also to assign the correct relative reliabilities to different information streams. Similar to previous models [13], a divisive normalization operation [1] is a critical component. The conductance-based nature of synaptic coupling hence may not be purely an artifact of the biological substrate, but may rather enable single neurons to perform important probabilistic computations previously thought to be realized only at the circuit level [13]. In this view, the experimental observations in multisensory integration experiments are signatures of ongoing probabilistic computations.
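The gap between the Bayes-optimal and the unweighted estimate can be reproduced in a simplified linear-Gaussian Monte Carlo setting (ignoring the circularity of orientation; the noise amplitudes are illustrative, chosen only to satisfy σ_V ≪ σ_T as in the task):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative noise amplitudes for the two modalities.
sigma_v, sigma_t = 0.1, 0.5
# Optimal (reliability-weighted) visual weight for two Gaussian cues.
w_v = sigma_t**2 / (sigma_v**2 + sigma_t**2)

n_trials = 100_000
theta = rng.uniform(0.0, np.pi, n_trials)            # ground truth
obs_v = theta + rng.normal(0.0, sigma_v, n_trials)   # visual samples
obs_t = theta + rng.normal(0.0, sigma_t, n_trials)   # tactile samples

map_est = w_v * obs_v + (1.0 - w_v) * obs_t  # reliability-weighted
naive_est = 0.5 * (obs_v + obs_t)            # equal weighting

mse_map = np.mean((map_est - theta) ** 2)
mse_naive = np.mean((naive_est - theta) ** 2)
print(mse_map, mse_naive)  # ≈ 0.0096 vs ≈ 0.065
```

The reliability-weighted estimate reduces the trial-averaged squared error by roughly a factor of seven here, mirroring the qualitative ordering of the loss curves in Fig. 1d: ignoring reliability wastes most of the benefit of having a second cue.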
ACKNOWLEDGMENTS
We gratefully acknowledge funding from the European Union, under grant agreements 604102, 720270, 785907 (HBP) and the Manfred Stärk Foundation.
REFERENCES
[1] Matteo Carandini and David J Heeger. 2012. Normalization as a canonical neural computation. Nature Reviews Neuroscience 13, 1 (2012), 51.
[2] Mark M Churchland, M Yu Byron, John P Cunningham, Leo P Sugrue, Marlene R Cohen, Greg S Corrado, William T Newsome, Andrew M Clark, Paymon Hosseini, Benjamin B Scott, et al. 2010. Stimulus onset quenches neural variability: a widespread cortical phenomenon. Nature Neuroscience 13, 3 (2010), 369.
[3] Sylvain Crochet, James FA Poulet, Yves Kremer, and Carl CH Petersen. 2011. Synaptic mechanisms underlying sparse coding of active touch. Neuron 69, 6 (2011), 1160–1175.
[4] Marc O Ernst and Martin S Banks. 2002. Humans integrate visual and haptic information in a statistically optimal fashion. Nature (2002).
[5] Journal of Neuroscience 29, 49 (2009), 15601–15612.
[6] Andreas VM Herz, Alexander Mathis, and Martin Stemmler. 2017. Periodic population codes: From a single circular variable to higher dimensions, multiple nested scales, and conceptual spaces. Curr. Opin. Neurobiol. 46 (2017), 99–108.
[7] Geoffrey E Hinton. 2002. Training products of experts by minimizing contrastive divergence. Neural Computation 14, 8 (2002), 1771–1800.
[8] Jeffry S Isaacson and Massimo Scanziani. 2011. How inhibition shapes cortical activity. Neuron 72, 2 (2011), 231–243.
[9] David C Knill and Alexandre Pouget. 2004. The Bayesian brain: the role of uncertainty in neural coding and computation. Trends in Neurosciences 27, 12 (2004), 712–719.
[10] M Alex Meredith and Barry E Stein. 1983. Interactions among converging sensory inputs in the superior colliculus. Science (1983).
[11] Neuron.
[12] Neuron 97, 3 (2018), 626–639.
[13] Tomokazu Ohshiro, Dora E Angelaki, and Gregory C DeAngelis. 2011. A normalization model of multisensory integration. Nat. Neurosci. 14, 6 (2011), 775.
[14] Thomas J Perrault Jr, J William Vaughan, Barry E Stein, and Mark T Wallace. 2005. Superior colliculus neurons use distinct operational modes in the integration of multisensory stimuli. Journal of Neurophysiology 93, 5 (2005), 2575–2586.
[15] Rajesh PN Rao and Dana H Ballard. 1999. Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects. Nature Neuroscience 2, 1 (1999).
[16] Shankar Sachidhanandam, Varun Sreenivasan, Alexandros Kyriakatos, Yves Kremer, and Carl CH Petersen. 2013. Membrane potential correlates of sensory perception in mouse barrel cortex. Nat. Neurosci. 16, 11 (2013), 1671–1677.
[17] João Sacramento, Rui Ponte Costa, Yoshua Bengio, and Walter Senn. 2018. Dendritic cortical microcircuits approximate the backpropagation algorithm. In Advances in Neural Information Processing Systems. 8721–8732.
[18] Benjamin Scellier and Yoshua Bengio. 2017. Equilibrium propagation: Bridging the gap between energy-based models and backpropagation. Front. Comput. Neurosci. 11 (2017), 24.
[19] Johannes Schemmel, Daniel Brüderle, Andreas Grübl, Matthias Hock, Karlheinz Meier, and Sebastian Millner. 2010. A wafer-scale neuromorphic hardware system for large-scale neural modeling. In Proceedings of 2010 IEEE International Symposium on Circuits and Systems. IEEE, 1947–1950.
[20] Robert Urbanczik and Walter Senn. 2014. Learning by the dendritic prediction of somatic spiking. Neuron 81, 3 (2014), 521–528.
[21] Bernard Widrow and Marcian E Hoff. 1960.