Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Biswa Sengupta is active.

Publication


Featured research published by Biswa Sengupta.


PLOS Computational Biology | 2010

Action potential energy efficiency varies among neuron types in vertebrates and invertebrates

Biswa Sengupta; Martin Stemmler; Simon B. Laughlin; Jeremy E. Niven

The initiation and propagation of action potentials (APs) places high demands on the energetic resources of neural tissue. Each AP forces ATP-driven ion pumps to work harder to restore the ionic concentration gradients, thus consuming more energy. Here, we ask whether the ionic currents underlying the AP can be predicted theoretically from the principle of minimum energy consumption. A long-held supposition that APs are energetically wasteful, based on theoretical analysis of the squid giant axon AP, has recently been overturned by studies that measured the currents contributing to the AP in several mammalian neurons. In the single compartment models studied here, AP energy consumption varies greatly among vertebrate and invertebrate neurons, with several mammalian neuron models using close to the capacitive minimum of energy needed. Strikingly, energy consumption can increase by more than ten-fold simply by changing the overlap of the Na+ and K+ currents during the AP without changing the AP's shape. As a consequence, the height and width of the AP are poor predictors of energy consumption. In the Hodgkin–Huxley model of the squid axon, optimizing the kinetics or number of Na+ and K+ channels can whittle down the number of ATP molecules needed for each AP by a factor of four. In contrast to the squid AP, the temporal profile of the currents underlying APs of some mammalian neurons is nearly perfectly matched to the optimized properties of ionic conductances so as to minimize the ATP cost.
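
The central quantity here, the ATP cost of one action potential, can be estimated from a simulation by integrating the Na+ charge that enters during the spike and assuming the Na+/K+ pump extrudes three Na+ ions per ATP. The sketch below (Python) does this for the textbook Hodgkin-Huxley squid-axon model; the stimulus, membrane area, and the 100 mV figure used for the capacitive minimum are illustrative assumptions, not values taken from the paper.

    # Illustrative sketch: ATP cost of one action potential in a standard
    # Hodgkin-Huxley squid-axon model, estimated from the integrated Na+ influx
    # and the 3 Na+ per ATP stoichiometry of the Na+/K+ pump.
    import numpy as np

    C = 1.0                                  # membrane capacitance, uF/cm^2
    gNa, gK, gL = 120.0, 36.0, 0.3           # peak conductances, mS/cm^2
    ENa, EK, EL = 50.0, -77.0, -54.4         # reversal potentials, mV

    def a_m(V): return 0.1*(V + 40)/(1 - np.exp(-(V + 40)/10))
    def b_m(V): return 4.0*np.exp(-(V + 65)/18)
    def a_h(V): return 0.07*np.exp(-(V + 65)/20)
    def b_h(V): return 1.0/(1 + np.exp(-(V + 35)/10))
    def a_n(V): return 0.01*(V + 55)/(1 - np.exp(-(V + 55)/10))
    def b_n(V): return 0.125*np.exp(-(V + 65)/80)

    dt, T = 0.01, 20.0                       # time step and duration, ms
    V, m, h, n = -65.0, 0.053, 0.596, 0.318  # resting state
    Q_Na = 0.0                               # integrated Na+ influx, nC/cm^2

    for i in range(int(T/dt)):
        t = i*dt
        I_ext = 30.0 if 1.0 <= t < 2.0 else 0.0   # brief suprathreshold pulse, uA/cm^2
        I_Na = gNa*m**3*h*(V - ENa)
        I_K  = gK*n**4*(V - EK)
        I_L  = gL*(V - EL)
        Q_Na += -I_Na*dt                          # inward current is negative
        V += dt*(I_ext - I_Na - I_K - I_L)/C
        m += dt*(a_m(V)*(1 - m) - b_m(V)*m)
        h += dt*(a_h(V)*(1 - h) - b_h(V)*h)
        n += dt*(a_n(V)*(1 - n) - b_n(V)*n)

    Q_min = C*100.0                          # charge to move the membrane ~100 mV, nC/cm^2
    e_charge = 1.602e-19                     # elementary charge, C
    area = 1e-5                              # nominal 1000 um^2 of membrane, in cm^2
    atp = Q_Na*1e-9*area/(3*e_charge)        # 3 Na+ pumped out per ATP hydrolysed
    print(f"Na+ charge per AP: {Q_Na:.0f} nC/cm^2 (~{Q_Na/Q_min:.1f}x the capacitive minimum)")
    print(f"ATP per AP for 1000 um^2 of membrane: {atp:.2e} molecules")

The ratio Q_Na/Q_min is the excess factor the abstract refers to: for the standard squid parameters it comes out at roughly four (the integration also picks up a small contribution from the resting Na+ current), and reducing the overlap of the Na+ and K+ currents lowers it.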


Journal of the Royal Society Interface | 2015

Knowing one's place: a free-energy approach to pattern regulation.

K. J. Friston; Michael Levin; Biswa Sengupta; Giovanni Pezzulo

Understanding how organisms establish their form during embryogenesis and regeneration represents a major knowledge gap in biological pattern formation. It has been recently suggested that morphogenesis could be understood in terms of cellular information processing and the ability of cell groups to model shape. Here, we offer a proof of principle that self-assembly is an emergent property of cells that share a common (genetic and epigenetic) model of organismal form. This behaviour is formulated in terms of variational free-energy minimization—of the sort that has been used to explain action and perception in neuroscience. In brief, casting the minimization of thermodynamic free energy in terms of variational free energy allows one to interpret (the dynamics of) a system as inferring the causes of its inputs—and acting to resolve uncertainty about those causes. This novel perspective on the coordination of migration and differentiation of cells suggests an interpretation of genetic codes as parametrizing a generative model—predicting the signals sensed by cells in the target morphology—and epigenetic processes as the subsequent inversion of that model. This theoretical formulation may complement bottom-up strategies—that currently focus on molecular pathways—with (constructivist) top-down approaches that have proved themselves in neuroscience and cybernetics.
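
For reference, the variational free energy being minimised here is a functional of a recognition density q(s) over hidden states (in this setting, the hidden causes of the signals a cell senses) and an observation o; its standard decompositions, which are background rather than results of the paper, are

    F(o, q) = \mathbb{E}_{q(s)}[\ln q(s) - \ln p(o, s)]
            = D_{KL}[q(s) \,\|\, p(s \mid o)] - \ln p(o)
            = D_{KL}[q(s) \,\|\, p(s)] - \mathbb{E}_{q(s)}[\ln p(o \mid s)]

Minimising F with respect to q drives q towards the posterior p(s | o) (inference), while minimising it through action makes sensed signals conform to predictions; in the paper the genetic code parametrises the generative model p and epigenetic processes correspond to its inversion by each cell, which underwrites the coordinated migration and differentiation described above.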


PLOS Computational Biology | 2013

Information and efficiency in the nervous system – a synthesis

Biswa Sengupta; Martin Stemmler; K. J. Friston

In systems biology, questions concerning the molecular and cellular makeup of an organism are of utmost importance, especially when trying to understand how unreliable components—like genetic circuits, biochemical cascades, and ion channels, among others—enable reliable and adaptive behaviour. The repertoire and speed of biological computations are limited by thermodynamic or metabolic constraints: an example can be found in neurons, where fluctuations in biophysical states limit the information they can encode—with roughly 20–60% of the brain's total energy budget used for signalling purposes, either via action potentials or by synaptic transmission. Here, we consider the imperatives for neurons to optimise computational and metabolic efficiency, wherein benefits and costs trade off against each other in the context of self-organised and adaptive behaviour. In particular, we try to link information theoretic (variational) and thermodynamic (Helmholtz) free-energy formulations of neuronal processing and show how they are related in a fundamental way through a complexity minimisation lemma.
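
The formal link alluded to in the last sentence can be previewed compactly (these are standard identities and bounds, not the paper's complexity-minimisation lemma itself): variational free energy has the same energy-minus-entropy form as the Helmholtz free energy, and Landauer's principle bounds the thermodynamic cost of information processing from below.

    A = U - TS                                                  (Helmholtz free energy)
    F = \mathbb{E}_{q(s)}[-\ln p(o, s)] - H[q(s)]               (variational free energy)
    W_{min} \geq k_B T \ln 2   per bit irreversibly erased      (Landauer bound)

Here the expected surprise \mathbb{E}_{q}[-\ln p(o, s)] plays the role of internal energy and the entropy H[q] of the recognition density plays the role of thermodynamic entropy, which is the sense in which the two free energies can be placed in register.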


PLOS Biology | 2016

Towards a Neuronal Gauge Theory

Biswa Sengupta; Arturo Tozzi; Gerald K. Cooray; Pamela K. Douglas; K. J. Friston

Given the amount of knowledge and data accruing in the neurosciences, is it time to formulate a general principle for neuronal dynamics that holds at evolutionary, developmental, and perceptual timescales? In this paper, we propose that the brain (and other self-organised biological systems) can be characterised via the mathematical apparatus of a gauge theory. The picture that emerges from this approach suggests that any biological system (from a neuron to an organism) can be cast as resolving uncertainty about its external milieu, either by changing its internal states or its relationship to the environment. Using formal arguments, we show that a gauge theory for neuronal dynamics—based on approximate Bayesian inference—has the potential to shed new light on phenomena that have thus far eluded a formal description, such as attention and the link between action and perception.
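
One standard way to make the underlying approximate Bayesian inference geometric, offered here as context rather than as the paper's own construction, is to treat the sufficient statistics \theta of the recognition density q(s | \theta) as coordinates on a statistical manifold and let them follow a natural-gradient flow on free energy, with the Fisher information as the metric:

    \dot{\theta} = -G(\theta)^{-1} \nabla_{\theta} F(\theta), \qquad
    G_{ij}(\theta) = \mathbb{E}_{q(s \mid \theta)}\big[\partial_{\theta_i} \ln q(s \mid \theta)\; \partial_{\theta_j} \ln q(s \mid \theta)\big]

Because the Fisher metric transforms covariantly, this flow does not depend on how the internal states are parametrised, which is one sense in which such schemes acquire a gauge-theoretic flavour.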


PLOS Computational Biology | 2013

Balanced Excitatory and Inhibitory Synaptic Currents Promote Efficient Coding and Metabolic Efficiency

Biswa Sengupta; Simon B. Laughlin; Jeremy E. Niven

A balance between excitatory and inhibitory synaptic currents is thought to be important for several aspects of information processing in cortical neurons in vivo, including gain control, bandwidth and receptive field structure. These factors will affect the firing rate of cortical neurons and their reliability, with consequences for their information coding and energy consumption. Yet how balanced synaptic currents contribute to the coding efficiency and energy efficiency of cortical neurons remains unclear. We used single compartment computational models with stochastic voltage-gated ion channels to determine whether synaptic regimes that produce balanced excitatory and inhibitory currents have specific advantages over other input regimes. Specifically, we compared models with only excitatory synaptic inputs to those with equal excitatory and inhibitory conductances, and stronger inhibitory than excitatory conductances (i.e. approximately balanced synaptic currents). Using these models, we show that balanced synaptic currents evoke fewer spikes per second than excitatory inputs alone or equal excitatory and inhibitory conductances. However, spikes evoked by balanced synaptic inputs are more informative (bits/spike), so that spike trains evoked by all three regimes have similar information rates (bits/s). Consequently, because spikes dominate the energy consumption of our computational models, approximately balanced synaptic currents are also more energy efficient than other synaptic regimes. Thus, by producing fewer, more informative spikes approximately balanced synaptic currents in cortical neurons can promote both coding efficiency and energy efficiency.
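
A much-simplified way to see the first effect (fewer spikes under balanced input for a comparable excitatory drive) is a conductance-based integrate-and-fire neuron driven by Poisson synaptic input. The following Python sketch is a toy, not the stochastic Hodgkin-Huxley models used in the study, and every rate and conductance value in it is an arbitrary assumption.

    # Toy comparison of excitation-only versus approximately balanced synaptic input
    # in a conductance-based leaky integrate-and-fire neuron.
    import numpy as np

    rng = np.random.default_rng(0)
    dt, T = 1e-4, 5.0                      # time step and duration, s
    C, gL, EL = 250e-12, 10e-9, -70e-3     # capacitance (F), leak (S), rest (V)
    Ee, Ei = 0.0, -80e-3                   # synaptic reversal potentials, V
    Vth, Vreset = -50e-3, -65e-3           # spike threshold and reset, V
    tau_syn = 5e-3                         # synaptic decay time constant, s

    def spike_count(rate_e, rate_i, we=0.5e-9, wi=2.0e-9):
        """Spikes in T seconds for given excitatory/inhibitory Poisson rates (Hz)."""
        V, ge, gi, spikes = EL, 0.0, 0.0, 0
        for _ in range(int(T/dt)):
            ge += we*rng.poisson(rate_e*dt) - dt*ge/tau_syn
            gi += wi*rng.poisson(rate_i*dt) - dt*gi/tau_syn
            V += dt*(gL*(EL - V) + ge*(Ee - V) + gi*(Ei - V))/C
            if V >= Vth:
                V, spikes = Vreset, spikes + 1
        return spikes

    print("excitation only :", spike_count(rate_e=3000.0, rate_i=0.0))
    print("balanced E and I:", spike_count(rate_e=3000.0, rate_i=1500.0))

With these assumed parameters the balanced run evokes markedly fewer spikes than the excitation-only run; whether those spikes each carry more bits, and what they cost, requires the information-theoretic and metabolic analysis in the paper.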


Journal of Cerebral Blood Flow and Metabolism | 2013

The effect of cell size and channel density on neuronal information encoding and energy efficiency.

Biswa Sengupta; A. Aldo Faisal; Simon B. Laughlin; Jeremy E. Niven

Identifying the determinants of neuronal energy consumption and their relationship to information coding is critical to understanding neuronal function and evolution. Three of the main determinants are cell size, ion channel density, and stimulus statistics. Here we investigate their impact on neuronal energy consumption and information coding by comparing single-compartment spiking neuron models of different sizes with different densities of stochastic voltage-gated Na+ and K+ channels and different statistics of synaptic inputs. The largest compartments have the highest information rates but the lowest energy efficiency for a given voltage-gated ion channel density, and the highest signaling efficiency (bits/spike) for a given firing rate. For a given cell size, our models revealed that the ion channel density that maximizes energy efficiency is lower than that maximizing information rate. Low rates of small synaptic inputs improve energy efficiency but the highest information rates occur with higher rates and larger inputs. These relationships produce a Law of Diminishing Returns that penalizes costly excess information coding capacity, promoting the reduction of cell size, channel density, and input stimuli to the minimum possible, suggesting that the trade-off between energy and information has influenced all aspects of neuronal anatomy and physiology.
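
The "Law of Diminishing Returns" can be illustrated with a deliberately crude toy in which information rate grows only logarithmically with channel number while cost grows roughly linearly; the functional forms and constants below are assumptions chosen purely for illustration.

    # Toy law of diminishing returns: the channel number maximising bits per unit
    # cost is much lower than the channel number maximising the information rate.
    import numpy as np

    N = np.arange(1, 2001)              # number of voltage-gated channels
    info = 0.5*np.log2(1.0 + 0.05*N)    # toy information rate (sublinear in N)
    cost = 1.0 + 0.01*N                 # toy energy cost (fixed plus per-channel term)
    efficiency = info/cost

    print("N maximising information rate :", N[np.argmax(info)])
    print("N maximising energy efficiency:", N[np.argmax(efficiency)])

The efficiency optimum sits well below the information-rate optimum, mirroring the finding above that the channel density maximising energy efficiency is lower than the density maximising information rate.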


Proceedings of the IEEE | 2014

Power Consumption During Neuronal Computation

Biswa Sengupta; Martin Stemmler

Maintaining the ability of the nervous system to perceive, remember, process, and react to the outside world requires a continuous energy supply. Yet the overall power consumption is remarkably low, which has inspired engineers to mimic nervous systems in designing artificial cochleae, retinal implants, and brain-computer interfaces (BCIs) to improve the quality of life in patients. Such neuromorphic devices are both energy efficient and increasingly able to emulate many functions of the human nervous system. We examine the energy constraints of neuronal signaling within biology, review the quantitative tradeoff between energy use and information processing, and ask whether the biophysics and design of nerve cells minimize energy consumption.


Proceedings of the IEEE | 2014

Cognitive Dynamics: From Attractors to Active Inference

K. J. Friston; Biswa Sengupta; Gennaro Auletta

This paper combines recent formulations of self-organization and neuronal processing to provide an account of cognitive dynamics from basic principles. We start by showing that inference (and autopoiesis) are emergent features of any (weakly mixing) ergodic random dynamical system. We then apply the emergent dynamics to action and perception in a way that casts action as the fulfillment of (Bayesian) beliefs about the causes of sensations. More formally, we formulate ergodic flows on global random attractors as a generalized descent on a free energy functional of the internal states of a system. This formulation rests on a partition of states based on a Markov blanket that separates internal states from hidden states in the external milieu. This separation means that the internal states effectively represent external states probabilistically. The generalized descent is then related to classical Bayesian (e.g., Kalman-Bucy) filtering and predictive coding, of the sort that might be implemented in the brain. Finally, we present two simulations. The first simulates a primordial soup to illustrate the emergence of a Markov blanket and (active) inference about hidden states. The second uses the same emergent dynamics to simulate action and action observation.
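
The generalized descent invoked here is usually written in the following form (quoted from the wider active-inference literature rather than from this paper's own equations), where \tilde{\mu} are internal states in generalized coordinates of motion, a are active states, s are sensory states, F is the free-energy functional, and D is the derivative (shift) operator on generalized coordinates:

    \dot{\tilde{\mu}} = D\tilde{\mu} - \partial_{\tilde{\mu}} F(s, \tilde{\mu}), \qquad
    \dot{a} = -\partial_{a} F(s, \tilde{\mu})

The first equation is a gradient descent in a frame of reference that moves with the generalized motion of the states, so that at its fixed point the internal states track the mode of the posterior; under linear Gaussian assumptions it reduces to Kalman-Bucy filtering, which is the link to classical Bayesian filtering and predictive coding mentioned above.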


Physical Review E | 2010

Comparison of Langevin and Markov channel noise models for neuronal signal generation

Biswa Sengupta; Simon B. Laughlin; Jeremy E. Niven

The stochastic opening and closing of voltage-gated ion channels produce noise in neurons. The effect of this noise on neuronal performance has been modeled using either an approximate (Langevin) model based on stochastic differential equations or an exact model based on a Markov process description of channel gating. Yet whether the Langevin model accurately reproduces the channel noise produced by the Markov model remains unclear. Here we present a comparison between Langevin and Markov models of channel noise in neurons using single compartment Hodgkin-Huxley models containing either Na+ and K+, or only K+, voltage-gated ion channels. The performance of the Langevin and Markov models was quantified over a range of stimulus statistics, membrane areas, and channel numbers. We find that in comparison to the Markov model, the Langevin model underestimates the noise contributed by voltage-gated ion channels, overestimating information rates for both spiking and nonspiking membranes. Even with increasing numbers of channels, the difference between the two models persists. This suggests that the Langevin model may not be suitable for accurately simulating channel noise in neurons, even in simulations with large numbers of ion channels.
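
The comparison can be reproduced in miniature for a population of identical two-state channels held at a fixed voltage: simulate the exact birth-death (Markov) process with binomially distributed transition counts and, in parallel, its Langevin diffusion approximation. The rates, channel count, and time step below are arbitrary assumptions, and this toy omits the voltage-gated Na+ and K+ kinetics that drive the differences reported in the paper.

    # Markov (binomial transition counts) versus Langevin (diffusion approximation)
    # simulation of N identical two-state channels at a fixed voltage.
    import numpy as np

    rng = np.random.default_rng(1)
    alpha, beta = 100.0, 50.0             # opening and closing rates, 1/s
    N = 200                               # number of channels
    dt, steps = 1e-5, 100_000             # time step (s) and number of steps

    n_open = int(N*alpha/(alpha + beta))  # Markov state: number of open channels
    p = alpha/(alpha + beta)              # Langevin state: open fraction
    markov, langevin = np.empty(steps), np.empty(steps)

    for t in range(steps):
        # Markov: exact jump process, one binomial draw per transition type
        opened = rng.binomial(N - n_open, alpha*dt)
        closed = rng.binomial(n_open, beta*dt)
        n_open += opened - closed
        # Langevin: drift plus state-dependent diffusion for the open fraction
        drift = alpha*(1 - p) - beta*p
        sigma = np.sqrt(max(alpha*(1 - p) + beta*p, 0.0)/N)
        p = min(max(p + drift*dt + sigma*np.sqrt(dt)*rng.standard_normal(), 0.0), 1.0)
        markov[t], langevin[t] = n_open/N, p

    print(f"std of open fraction  Markov: {markov.std():.4f}  Langevin: {langevin.std():.4f}")

For this two-state case the two fluctuation estimates agree closely; the underestimation of channel noise by the Langevin scheme reported in the paper emerges once realistic multi-state, voltage-dependent gating and spiking dynamics are included.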


PLOS Computational Biology | 2014

Ten Simple Rules for Effective Computational Research

James M. Osborne; Miguel O. Bernabeu; Maria Bruna; Ben Calderhead; Jonathan Cooper; Neil Dalchau; Sara-Jane Dunn; Alexander G. Fletcher; Robin Freeman; Derek Groen; Bernhard Knapp; Greg J. McInerny; Gary R. Mirams; Joe Pitt-Francis; Biswa Sengupta; David W. Wright; Christian A. Yates; David J. Gavaghan; Stephen Emmott; Charlotte M. Deane

In order to attempt to understand the complexity inherent in nature, mathematical, statistical and computational techniques are increasingly being employed in the life sciences. In particular, the use and development of software tools is becoming vital for investigating scientific hypotheses, and a wide range of scientists are finding software development playing a more central role in their day-to-day research. In fields such as biology and ecology, there has been a noticeable trend towards the use of quantitative methods for both making sense of ever-increasing amounts of data [1] and building or selecting models [2]. As Research Fellows of the “2020 Science” project (http://www.2020science.net), funded jointly by the EPSRC (Engineering and Physical Sciences Research Council) and Microsoft Research, we have firsthand experience of the challenges associated with carrying out multidisciplinary computation-based science [3]–[5]. In this paper we offer a jargon-free guide to best practice when developing and using software for scientific research. While many guides to software development exist, they are often aimed at computer scientists [6] or concentrate on large open-source projects [7]; the present guide is aimed specifically at the vast majority of scientific researchers: those without formal training in computer science. We present our ten simple rules with the aim of enabling scientists to be more effective in undertaking research and therefore maximise the impact of this research within the scientific community. While these rules are described individually, collectively they form a single vision for how to approach the practical side of computational science. Our rules are presented in roughly the chronological order in which they should be undertaken, beginning with things that, as a computational scientist, you should do before you even think about writing any code. For each rule, guides on getting started, links to relevant tutorials, and further reading are provided in the supplementary material (Text S1).

Collaboration


Dive into Biswa Sengupta's collaborations.

Top Co-Authors

K. J. Friston (University College London)
Gerald K. Cooray (Karolinska University Hospital)
William D. Penny (Wellcome Trust Centre for Neuroimaging)
Marita Englund (Karolinska University Hospital)
Arturo Tozzi (University of North Texas)