
Publication


Featured research published by Benjamin Torben-Nielsen.


Frontiers in Neuroinformatics | 2013

Self-referential forces are sufficient to explain different dendritic morphologies

Heraldo Memelli; Benjamin Torben-Nielsen; James R. Kozloski

Dendritic morphology constrains brain activity, as it determines first which neuronal circuits are possible and second which dendritic computations can be performed over a neuron's inputs. It is known that a range of chemical cues can influence the final shape of dendrites during development. Here, we investigate the extent to which self-referential influences, cues generated by the neuron itself, might influence morphology. To this end, we developed a phenomenological model and algorithm to generate virtual morphologies, which are then compared to experimentally reconstructed morphologies. In the model, branching probability follows a Galton–Watson process, while the geometry is determined by “homotypic forces” exerting influence on the direction of random growth in a constrained space. We model three such homotypic forces, namely an inertial force based on membrane stiffness, a soma-oriented tropism, and a force of self-avoidance, as directional biases in the growth algorithm. With computer simulations we explored how each bias shapes neuronal morphologies. We show that based on these principles, we can generate realistic morphologies of several distinct neuronal types. We discuss the extent to which homotypic forces might influence real dendritic morphologies, and speculate about the influence of other environmental cues on neuronal shape and circuitry.
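
For intuition only, here is a minimal 2D sketch of growth biased by the three homotypic forces named above (inertia, soma tropism, self-avoidance), with Galton-Watson-style branching. It is not the authors' algorithm; all weights, step sizes, and probabilities are made-up illustrative values.

```python
# Illustrative toy only: 2D growth guided by "homotypic forces" (inertia, soma tropism,
# self-avoidance) with Galton-Watson-style branching. Parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def unit(v):
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def grow_tree(n_steps=60, p_branch=0.05, step=1.0,
              w_inertia=1.0, w_soma=0.3, w_avoid=0.5):
    soma = np.zeros(2)
    cones = [(soma.copy(), np.array([0.0, 1.0]))]   # growth cones: (position, previous direction)
    points = [soma.copy()]                          # all visited points, used for self-avoidance
    segments = []
    for _ in range(n_steps):
        new_cones = []
        for pos, prev_dir in cones:
            # self-avoidance: net push away from previously laid-down points
            diffs = pos - np.array(points)
            dists = np.linalg.norm(diffs, axis=1) + 1e-9
            avoid = unit((diffs / dists[:, None]).sum(axis=0))
            # soma-oriented tropism: bias relative to the soma position
            soma_dir = unit(pos - soma) if np.linalg.norm(pos - soma) > 0 else prev_dir
            noise = unit(rng.normal(size=2))
            direction = unit(w_inertia * prev_dir + w_soma * soma_dir
                             + w_avoid * avoid + 0.5 * noise)
            new_pos = pos + step * direction
            segments.append((pos.copy(), new_pos.copy()))
            points.append(new_pos.copy())
            # Galton-Watson-like branching: split into two cones with fixed probability
            n_children = 2 if rng.random() < p_branch else 1
            for _ in range(n_children):
                new_cones.append((new_pos.copy(), direction.copy()))
        cones = new_cones
    return segments

print(f"generated {len(grow_tree())} segments")
```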


Nature Neuroscience | 2015

Centrosomin represses dendrite branching by orienting microtubule nucleation

Cagri Yalgin; Saman Ebrahimi; Caroline Delandre; Li Foong Yoong; Saori Akimoto; Heidi Tran; Reiko Amikura; Rebecca Spokony; Benjamin Torben-Nielsen; Kevin P. White; Adrian W. Moore

Neuronal dendrite branching is fundamental for building nervous systems. Branch formation is genetically encoded by transcriptional programs to create dendrite arbor morphological diversity for complex neuronal functions. In Drosophila sensory neurons, the transcription factor Abrupt represses branching via an unknown effector pathway. Targeted screening for branching-control effectors identified Centrosomin, the primary centrosome-associated protein for mitotic spindle maturation. Centrosomin repressed dendrite branch formation and was used by Abrupt to simplify arbor branching. Live imaging revealed that Centrosomin localized to the Golgi cis face and that it recruited microtubule nucleation to Golgi outposts for net retrograde microtubule polymerization away from nascent dendrite branches. Removal of Centrosomin enabled the engagement of wee Augmin activity to promote anterograde microtubule growth into the nascent branches, leading to increased branching. The findings reveal that polarized targeting of Centrosomin to Golgi outposts during elaboration of the dendrite arbor creates a local system for guiding microtubule polymerization.


Frontiers in Computational Neuroscience | 2010

An Inverse Approach for Elucidating Dendritic Function

Benjamin Torben-Nielsen; Klaus M. Stiefel

We outline an inverse approach for investigating dendritic function–structure relationships by optimizing dendritic trees for a priori chosen computational functions. The inverse approach can be applied in two different ways. First, we can use it as a “hypothesis generator” in which we optimize dendrites for a function of general interest. The optimization yields an artificial dendrite that is subsequently compared to real neurons. This comparison potentially allows us to propose hypotheses about the function of real neurons. In this way, we investigated dendrites that optimally perform input-order detection. Second, we can use it for “function confirmation” by optimizing dendrites for functions hypothesized to be performed by classes of neurons. If the optimized artificial dendrites resemble the dendrites of real neurons, this corroborates the hypothesized function of the real neuron. Moreover, properties of the artificial dendrites can lead to predictions about yet unmeasured properties. In this way, we investigated wide-field motion integration performed by the VS cells of the fly visual system. In outlining the inverse approach and two applications, we also elaborate on the nature of dendritic function. We furthermore discuss the role of optimality in assigning functions to dendrites and point out interesting future directions.
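
As a rough illustration of the inverse-approach loop, the sketch below optimizes a small parameter vector with a simple evolutionary strategy. The fitness function is a stand-in (in the actual studies it would come from simulating the candidate dendrite's response to the two input orders); every name and value here is hypothetical.

```python
# Sketch of the inverse-approach loop under strong assumptions: the dendrite is reduced to a
# small parameter vector and the electrophysiological simulation is replaced by a stand-in
# fitness function. Everything named here is hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def input_order_fitness(params):
    # Placeholder: stands in for "difference in response to input order A-B vs B-A".
    # A real implementation would build a morphology from `params` and simulate it.
    target = np.array([2.0, 0.5, 1.5, 3.0])
    return -np.sum((params - target) ** 2)

def optimize(n_params=4, pop_size=20, n_generations=50, sigma=0.3):
    pop = rng.normal(size=(pop_size, n_params))
    for _ in range(n_generations):
        scores = np.array([input_order_fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]     # keep the best half
        children = parents + rng.normal(scale=sigma, size=parents.shape)
        pop = np.vstack([parents, children])                   # elitist (mu + lambda) step
    return pop[np.argmax([input_order_fitness(p) for p in pop])]

best = optimize()
print("optimized dendrite parameters (toy):", best)
```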


Frontiers in Neuroinformatics | 2010

A comparison of methods to determine neuronal phase-response curves

Benjamin Torben-Nielsen; Marylka Uusisaari; Klaus M. Stiefel

The phase-response curve (PRC) is an important tool to determine the excitability type of single neurons, which has consequences for their synchronization properties. We review five methods to compute the PRC from both model data and experimental data and compare the numerically obtained results from each method. The main difference between the methods lies in their reliability, which is influenced by fluctuations in the spiking data and by the number of spikes available for analysis. We discuss the significance of our results and provide guidelines for choosing the best method based on the available data.
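
One standard way to obtain a PRC is the direct method: perturb an oscillating neuron at different phases and measure the resulting shift in spike timing. The sketch below applies this to a toy quadratic integrate-and-fire oscillator; it illustrates the method in general, not the specific models or data used in the paper.

```python
# Hedged sketch of the "direct method" for estimating a PRC: deliver a brief perturbation at
# different phases of the cycle and measure the resulting phase shift. The neuron is a simple
# quadratic integrate-and-fire stand-in, not any of the models from the paper.
import numpy as np

dt = 0.01

def qif_period(I=1.0, pulse_time=None, pulse_amp=0.0, v_reset=-10.0, v_spike=10.0):
    """Time of the first spike of a quadratic integrate-and-fire neuron (Euler integration)."""
    v, t = v_reset, 0.0
    while v < v_spike and t < 100.0:
        pulsing = pulse_time is not None and abs(t - pulse_time) < dt / 2
        inp = I + (pulse_amp / dt if pulsing else 0.0)   # one-step pulse carrying charge pulse_amp
        v += dt * (v * v + inp)
        t += dt
    return t

T0 = qif_period()                                   # unperturbed period
phases = np.linspace(0.05, 0.95, 19)
for phi in phases:
    T = qif_period(pulse_time=phi * T0, pulse_amp=0.2)
    advance = (T0 - T) / T0                         # normalized phase advance caused by the pulse
    print(f"phase {phi:.2f}: advance {advance:+.3f}")
```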


PLOS Computational Biology | 2012

The generation of phase differences and frequency changes in a network model of inferior olive subthreshold oscillations

Benjamin Torben-Nielsen; Idan Segev; Yosef Yarom

It is commonly accepted that the Inferior Olive (IO) provides a timing signal to the cerebellum. Stable subthreshold oscillations in the IO can facilitate accurate timing by phase-locking spikes to the peaks of the oscillation. Several theoretical models accounting for the synchronized subthreshold oscillations have been proposed; however, two experimental observations remain an enigma. The first is the observation of frequent alterations in the frequency of the oscillations. The second is the observation of constant phase differences between simultaneously recorded neurons. In order to account for these two observations, we constructed a canonical network model based on anatomical and physiological data from the IO. The constructed network is characterized by clustering of neurons with similar conductance densities, and by electrical coupling between neurons. Neurons inside a cluster are densely connected with weak strengths, while neurons belonging to different clusters are sparsely connected with stronger connections. We found that this type of network can robustly display stable subthreshold oscillations. The overall frequency of the network changes with the strength of the inter-cluster connections, and phase differences occur between neurons of different clusters. Moreover, the phase differences provide a mechanistic explanation for the experimentally observed propagating waves of activity in the IO. We conclude that the architecture of the network of electrically coupled neurons, in combination with modulation of the inter-cluster coupling strengths, can account for the experimentally observed frequency changes and the phase differences.
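
The clustered coupling architecture can be caricatured with simple phase oscillators, as in the sketch below: dense but weak coupling within clusters, sparse but stronger coupling between them, and a readout of the phase offset between the two cluster means. The paper itself uses conductance-based olivary neurons, so this is only a toy analogue with made-up numbers.

```python
# Toy illustration only: clustered electrical coupling sketched with Kuramoto-style phase
# oscillators. Intra-cluster coupling is dense and weak; inter-cluster coupling is sparse and
# stronger. All values are invented.
import numpy as np

rng = np.random.default_rng(2)
n_per_cluster, n_clusters = 10, 2
N = n_per_cluster * n_clusters
cluster = np.repeat(np.arange(n_clusters), n_per_cluster)

# Build the coupling matrix
K = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1, N):
        if cluster[i] == cluster[j]:
            K[i, j] = K[j, i] = 0.02               # dense, weak
        elif rng.random() < 0.1:
            K[i, j] = K[j, i] = 0.3                # sparse, strong
omega = 2 * np.pi * (1.0 + 0.05 * rng.normal(size=N))   # intrinsic frequencies near 1 Hz

theta = 2 * np.pi * rng.random(N)
dt, T = 0.001, 20.0
for _ in range(int(T / dt)):
    coupling = (K * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta += dt * (omega + coupling)

mean_phase = [np.angle(np.exp(1j * theta[cluster == c]).mean()) for c in range(n_clusters)]
print("phase offset between cluster means (rad):",
      float(np.angle(np.exp(1j * (mean_phase[0] - mean_phase[1])))))
```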


Network: Computation In Neural Systems | 2009

Systematic mapping between dendritic function and structure

Benjamin Torben-Nielsen; Klaus M. Stiefel

For many classes of neurons, the relationship between computational function and dendritic morphology remains unclear. To gain insights into this relationship, we utilize an inverse approach in which we optimize model neurons with realistic morphologies and ion channel distributions (of I_KA and I_CaT) to perform a computational function. In this study, the desired function is input-order detection: neurons have to respond differentially to the arrival of two inputs in a different temporal order. There is a single free parameter in this function, namely, the time lag between the arrivals of the two inputs. Systematically varying this parameter allowed us to map one axis of function space to structure space. Because the function of the optimized model neurons is known with certainty, their thorough analysis provides insights into the relationship between the neurons’ functions, morphologies, ion channel distributions, and electrophysiological dynamics. Finally, we discuss issues of optimality in nervous systems.


Frontiers in Neuroanatomy | 2014

Context-aware modeling of neuronal morphologies

Benjamin Torben-Nielsen; Erik De Schutter

Neuronal morphologies are pivotal for brain functioning: physical overlap between dendrites and axons constrains the circuit topology, and the precise shape and composition of dendrites determine the integration of inputs to produce an output signal. At the same time, morphologies are highly diverse and variable. The variance, presumably, originates from neurons developing in a densely packed brain substrate where they interact (e.g., through repulsion or attraction) with other actors in this substrate. However, when neurons are studied, their context is never part of the analysis and they are treated as if they existed in isolation. Here we argue that to fully understand neuronal morphology and its variance it is important to consider neurons in relation to each other and to other actors in the surrounding brain substrate, i.e., their context. We propose a context-aware computational framework, NeuroMaC, in which large numbers of neurons can be grown simultaneously according to growth rules expressed in terms of interactions between the developing neuron and the surrounding brain substrate. As a proof of principle, we demonstrate that by using NeuroMaC we can generate accurate virtual morphologies of distinct classes both in isolation and as part of neuronal forests. Accuracy is validated against population statistics of experimentally reconstructed morphologies. We show that context-aware generation of neurons can explain characteristics of variation. Indeed, plausible variation is an inherent property of the morphologies generated by context-aware rules. We speculate about the applicability of this framework to investigate morphologies and circuits, to classify healthy and pathological morphologies, and to generate large quantities of morphologies for large-scale modeling.
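
A minimal sketch of the context-aware idea, under strong simplifications: several growth fronts advance through a shared 2D occupancy grid, and each step is biased away from space already claimed by any neuron. This illustrates growth rules expressed as interactions with the substrate; it is not the NeuroMaC implementation, and all names and values are arbitrary.

```python
# Toy sketch: multiple neurons grow in a shared occupancy grid and each growth step is biased
# away from space already claimed by any neuron (their "context"). Not the NeuroMaC code.
import numpy as np

rng = np.random.default_rng(3)
GRID_SIZE = 100
occupancy = np.zeros((GRID_SIZE, GRID_SIZE))        # shared substrate: space claimed so far

def repulsion(pos, radius=4):
    """Vector pointing away from nearby occupied cells of the shared substrate."""
    x, y = int(pos[0]), int(pos[1])
    vec = np.zeros(2)
    for dx in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            i, j = x + dx, y + dy
            if 0 <= i < GRID_SIZE and 0 <= j < GRID_SIZE and occupancy[i, j] > 0:
                d = np.hypot(dx, dy) + 1e-9
                vec -= occupancy[i, j] * np.array([dx, dy]) / d**2
    return vec

# Three fronts advance in lockstep, so each neuron develops in space the others are claiming
fronts = [
    {"pos": np.array([20.0, 50.0]), "dir": np.array([1.0, 0.0])},
    {"pos": np.array([50.0, 20.0]), "dir": np.array([0.0, 1.0])},
    {"pos": np.array([80.0, 50.0]), "dir": np.array([-1.0, 0.0])},
]
for _ in range(40):
    for f in fronts:
        bias = f["dir"] + 2.0 * repulsion(f["pos"]) + 0.3 * rng.normal(size=2)
        f["dir"] = bias / (np.linalg.norm(bias) + 1e-9)
        f["pos"] = np.clip(f["pos"] + f["dir"], 0, GRID_SIZE - 1)
        occupancy[tuple(f["pos"].astype(int))] += 1   # claim space in the shared substrate
print("occupied grid cells:", int((occupancy > 0).sum()))
```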


Neuroinformatics | 2008

Non-parametric Algorithmic Generation of Neuronal Morphologies

Benjamin Torben-Nielsen; Stijn Vanderlooy; Eric O. Postma

Generation algorithms allow for the generation of Virtual Neurons (VNs) from a small set of morphological properties. The set describes the morphological properties of real neurons in terms of statistical descriptors such as the number of branches and segment lengths (among others). The majority of reconstruction algorithms use the observed properties to estimate the parameters of a priori fixed probability distributions in order to construct statistical descriptors that fit well with the observed data. In this article, we present a non-parametric generation algorithm based on kernel density estimators (KDEs). The new algorithm, called KDE-Neuron, has three advantages over parametric reconstruction algorithms: (1) it requires no a priori specification of the distributions underlying the real data, (2) peculiarities in the biological data are reflected in the VNs, and (3) it can reconstruct different cell types. We generated motor neurons and granule cells and statistically validated the obtained results. Moreover, we assessed the quality of the prototype data set and observed that our generated neurons are as good as the prototype data in terms of the statistical descriptors used. The opportunities and limitations of data-driven algorithmic reconstruction of neurons are discussed.
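
The non-parametric core of this approach can be illustrated with a kernel density estimate fitted to measured descriptors, from which new descriptor vectors are sampled without assuming any distribution family. The data below are fabricated for the example; this is not the KDE-Neuron code itself.

```python
# Conceptual sketch: fit a KDE to morphological descriptors and sample "virtual" descriptor
# vectors from it, with no parametric assumption. Toy data only, not KDE-Neuron.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)

# Pretend measurements from reconstructed neurons: (number of branches, mean segment length in um)
n_branches = rng.poisson(12, size=200) + rng.poisson(5, size=200)   # deliberately non-Gaussian
seg_length = np.exp(rng.normal(np.log(30), 0.4, size=200))          # roughly log-normal
data = np.vstack([n_branches, seg_length])                          # shape (n_descriptors, n_cells)

kde = gaussian_kde(data)        # joint, non-parametric density over the descriptors
virtual = kde.resample(10)      # descriptor vectors for 10 virtual neurons

for nb, sl in virtual.T:
    print(f"virtual neuron: ~{nb:.0f} branches, mean segment length {sl:.1f} um")
```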


Frontiers in Systems Neuroscience | 2013

Oscillatory activity, phase differences, and phase resetting in the inferior olivary nucleus

Yaara Lefler; Benjamin Torben-Nielsen; Yosef Yarom

The generation of temporal patterns is one of the most fascinating functions of the brain. Unlike the response to external stimuli, temporal patterns are generated within the system and recalled for a specific use. To generate temporal patterns, one needs a timing machine, a “master clock” that determines the temporal framework within which temporal patterns can be generated and implemented. Here we present the concept that in this putative “master clock” phase and frequency interact to generate temporal patterns. We define the requirements for a neuronal “master clock” to be both reliable and versatile. We introduce this concept within the inferior olive nucleus, which is regarded by at least some scientists as the source of timing for cerebellar function. We review the basic properties of the subthreshold oscillations recorded from olivary neurons, analyze the phase relationships between neurons, and demonstrate that the phase and onset of the oscillation are tightly controlled by synaptic input. These properties endow the olivary nucleus with the ability to act as a “master clock.”


PLOS Computational Biology | 2014

Spatially distributed dendritic resonance selectively filters synaptic input

Jonathan Laudanski; Benjamin Torben-Nielsen; Idan Segev; Shihab A. Shamma

An important task performed by a neuron is the selection of relevant inputs from among the thousands of synapses impinging on its dendritic tree. Synaptic plasticity enables this by strengthening a subset of synapses that are, presumably, functionally relevant to the neuron. A different selection mechanism exploits the resonance of the dendritic membranes to preferentially filter synaptic inputs based on their temporal rates. A widely held view is that a neuron has one resonant frequency and thus passes only one rate. Here we demonstrate, through mathematical analysis and numerical simulations, that dendritic resonance is inevitably a spatially distributed property: the resonance frequency varies along the dendrites, endowing neurons with a powerful spatiotemporal selection mechanism that is sensitive to both the dendritic location and the temporal structure of the incoming synaptic inputs.
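
A toy way to see location-dependent resonance is the classic RLC analogue of a resonant membrane patch: a leak conductance and capacitance in parallel with an "inductive" branch standing in for a slow restorative current. In the sketch below, local parameters differ between two hypothetical dendritic locations and the impedance peak shifts accordingly; values are arbitrary and this is not the cable model analyzed in the paper.

```python
# Toy analogue, not the paper's model: a resonant membrane patch as leak + capacitance in
# parallel with an "inductive" branch. The impedance peak (resonance frequency) moves when the
# local parameters change, as they would along a dendrite. Parameter values are arbitrary.
import numpy as np

def impedance(f, g_leak, C, r_ind, L):
    """Input impedance magnitude of the RLC-like membrane analogue at frequency f (Hz)."""
    w = 2 * np.pi * f
    Y = g_leak + 1j * w * C + 1.0 / (r_ind + 1j * w * L)   # total admittance of the three branches
    return np.abs(1.0 / Y)

freqs = np.linspace(0.5, 50, 500)
# Two hypothetical dendritic locations with different local properties (SI units)
locations = {"proximal": dict(g_leak=5e-9, C=100e-12, r_ind=2e8, L=1e7),
             "distal":   dict(g_leak=2e-9, C=40e-12,  r_ind=4e8, L=4e7)}
for name, p in locations.items():
    z = impedance(freqs, **p)
    print(f"{name}: resonance near {freqs[np.argmax(z)]:.1f} Hz")
```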

Collaboration


Dive into Benjamin Torben-Nielsen's collaborations.

Top Co-Authors

Klaus M. Stiefel
Okinawa Institute of Science and Technology

Erik De Schutter
Okinawa Institute of Science and Technology

Idan Segev
Hebrew University of Jerusalem

Yosef Yarom
Hebrew University of Jerusalem

Marylka Uusisaari
Okinawa Institute of Science and Technology

Yaara Lefler
Hebrew University of Jerusalem

Marc-Oliver Gewaltig
École Polytechnique Fédérale de Lausanne