Publications


Featured research published by Jakob Jordan.


Frontiers in Neuroinformatics | 2018

Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

Jakob Jordan; Jun Igarashi; Markus Diesmann; Tammo Ippen; Moritz Helias; Mitsuhisa Sato; Itaru Kitayama; Susanne Kunkel

State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10% of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.


Frontiers in Neuroinformatics | 2018

Corrigendum: Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

Jakob Jordan; Tammo Ippen; Moritz Helias; Itaru Kitayama; Mitsuhisa Sato; Jun Igarashi; Markus Diesmann; Susanne Kunkel

This corrects the article with DOI 10.3389/fninf.2018.00002.


BMC Neuroscience | 2015

Deterministic neural networks as sources of uncorrelated noise for probabilistic computations

Jakob Jordan; Tom Tetzlaff; Mihai A. Petrovici; Oliver Breitwieser; Ilja Bytschok; Johannes Bill; Johannes Schemmel; K. Meier; Markus Diesmann

Neural-network models of brain function often rely on the presence of noise [1-4]. To date, the interplay of microscopic noise sources and network function is only poorly understood. In computer simulations and in neuromorphic hardware [5-7], the number of noise sources (random-number generators) is limited. As a consequence, neurons in large functional network models have to share noise sources and are therefore correlated. In general, it is unclear how shared-noise correlations affect the performance of functional network models. Further, there is so far no solution to the problem of how a limited number of noise sources can supply a large number of functional units with uncorrelated noise. Here, we investigate the performance of neural Boltzmann machines [2-4]. We show that correlations in the background activity are detrimental to the sampling performance and that the deviations from the target distribution scale inversely with the number of noise sources. Further, we show that this problem can be overcome by replacing the finite ensemble of independent noise sources with a recurrent neural network containing the same number of units. As shown recently, inhibitory feedback, abundant in biological neural networks, serves as a powerful decorrelation mechanism [8,9]: shared-noise correlations are actively suppressed by the network dynamics. By exploiting this effect, the network performance is improved significantly. Hence, recurrent neural networks can serve as natural finite-size noise sources for functional neural networks, both in biological and in synthetic neuromorphic substrates. Finally, we investigate the impact of the sampling network's parameters on its ability to faithfully represent a given well-defined distribution. We show that sampling networks with sufficiently strong negative feedback can intrinsically suppress correlations in the background activity, and thereby improve their performance substantially.


Archive | 2017

NEST 2.12.0

Susanne Kunkel; Rajalekshmi Deepu; Hans E. Plesser; Bruno Golosio; Mikkel Elle Lepperød; Jochen Martin Eppler; Sepehr Mahmoudian; Jan Hahne; Dimitri Plotnikov; Claudia Bachmann; Alexander Peyser; Tanguy Fardet; Till Schumann; Jakob Jordan; Ankur Sinha; Oliver Breitwieser; Abigail Morrison; Tammo Ippen; Hendrik Rothe; Steffen Graber; Hesam Setareh; Jesús Garrido; Dennis Terhorst; Alexey Shusharin; Hannah Bos; Arjun Rao; Alex Seeholzer; Mikael Djurfeldt; Maximilian Schmidt; Stine Brekke Vennemo


Physical Review X | 2016

Effect of Heterogeneity on Decorrelation Mechanisms in Spiking Neural Networks: A Neuromorphic-Hardware Study

Thomas Pfeil; Jakob Jordan; Andreas Grübl; Markus Diesmann; Johannes Schemmel; Tom Tetzlaff; K. Meier


Archive | 2015

NEST 2.8.0

Jochen Martin Eppler; Rajalekshmi Deepu; Claudia Bachmann; Tiziano Zito; Alexander Peyser; Jakob Jordan; Robin Pauli; Luis Riquelme; Sacha J. van Albada; Abigail Morrison; Tammo Ippen; Moritz Helias; Hesam Setareh; Marc-Oliver Gewaltig; Hannah Bos; Frank Michler; Ali Shirvani; Renato Duarte; Maximilian Schmidt; Espen Hagen; Jannis Schuecker; Wolfram Schenck; Moritz Deger; Hans E. Plesser; Susanne Kunkel; Johanna Senk


Archive | 2017

Stochastic neural computation without noise

Jakob Jordan; Mihai A. Petrovici; Oliver Breitwieser; Johannes Schemmel; K. Meier; Markus Diesmann; Tom Tetzlaff


Archive | 2017

Deterministic networks for probabilistic computing

Jakob Jordan; Mihai A. Petrovici; Oliver Breitwieser; Johannes Schemmel; K. Meier; Markus Diesmann; Tom Tetzlaff


Archive | 2017

Stochastic neural computing without random numbers

Jakob Jordan; Oliver Breitwieser; Mihai A. Petrovici; Markus Diesmann; Johannes Schemmel; Tom Tetzlaff; Karlheinz Meier


Archive | 2017

Closing the loop between neural network simulators and the OpenAI Gym

Jakob Jordan; Philipp Weidel; Abigail Morrison

Collaboration


Dive into Jakob Jordan's collaborations.

Top Co-Authors

Tammo Ippen
Norwegian University of Life Sciences

Tom Tetzlaff
Norwegian University of Life Sciences

K. Meier
Heidelberg University