Publication


Featured research published by Florian Meier.


International Journal of Neural Systems | 2017

Long Synfire Chains Emerge by Spike-Timing Dependent Plasticity Modulated by Population Activity

Felix Weissenberger; Florian Meier; Johannes Lengler; Hafsteinn Einarsson; Angelika Steger

Sequences of precisely timed neuronal activity are observed in many brain areas in various species. Synfire chains are a well-established model that can explain such sequences. However, it is unknown under which conditions synfire chains can develop in initially unstructured networks by self-organization. This work shows that with spike-timing dependent plasticity (STDP), modulated by global population activity, long synfire chains emerge in sparse random networks. The learning rule encourages neurons to participate multiple times in the chain or in multiple chains. Such reuse of neurons has been experimentally observed and is necessary for high capacity. Sparse networks prevent the chains from being short and cyclic and show that the formation of specific synapses is not essential for chain formation. Analysis of the learning rule in a simple network of binary threshold neurons reveals the asymptotically optimal length of the emerging chains. The theoretical results generalize to simulated networks of conductance-based leaky integrate-and-fire (LIF) neurons. As an application of the emergent chains, we propose a one-shot memory for sequences of precisely timed neuronal activity.
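
To make the mechanism concrete, the following Python sketch implements pair-based STDP whose update magnitude is scaled by a global population-activity signal. The network size, trace time constants, and the Gaussian modulation factor are illustrative assumptions, not the published model.

import numpy as np

# Minimal sketch of pair-based STDP whose update magnitude is modulated by
# global population activity. The network, the traces, and the Gaussian
# modulation factor are illustrative assumptions, not the published model.

rng = np.random.default_rng(0)
n, p_conn = 200, 0.05
conn = rng.random((n, n)) < p_conn            # sparse random connectivity
w = conn * 0.1                                # weights w[post, pre]
tau_pre, tau_post = 20.0, 20.0                # trace time constants (ms)
a_plus, a_minus = 0.01, 0.012                 # LTP / LTD amplitudes
target_activity = 0.05                        # desired fraction of spiking neurons
pre_trace = np.zeros(n)
post_trace = np.zeros(n)

def plasticity_step(spikes, dt=1.0):
    """Update traces and weights for one time step; spikes is a 0/1 vector."""
    global pre_trace, post_trace, w
    pre_trace += -pre_trace * dt / tau_pre + spikes
    post_trace += -post_trace * dt / tau_post + spikes
    # Global modulation: learn most when population activity is near target.
    mod = np.exp(-((spikes.mean() - target_activity) / target_activity) ** 2)
    # LTP when a postsynaptic spike meets an elevated presynaptic trace,
    # LTD when a presynaptic spike meets an elevated postsynaptic trace.
    dw = a_plus * np.outer(spikes, pre_trace) - a_minus * np.outer(post_trace, spikes)
    w += mod * dw * conn                      # only existing synapses change
    np.clip(w, 0.0, 0.5, out=w)

for _ in range(1000):
    plasticity_step((rng.random(n) < target_activity).astype(float))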


Genetic and Evolutionary Computation Conference | 2018

The linear hidden subset problem for the (1 + 1) EA with scheduled and adaptive mutation rates

Hafsteinn Einarsson; Johannes Lengler; Marcelo Matheus Gauy; Florian Meier; Asier Mujika; Angelika Steger; Felix Weissenberger

We study unbiased (1 + 1) evolutionary algorithms on linear functions with an unknown number n of bits with non-zero weight. Static algorithms achieve an optimal runtime of O(n (ln n)^(2+ε)); however, it remained unclear whether more dynamic parameter policies could yield better runtime guarantees. We consider two setups: one where the mutation rate follows a fixed schedule, and one where it may be adapted depending on the history of the run. For the first setup, we give a schedule that achieves a runtime of (1 ± o(1)) β n ln n, where β ≈ 3.552, which is an asymptotic improvement over the runtime of the static setup. Moreover, we show that no schedule admits a better runtime guarantee and that the optimal schedule is essentially unique. For the second setup, we show that the runtime can be further improved to (1 ± o(1)) e n ln n, which matches the performance of algorithms that know n in advance. Finally, we study the related model of initial segment uncertainty with static position-dependent mutation rates, and derive asymptotically optimal lower bounds. This answers a question by Doerr, Doerr, and Kötzing.
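
As a concrete illustration of the setting, the Python sketch below runs a (1 + 1) EA on a hidden-subset linear function with a mutation-rate schedule. The cyclic power-of-two schedule is an illustrative choice for unknown n, not the asymptotically optimal schedule derived in the paper.

import math
import random

# Minimal sketch of a (1 + 1) EA on a "hidden subset" linear function: only n
# of the N bits carry non-zero weight, and the algorithm does not know n.

def hidden_linear(x, relevant):
    return sum(x[i] for i in relevant)

def one_plus_one_ea(N=200, n=20, max_iters=50_000, seed=0):
    rng = random.Random(seed)
    relevant = rng.sample(range(N), n)            # hidden bits with weight 1
    x = [rng.randint(0, 1) for _ in range(N)]
    fx = hidden_linear(x, relevant)
    k = int(math.log2(N))
    for t in range(1, max_iters + 1):
        rate = 2.0 ** -(1 + t % k)                # cycle over rates 1/2 ... ~2/N
        y = [b ^ (rng.random() < rate) for b in x]
        fy = hidden_linear(y, relevant)
        if fy >= fx:                              # accept if not worse
            x, fx = y, fy
        if fx == n:                               # all hidden bits are 1
            return t
    return None

print("iterations until optimum:", one_plus_one_ea())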


bioRxiv | 2018

A hippocampal model for behavioral time acquisition and fast bidirectional replay of spatio-temporal memory sequences

Marcelo Matheus Gauy; Johannes Lengler; Hafsteinn Einarsson; Florian Meier; Felix Weissenberger; Mehmet Fatih Yanik; Angelika Steger

The hippocampus is known to play a crucial role in the formation of long-term memory. For this, fast replays of previously experienced activities during sleep or after reward experiences are believed to be crucial. However, how such replays are generated is still completely unclear. In this paper we propose a possible mechanism: we present a model that can store experienced trajectories on a behavioral timescale after a single run and can subsequently replay these trajectories bidirectionally, thereby omitting specifics of the previous behavior, such as speed, while allowing repetitions of events, even with different subsequent events. Our solution builds on two well-known concepts, one-shot learning and synfire chains, and enhances them with additional mechanisms based on global inhibition and disinhibition. For replays, our approach relies on dendritic spikes and cholinergic modulation, as supported by experimental data. We also hypothesize a functional role of disinhibition as a pacemaker during behavioral time.


arXiv: Discrete Mathematics | 2018

Even Flying Cops Should Think Ahead

Anders Martinsson; Florian Meier; Patrick Schnider; Angelika Steger

We study the entanglement game, which is a version of cops and robbers, on sparse graphs. While the minimum degree of a graph G is a lower bound for the number of cops needed to catch a robber in G, we show that the required number of cops can be much larger, even for graphs with small maximum degree. In particular, we show that there are 3-regular graphs where a linear number of cops are needed.


Scientific Reports | 2018

Voltage dependence of synaptic plasticity is essential for rate based learning with short stimuli

Felix Weissenberger; Marcelo Matheus Gauy; Johannes Lengler; Florian Meier; Angelika Steger

In computational neuroscience, synaptic plasticity rules are often formulated in terms of firing rates. The predominant description of in vivo neuronal activity, however, is the instantaneous rate (or spiking probability). In this article we resolve this discrepancy by showing that fluctuations of the membrane potential carry enough information to permit a precise estimate of the instantaneous rate in balanced networks. As a consequence, we find that rate-based plasticity rules are not restricted to neuronal activity that is stable for hundreds of milliseconds to seconds, but can be carried over to situations in which it changes every few milliseconds. We illustrate this by showing that a voltage-dependent realization of the classical BCM rule achieves input selectivity even if each stimulus lasts only a few milliseconds.
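
A minimal Python sketch of the idea: a BCM-style rate-based update in which the postsynaptic rate is read off from the membrane potential rather than from spike counts. The linear voltage-to-rate mapping and all constants are illustrative assumptions, not the published voltage-dependent rule.

import numpy as np

# Minimal sketch of a BCM-style rate-based update driven by a rate estimate
# derived from the membrane potential. All constants are illustrative.

rng = np.random.default_rng(1)
n_in = 100
w = rng.random(n_in) * 0.1                    # afferent weights
theta = 1.0                                   # sliding BCM threshold
eta, tau_theta = 1e-3, 100.0

def estimated_rate(u, u_rest=-65.0, gain=0.5):
    """Map the membrane potential u (mV) to an instantaneous-rate estimate."""
    return max(0.0, gain * (u - u_rest))

def bcm_step(x, u):
    """One update; x holds presynaptic rates, u is the membrane potential."""
    global w, theta
    y = estimated_rate(u)
    w += eta * x * y * (y - theta)            # LTP above theta, LTD below
    np.clip(w, 0.0, 1.0, out=w)
    theta += (y ** 2 - theta) / tau_theta     # threshold tracks <y^2>

for _ in range(1000):
    x = rng.random(n_in)                      # presynaptic rates, brief stimulus
    u = -65.0 + w @ x                         # crude membrane-potential proxy (mV)
    bcm_step(x, u)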


Hippocampus | 2018

On the origin of lognormal network synchrony in CA1

Felix Weissenberger; Hafsteinn Einarsson; Marcelo Matheus Gauy; Florian Meier; Asier Mujika; Johannes Lengler; Angelika Steger

The sharp wave ripple complex in rodent hippocampus is associated with a network burst in CA3 (NB) that triggers a synchronous event in the CA1 population (SE). The number of CA1 pyramidal cells participating in an SE has been observed to follow a lognormal distribution. However, the origin of this skewed and heavy-tailed distribution of population synchrony in CA1 remains unknown. Because the size of SEs is likely to originate from the size of the NBs and the underlying neural circuitry, we model the CA3-CA1 circuit to study the underlying mechanisms and their functional implications. We show analytically that if the size of a NB in CA3 is distributed according to a normal distribution, then the size of the resulting SE in CA1 follows a lognormal distribution. Our model predicts the distribution of the NB size in CA3, which remains to be tested experimentally. Moreover, we show that a putative lognormal NB size distribution leads to an extremely heavy-tailed SE size distribution in CA1, contradicting experimental evidence. In conclusion, our model provides general insight into the origin of lognormally distributed network synchrony as a consequence of synchronous synaptic transmission of normally distributed input events.
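
The normal-to-lognormal step can be stated compactly under the simplifying assumption that the CA1 event size grows exponentially with the CA3 burst size (the exponential transfer is an illustration, not the model's exact circuit mechanics). If $N \sim \mathcal{N}(\mu, \sigma^2)$ and $S = c\,e^{\lambda N}$, then

$$\ln S = \ln c + \lambda N \sim \mathcal{N}(\ln c + \lambda\mu,\ \lambda^2\sigma^2),$$

so $S$ is lognormal. Conversely, if $N$ were itself lognormal, $\ln S$ would inherit its heavy right tail and $S$ would be far heavier-tailed than lognormal, consistent with the contradiction noted above.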


Neural Computation | 2017

Multiassociative Memory: Recurrent Synapses Increase Storage Capacity

Marcelo Matheus Gauy; Florian Meier; Angelika Steger

The connection density of nearby neurons in the cortex has been observed to be around 0.1, whereas the longer-range connections are present with much sparser density (Kalisman, Silberberg, & Markram, 2005). We propose a memory association model that qualitatively explains these empirical observations. The model we consider is a multiassociative, sparse, Willshaw-like model consisting of binary threshold neurons and binary synapses. It uses recurrent synapses for iterative retrieval of stored memories. We quantify the usefulness of recurrent synapses by simulating the model for small network sizes and by doing a precise mathematical analysis for large network sizes. Given the network parameters, we can determine the precise values of recurrent and afferent synapse densities that optimize the storage capacity of the network. If the network size is like that of a cortical column, then the predicted optimal recurrent density lies in a range that is compatible with biological measurements. Furthermore, we show that our model is able to surpass the standard Willshaw model in the multiassociative case if the information capacity is normalized per strong synapse or per bits required to store the model, as considered in Knoblauch, Palm, and Sommer (2010).
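
A minimal Python sketch of a Willshaw-style binary associative memory with an afferent store and a recurrent clean-up step. Pattern sizes, densities, and thresholds are illustrative assumptions, not the optimized values from the paper.

import numpy as np

# Minimal sketch of a Willshaw-style binary associative memory with binary
# synapses: an afferent matrix maps cue patterns to target patterns, and a
# recurrent matrix among the target neurons cleans up the recall iteratively.

rng = np.random.default_rng(2)
n_in, n_out, k_in, k_out, n_patterns = 500, 500, 20, 20, 50

def sparse_pattern(n, k):
    p = np.zeros(n, dtype=int)
    p[rng.choice(n, k, replace=False)] = 1
    return p

cues = [sparse_pattern(n_in, k_in) for _ in range(n_patterns)]
targets = [sparse_pattern(n_out, k_out) for _ in range(n_patterns)]

W_aff = np.zeros((n_out, n_in), dtype=int)    # afferent binary synapses
W_rec = np.zeros((n_out, n_out), dtype=int)   # recurrent binary synapses
for c, t in zip(cues, targets):
    W_aff |= np.outer(t, c)                   # clipped Hebbian (Willshaw) storage
    W_rec |= np.outer(t, t)

def recall(cue, iters=3):
    y = (W_aff @ cue >= cue.sum()).astype(int)            # afferent retrieval
    for _ in range(iters):                                 # recurrent clean-up
        drive = W_aff @ cue + W_rec @ y
        y = (drive >= cue.sum() + y.sum()).astype(int)
    return y

hits = sum(np.array_equal(recall(c), t) for c, t in zip(cues, targets))
print(f"correctly recalled {hits}/{n_patterns} stored associations")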


Neural Information Processing Systems | 2017

Fast-Slow Recurrent Neural Networks

Asier Mujika; Florian Meier; Angelika Steger


Theoretical Computer Science | 2018

Asymptotically optimal amplifiers for the Moran process

Leslie Ann Goldberg; John Lapinskas; Johannes Lengler; Florian Meier; Konstantinos Panagiotou; Pascal Pfister


Neural Information Processing Systems | 2018

Approximating Real-Time Recurrent Learning with Random Kronecker Factors

Asier Mujika; Florian Meier; Angelika Steger
