Publications

Featured research published by Karim El-Laithy.


International Conference on Artificial Neural Networks | 2010

Simulating biological-inspired spiking neural networks with OpenCL

Jörn Hoffmann; Karim El-Laithy; Frank Güttler; Martin Bogdan

The algorithms used to simulate biologically-inspired spiking neural networks often rely on computationally complex functions and must model a large number of neurons, and an even larger number of synapses, in parallel. Using all the computing resources of a standard desktop PC is an opportunity to shorten simulation time and extend the number of simulated neurons and their interconnections. OpenCL offers an open platform for heterogeneous computing, employing CPUs, GPUs, DSPs or FPGAs in a uniform way. This paper introduces a handy simulation framework sufficient to accelerate different kinds of neural networks on off-the-shelf hardware. To illustrate this, several large networks combining a complex synaptic model with a leaky Integrate-and-Fire neuron model are implemented both as standard Matlab code and in OpenCL. Compared to the Matlab model, OpenCL reaches a speedup of ~83 on a quad-core processor and of ~1500 on a GPU.
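The speedups above come from updating all neurons in parallel. As a minimal sketch (not the paper's OpenCL code), a vectorised Euler step for a leaky Integrate-and-Fire population might look as follows; the parameter names and values (`tau_m`, `v_thresh`, `v_reset`) are generic illustrative choices:

```python
import numpy as np

def lif_step(v, i_syn, dt=1e-3, tau_m=20e-3, v_thresh=1.0, v_reset=0.0):
    """Advance all membrane potentials one Euler step; return (v, spikes)."""
    v = v + dt / tau_m * (-v + i_syn)   # leaky integration toward the input
    spikes = v >= v_thresh              # boolean spike mask for this step
    v = np.where(spikes, v_reset, v)    # reset the neurons that fired
    return v, spikes

v = np.zeros(1000)                      # 1000 neurons starting at rest
v, spikes = lif_step(v, np.full(1000, 2.0))
```

Because every line operates on whole arrays, the same update maps naturally onto a GPU kernel with one work item per neuron, which is the kind of structure OpenCL exploits.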


International Conference on Artificial Neural Networks | 2009

Synchrony State Generation in Artificial Neural Networks with Stochastic Synapses

Karim El-Laithy; Martin Bogdan

In this study, the generation of temporal synchrony within an artificial neural network is examined using a stochastic synaptic model. A network is introduced and driven by Poisson-distributed trains of spikes, with white Gaussian noise added to the internal synaptic activity to represent background activity (neuronal noise). A Hebbian-based learning rule for updating the synaptic parameters is introduced. Only arbitrarily selected synapses are allowed to learn, i.e. to change parameter values. The average of the cross-correlation coefficients between smoothed versions of the responses of all neurons is taken as an indicator of synchrony. Results show that a network using this framework is able to reach different states of synchrony via learning. This supports the plausibility of stochastic models of the neural process, and is consistent with arguments that synchrony is part of the memory-recall process within the accepted framework of biological neural systems.
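The synchrony indicator described above can be sketched directly: smooth each spike train, then average the pairwise correlation coefficients. The Gaussian kernel and its width are illustrative assumptions, not the paper's exact smoothing:

```python
import numpy as np

def synchrony_index(spike_trains, kernel_width=5):
    """spike_trains: (n_neurons, n_steps) binary array -> scalar indicator."""
    t = np.arange(-3 * kernel_width, 3 * kernel_width + 1)
    kernel = np.exp(-t**2 / (2 * kernel_width**2))     # Gaussian smoother
    smoothed = np.array([np.convolve(s, kernel, mode="same")
                         for s in spike_trains])
    corr = np.corrcoef(smoothed)                       # pairwise coefficients
    iu = np.triu_indices_from(corr, k=1)               # off-diagonal pairs
    return corr[iu].mean()

# Identical trains should give an indicator of ~1 (perfect synchrony)
train = np.zeros(200)
train[::20] = 1
si = synchrony_index(np.stack([train, train, train]))
```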


Computational Intelligence and Neuroscience | 2011

A reinforcement learning framework for spiking networks with dynamic synapses

Karim El-Laithy; Martin Bogdan

An integration of Hebbian-based and reinforcement learning (RL) rules is presented for dynamic synapses. The proposed framework lets the Hebbian rule update the hidden synaptic model parameters regulating the synaptic response, rather than the synaptic weights. This is done using both the value and the sign of the temporal difference in the reward signal after each trial. Applying this framework, a spiking network with spike-timing-dependent synapses is tested on learning the exclusive-OR (XOR) computation on a temporally coded basis. Reward values are calculated from the distance between the output spike train of the network and a reference target. Results show that the network is able to capture the required dynamics and that the proposed framework indeed yields an integrated version of Hebbian and RL learning. The framework is tractable, comparatively inexpensive computationally, applicable to a wide class of synaptic models, and not restricted to the neural representation used here. This generality, along with the reported results, supports adopting the introduced approach to benefit from biologically plausible synaptic models in a wide range of intuitive signal processing.
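The update scheme described above can be sketched in a few lines: a Hebbian coincidence term scaled by the temporal difference in reward, applied to a synaptic parameter, with the reward derived from a spike-train distance. All names (`eta`, the mean-absolute-difference distance) are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def spike_distance_reward(output, target):
    """Reward grows as the output spike train approaches the target train."""
    return -np.abs(output - target).mean()

def reward_modulated_update(param, pre, post, reward, prev_reward, eta=0.01):
    """Hebbian coincidence term scaled by the temporal difference in reward."""
    td = reward - prev_reward           # value *and* sign of the TD are used
    return param + eta * td * pre * post

target = np.array([0, 1, 0, 1, 1])      # reference spike pattern
out = np.array([0, 1, 1, 1, 0])         # network output this trial
r = spike_distance_reward(out, target)
p = reward_modulated_update(1.0, 1.0, 1.0, r, prev_reward=-0.6)
```

Note that the update moves the hidden parameter, not a weight: if the reward improved over the previous trial (positive TD), coincident pre/post activity is reinforced; if it worsened, the same coincidence is penalised.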


Journal of Neuroscience Methods | 2012

Digital detection and analysis of branching and cell contacts in neural cell cultures

Karim El-Laithy; Melanie Knorr; Josef A. Käs; Martin Bogdan

Changes in human and animal behaviour, and in the underlying neural functions, are characterized by structural alterations in the brain circuitry. These changes comprise the formation of new synapses and the elimination of existing ones, alongside the modulation of connection properties in others. The mechanisms of neuronal branching and cell contacting regulate and prepare the processes of synaptic formation. In this study, we present a set of methods to detect, describe and analyse the dynamics of cell contacting in cell cultures in vitro, including the dynamics of branching and of seeking synaptic partners. The proposed technique formally distinguishes between actually formed synapses and potential synaptic sites, i.e. locations where cell contacts are likely. The study investigates the dynamic behaviour of these potential synaptic sites during contact seeking. The introduced tools use morphological image-processing algorithms to automatically detect the sites of interest. Results indicate that the tools reliably describe experimentally observed branching and contact-seeking dynamics. Being straightforward to implement and analyse, our framework offers a solid method for studying the neural preparation phases of synaptic formation via cell contacting in random networks using standard phase-contrast microscopy.
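One building block of such a morphological pipeline is locating branch points on a binarised cell skeleton by counting 8-connected neighbours. This is a generic sketch of that step only; the segmentation, skeletonisation and the paper's actual criteria are outside it:

```python
import numpy as np

def branch_points(skeleton):
    """skeleton: 2-D boolean array; returns a mask of branch-point pixels."""
    s = skeleton.astype(int)
    p = np.pad(s, 1)                    # zero border avoids wrap-around
    h, w = s.shape
    # sum of the 8 neighbours of every pixel via shifted views of the pad
    n = sum(p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    return (s == 1) & (n >= 3)          # 3+ neighbours => branching pixel

# An X-shaped skeleton: only the crossing pixel has 3 or more neighbours
skel = np.zeros((5, 5), dtype=bool)
skel[np.arange(5), np.arange(5)] = True
skel[np.arange(5), 4 - np.arange(5)] = True
mask = branch_points(skel)
```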


International Conference on Artificial Neural Networks | 2011

On the capacity of transient internal states in liquid-state machines

Karim El-Laithy; Martin Bogdan

Liquid-state machines (LSMs) are a class of neural networks able to support multitasking by implicitly representing input information across the entire network. Exactly how the input information is represented and how the computations are accomplished remains, however, unresolved. To tackle this issue, we demonstrate how an LSM can process different input information as a varying set of transiently stable states of collective activity. This is achieved by adopting a relatively complex dynamic synaptic model. Some light is shed on the relevance of the developed framework for mimicking complex cortical functions, e.g. content-addressable memory.
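For orientation, the generic LSM skeleton (a fixed random recurrent "liquid" whose internal states carry the input information, with only a simple readout trained per task) can be sketched as below. This is the standard echo-state/LSM idea, not the paper's specific dynamic synaptic model:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100
W = rng.normal(0, 1 / np.sqrt(N), (N, N))   # fixed random recurrent weights
w_in = rng.normal(0, 1, N)                   # fixed random input weights

def liquid_states(u, leak=0.3):
    """Run an input sequence through the liquid; return the state per step."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        # leaky update: the state is a fading trace of the input history
        x = (1 - leak) * x + leak * np.tanh(W @ x + w_in * u_t)
        states.append(x.copy())
    return np.array(states)

X = liquid_states(np.sin(np.linspace(0, 6, 50)))   # 50 steps, 100 units
```

Different readouts trained on the same state matrix `X` give the multitasking property the abstract refers to; the transiently stable states studied in the paper arise from its richer synaptic dynamics.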


International Conference on Artificial Neural Networks | 2011

A hypothetical free synaptic energy function and related states of synchrony

Karim El-Laithy; Martin Bogdan

A simple hypothetical energy function is proposed for a dynamic synaptic model. The approach is based on theoretical thermodynamic principles conceptually similar to Hopfield's. We show that under this approach a synapse exhibits stable operating points in terms of its excitatory postsynaptic potential (EPSP) as a function of its synaptic strength. We postulate that synapses in a network operating at these stable points can drive the network into an internal state of synchronous firing. The presented analysis relates to the widely investigated temporally coherent activities (cell assemblies) over a certain range of time scales (binding-by-synchrony). The results illustrate that a dynamic synaptic model has more than one stable operating point with respect to postsynaptic energy transfer, suggesting a novel explanation of the observed synchronous activities within networks in terms of synaptic (coupling) functionality.
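For reference, the classical Hopfield energy the abstract's approach is compared against has the standard form, for binary states \(s_i\), symmetric couplings \(w_{ij}\) and thresholds \(\theta_i\) (the paper's synaptic energy function itself is not reproduced here):

```latex
E = -\frac{1}{2} \sum_{i \neq j} w_{ij}\, s_i s_j + \sum_i \theta_i s_i
```

Stable network states are local minima of \(E\); the paper's analysis transfers this fixed-point picture from network states to the operating points of a single dynamic synapse.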


International Conference on Artificial Neural Networks | 2010

A Hebbian-based reinforcement learning framework for spike-timing-dependent synapses

Karim El-Laithy; Martin Bogdan

In this study a combination of Hebbian-based and reinforcement learning rules is presented. The concept permits the Hebbian rule to update the values of the synaptic parameters using both the value and the sign supplied by a reward signal at any time instant, where the reward is calculated as the distance between the output of the network and a reference signal. The network is a spiking neural network with spike-timing-dependent synapses, tested on learning the XOR computation on a temporally coded basis. Results show that the network captures the required dynamics and that the proposed framework indeed yields an integrated version of Hebbian and reinforcement learning. This supports adopting the introduced approach for intuitive signal processing and computations.


International Conference on Artificial Neural Networks | 2012

Cyfield-RISP: generating dynamic instruction set processors for reconfigurable hardware using OpenCL

Jörn Hoffmann; Frank Güttler; Karim El-Laithy; Martin Bogdan

In this work a novel approach to automatic hardware generation is introduced that allows accelerated simulation of artificial neural networks (ANNs) on field-programmable gate arrays (FPGAs). A compiler architecture has been designed that primarily aims at reducing the development effort for non-hardware developers by automatically generating suitably adjusted hardware processors. Derived from high-level OpenCL source code, these processors are able to spatially map ANNs in a massively parallel fashion.


Soft Computing | 2011

Synchrony state generation: an approach using stochastic synapses

Karim El-Laithy; Martin Bogdan


European Symposium on Artificial Neural Networks | 2010

Predicting spike-timing of a thalamic neuron using a stochastic synaptic model

Karim El-Laithy; Martin Bogdan
