Atilim Gunes Baydin
Maynooth University
Publications
Featured research published by Atilim Gunes Baydin.
Paladyn | 2012
Atilim Gunes Baydin
Central pattern generators (CPGs), with a basis in neurophysiological studies, are a type of neural network for the generation of rhythmic motion. While CPGs are being increasingly used in robot control, most applications are hand-tuned for a specific task, and it is acknowledged in the field that generic methods and design principles for creating individual networks for a given task are lacking. This study presents an approach where the connectivity and oscillatory parameters of a CPG network are determined by an evolutionary algorithm with fitness evaluations in a realistic simulation with accurate physics. We apply this technique to a five-link planar walking mechanism to demonstrate its feasibility and performance. In addition, to see whether results from simulation can be acceptably transferred to real robot hardware, the best evolved CPG network is also tested on a real mechanism. Our results also confirm that the biologically inspired CPG model is well suited for legged locomotion, since a diverse range of networks has been observed to succeed in fitness simulations during evolution.
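A minimal sketch of the evolutionary setup described above, assuming a generic Kuramoto-style coupled phase-oscillator CPG and a placeholder fitness function; the paper's actual fitness comes from a realistic physics simulation of the five-link mechanism, and its CPG parameterization may differ from this simplification.

```python
# Sketch: evolve connectivity and oscillatory parameters of a coupled-oscillator CPG.
# The simulator-based fitness from the paper is replaced by a toy placeholder score.
import numpy as np

N_JOINTS = 5                       # five-link planar walker: one oscillator per joint
GENOME_LEN = N_JOINTS + 2 * N_JOINTS**2
rng = np.random.default_rng(0)

def cpg_outputs(genome, t_steps=500, dt=0.01):
    """Integrate a network of coupled phase oscillators defined by the genome."""
    freqs   = genome[:N_JOINTS]                                   # intrinsic frequencies
    weights = genome[N_JOINTS:N_JOINTS + N_JOINTS**2].reshape(N_JOINTS, N_JOINTS)
    biases  = genome[-N_JOINTS**2:].reshape(N_JOINTS, N_JOINTS)   # phase biases
    phase = np.zeros(N_JOINTS)
    out = []
    for _ in range(t_steps):
        coupling = (weights * np.sin(phase[None, :] - phase[:, None] - biases)).sum(axis=1)
        phase = phase + dt * (2 * np.pi * freqs + coupling)
        out.append(np.sin(phase))          # joint command signals
    return np.array(out)

def fitness(genome):
    # Placeholder: reward sustained, centered oscillation. In the paper, fitness is
    # the distance walked by the five-link mechanism in an accurate physics simulation.
    signals = cpg_outputs(genome)
    return signals.std(axis=0).mean() - np.abs(signals.mean(axis=0)).mean()

# Simple truncation-selection evolutionary loop with Gaussian mutation.
pop = [rng.normal(size=GENOME_LEN) for _ in range(30)]
for gen in range(100):
    scored = sorted(pop, key=fitness, reverse=True)
    parents = scored[:10]
    pop = parents + [p + rng.normal(scale=0.1, size=GENOME_LEN)
                     for p in parents for _ in range(2)]
```

The best genome after evolution would then be decoded back into oscillator parameters and, as in the paper, transferred to the real mechanism for validation.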
international symposium on neural networks | 2017
Tuan Anh Le; Atilim Gunes Baydin; Robert Zinkov; Frank D. Wood
We draw a formal connection between using synthetic training data to optimize neural network parameters and approximate, Bayesian, model-based reasoning. In particular, training a neural network using synthetic data can be viewed as learning a proposal distribution generator for approximate inference in the synthetic-data generative model. We demonstrate this connection in a recognition task where we develop a novel Captcha-breaking architecture and train it using synthetic data, demonstrating both state-of-the-art performance and a way of computing task-specific posterior uncertainty. Using a neural network trained this way, we also demonstrate successful breaking of real-world Captchas currently used by Facebook and Wikipedia. Reasoning from these empirical results and drawing connections with Bayesian modeling, we discuss the robustness of synthetic data results and suggest important considerations for ensuring good neural network generalization when training with synthetic data.
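The core connection can be illustrated with a toy sketch: sampling (latent, observation) pairs from a generative model and training a network to output a distribution over the latent given the observation amounts to learning an amortized proposal q(latent | observation). The Gaussian toy model and small network below are assumptions for illustration only; the paper's setting is a Captcha renderer and a dedicated recognition architecture.

```python
# Sketch: training on synthetic data = learning a proposal for approximate inference.
import torch
import torch.nn as nn

def sample_synthetic(batch):
    """Joint samples (x, y) from a toy generative model p(x) p(y | x)."""
    x = torch.randn(batch, 1)              # latent of interest
    y = x + 0.5 * torch.randn(batch, 1)    # simulated (synthetic) observation
    return x, y

# The network maps an observation to the parameters of a proposal q(x | y).
net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    x, y = sample_synthetic(256)
    mean, log_std = net(y).chunk(2, dim=1)
    q = torch.distributions.Normal(mean, log_std.exp())
    loss = -q.log_prob(x).mean()           # maximize proposal density of the true latents
    opt.zero_grad(); loss.backward(); opt.step()

# At test time, q(x | y) gives task-specific posterior uncertainty directly, or can
# serve as a proposal distribution for importance sampling in the generative model.
y_obs = torch.tensor([[1.2]])
mean, log_std = net(y_obs).chunk(2, dim=1)
```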
congress on evolutionary computation | 2012
Atilim Gunes Baydin; Ramon López de Mántaras
This paper presents a new type of evolutionary algorithm (EA) based on the concept of the “meme”, where the individuals forming the population are represented by semantic networks and the fitness measure is defined as a function of the represented knowledge. Our work can be classified as a novel memetic algorithm (MA), given that (1) it is the units of culture, or information, that undergo variation, transmission, and selection, very close to the original sense of memetics as introduced by Dawkins; and (2) this is different from existing MAs, where the idea of memetics has been utilized as a means of local refinement by individual learning after the classical global sampling of an EA. The individual pieces of information are represented as simple semantic networks, that is, directed graphs of concepts and binary relations, which undergo variation through memetic versions of operators such as crossover and mutation that utilize knowledge from commonsense knowledge bases. In evaluating this introductory work, as an interesting fitness measure, we focus on using the structure mapping theory of analogical reasoning from psychology to evolve pieces of information that are analogous to a given base piece of information. Considering other possible fitness measures, the proposed representation and algorithm can serve as a computational tool for modeling memetic theories of knowledge, such as evolutionary epistemology and cultural selection theory.
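As a rough illustration of the representation and variation operators, the sketch below mutates a semantic network of (concept, relation, concept) triples against a tiny in-memory knowledge base; the KB contents, the growth probability, and the crossover scheme are assumptions standing in for the paper's ConceptNet- and WordNet-backed operators.

```python
# Sketch: semantic networks as triple sets, with KB-guided mutation and a toy crossover.
import random

# A tiny stand-in for a commonsense knowledge base (the paper uses ConceptNet/WordNet).
KB = {
    ("bird", "CapableOf", "fly"),
    ("bird", "HasA", "wing"),
    ("plane", "CapableOf", "fly"),
    ("plane", "HasA", "wing"),
    ("fish", "CapableOf", "swim"),
}

def mutate(network):
    """Grow the network with a KB triple sharing a concept, or drop a triple."""
    net = set(network)
    concepts = {c for (a, _, b) in net for c in (a, b)}
    candidates = [t for t in KB if (t[0] in concepts or t[2] in concepts) and t not in net]
    if candidates and random.random() < 0.7:
        net.add(random.choice(candidates))        # commonsense-consistent growth
    elif len(net) > 1:
        net.discard(random.choice(sorted(net)))   # shrink
    return net

def crossover(a, b):
    """Toy crossover: repartition the parents' combined triples into two children."""
    shared = list(set(a) | set(b))
    random.shuffle(shared)
    cut = len(shared) // 2
    return set(shared[:cut]), set(shared[cut:])

parent = {("bird", "CapableOf", "fly")}
child = mutate(parent)
```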
Evolutionary Intelligence | 2015
Atilim Gunes Baydin; Ramon López de Mántaras; Santiago Ontañón
We introduce a novel evolutionary algorithm (EA) with a semantic network-based representation. To enable this, we establish new formulations of the EA variation operators, crossover and mutation, adapted to work on semantic networks. The algorithm employs commonsense reasoning to ensure that all operations preserve the meaningfulness of the networks, using the ConceptNet and WordNet knowledge bases. The algorithm can be interpreted as a novel memetic algorithm (MA), given that (1) individuals represent pieces of information that undergo evolution, as in the original sense of memetics as introduced by Dawkins; and (2) this is different from existing MAs, where the word “memetic” has been used as a synonym for local refinement after global optimization. For evaluating the approach, we introduce an analogical similarity-based fitness measure that is computed through structure mapping. This setup enables the open-ended generation of networks analogous to a given base network.
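The analogy-based fitness can likewise be hinted at with a toy score that compares only the relational pattern of two networks while ignoring concept labels; this is an assumption-laden stand-in for full structure mapping, which also aligns higher-order relational structure rather than just relation counts.

```python
# Sketch: a crude analogy score, rewarding candidates whose relational pattern
# mirrors the base network regardless of the concepts involved.
from collections import Counter

def relation_signature(network):
    """Abstract away concept labels, keeping only the multiset of relation types."""
    return Counter(rel for (_, rel, _) in network)

def analogical_fitness(base, candidate):
    """Fraction of the base network's relational structure matched by the candidate."""
    b, c = relation_signature(base), relation_signature(candidate)
    overlap = sum((b & c).values())
    return overlap / max(sum(b.values()), 1)

base = {("bird", "CapableOf", "fly"), ("bird", "HasA", "wing")}
candidate = {("plane", "CapableOf", "fly"), ("plane", "HasA", "wing")}
print(analogical_fitness(base, candidate))   # 1.0: fully analogous relational pattern
```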
Journal of Machine Learning Research | 2015
Atilim Gunes Baydin; Barak A. Pearlmutter; Alexey Andreyevich Radul; Jeffrey Mark Siskind
international conference on artificial intelligence and statistics | 2016
Tuan Anh Le; Atilim Gunes Baydin; Frank D. Wood
arXiv: Neural and Evolutionary Computing | 2012
Atilim Gunes Baydin; Ramon López de Mántaras; Santiago Ontañón
arXiv: Learning | 2014
Atilim Gunes Baydin; Barak A. Pearlmutter
international conference on learning representations | 2018
Atilim Gunes Baydin; Robert Cornish; David Martinez Rubio; Mark W. Schmidt; Frank D. Wood
arXiv: Mathematical Software | 2016
Atilim Gunes Baydin; Barak A. Pearlmutter; Jeffrey Mark Siskind