Antonio Rios-Navarro
University of Seville
Publications
Featured research published by Antonio Rios-Navarro.
Distributed Computing and Artificial Intelligence | 2016
Elena Cerezuela-Escudero; Antonio Rios-Navarro; Juan Pedro Dominguez-Morales; Ricardo Tapiador-Morales; Daniel Gutierrez-Galan; C. Martín-Cañal; Alejandro Linares-Barranco
The study and monitoring of wildlife has always been a subject of great interest. Studying the behavior of wild animals is a complex task due to the difficulty of tracking them and classifying their behaviors from the collected sensory information. Novel technologies allow the design of low-cost systems that facilitate these tasks. There are currently some commercial solutions to this problem; however, they cannot achieve highly accurate classification due to the limited information they gather. In this work, we propose an animal behavior recognition, classification and monitoring system based on a smart collar device equipped with inertial sensors and a feed-forward neural network, or Multi-Layer Perceptron (MLP), that classifies the possible animal behaviors from the collected sensory information. Experimental results on a horse gait case study show that the recognition system achieves an accuracy of up to 95.6%.
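As a rough illustration of the kind of classifier the abstract describes, the following NumPy sketch runs a small feed-forward MLP over a window of IMU features. The layer sizes, feature set and gait labels are illustrative assumptions, not the architecture reported in the paper.

```python
# Minimal sketch of an MLP gait classifier over IMU features (NumPy only).
# Layer sizes, features and class labels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MLP:
    def __init__(self, n_in, n_hidden, n_out):
        # Small random weights; a real system would load trained weights.
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = relu(x @ self.W1 + self.b1)
        return softmax(h @ self.W2 + self.b2)

# Hypothetical feature vector: mean and variance of each accelerometer
# axis over a sliding window, i.e. 6 features per window.
GAITS = ["standing", "walking", "trotting", "galloping"]
mlp = MLP(n_in=6, n_hidden=16, n_out=len(GAITS))

window = rng.normal(0, 1, (1, 6))   # one window of IMU features
probs = mlp.forward(window)
print(GAITS[int(probs.argmax())], probs)
```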
International Conference on Computer Information and Telecommunication Systems | 2015
Ricardo Tapiador-Morales; Antonio Rios-Navarro; Angel Jiménez-Fernandez; Juan Pedro Dominguez-Morales; Alejandro Linares-Barranco
A sensor network integrates multiple sensors into a system that collects information about different environment variables. Monitoring systems allow us to determine the current state of a system, understand its behavior and, sometimes, predict what is going to happen. This work presents a monitoring system for semi-wild animals that captures their actions using an IMU (inertial measurement unit) and a sensor fusion algorithm. Based on an ARM Cortex-M4 microcontroller, the system transmits data from the different sensor axes over ZigBee in two operating modes: RAW (logging all information to an SD card) or RT (real-time operation). The sensor fusion algorithm improves precision and reduces noise interference.
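The abstract does not name the fusion algorithm; a complementary filter is one common, lightweight choice for a Cortex-M4-class device, sketched below with an assumed sampling rate and blending factor.

```python
# Illustrative complementary filter fusing gyroscope and accelerometer
# readings into a pitch estimate. This is one common option for a small
# MCU, not necessarily the algorithm used in the paper.
import math

def complementary_pitch(pitch_prev, gyro_rate, ax, az, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth, but drifts) with the
    accelerometer-derived angle (noisy, but drift-free)."""
    pitch_gyro = pitch_prev + gyro_rate * dt   # integrate angular rate
    pitch_acc = math.atan2(ax, az)             # gravity-referenced angle
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_acc

# Toy stream: constant 0.1 rad/s rotation sampled at 100 Hz.
pitch = 0.0
for step in range(5):
    pitch = complementary_pitch(pitch, gyro_rate=0.1,
                                ax=math.sin(pitch), az=math.cos(pitch),
                                dt=0.01)
    print(f"step {step}: pitch = {pitch:.4f} rad")
```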
International Conference on Artificial Neural Networks | 2016
Juan Pedro Dominguez-Morales; Angel Jiménez-Fernandez; Antonio Rios-Navarro; Elena Cerezuela-Escudero; Daniel Gutierrez-Galan; M. Domínguez-Morales; Gabriel Jiménez-Moreno
Audio classification has always been an interesting subject of research in the neuromorphic engineering field. Tools like Nengo and Brian, and hardware platforms like the SpiNNaker board, are rapidly increasing in popularity in the neuromorphic community because of the ease of modelling spiking neural networks with them. In this manuscript, a multilayer spiking neural network for audio sample classification using SpiNNaker is presented. The network consists of different leaky integrate-and-fire neuron layers. The connections between them are trained using novel firing-rate-based algorithms and tested using sets of pure tones with frequencies ranging from 130.813 Hz to 1396.91 Hz. The hit rate percentages are obtained after adding a random noise signal to the original pure tone signal. The network achieves very good classification results (above 85% hit rate) for each class when the signal-to-noise ratio is above 3 decibels, validating the robustness of the network configuration and the training step.
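To make the neuron model concrete, here is a minimal leaky integrate-and-fire layer in NumPy; the actual network runs on SpiNNaker, and the time constant, threshold and weights below are assumptions for illustration only.

```python
# Minimal leaky integrate-and-fire (LIF) layer. Parameters (tau, v_th,
# weights) are illustrative assumptions, not the paper's values.
import numpy as np

def simulate_lif(input_spikes, weights, tau=20.0, v_th=1.0, dt=1.0):
    """input_spikes: (T, n_in) 0/1 array; weights: (n_in, n_out).
    Returns a (T, n_out) 0/1 array of output spikes."""
    T, _ = input_spikes.shape
    n_out = weights.shape[1]
    v = np.zeros(n_out)
    out = np.zeros((T, n_out))
    decay = np.exp(-dt / tau)
    for t in range(T):
        v = v * decay + input_spikes[t] @ weights  # leak + input current
        fired = v >= v_th
        out[t] = fired
        v[fired] = 0.0                             # reset after a spike
    return out

rng = np.random.default_rng(1)
spikes_in = (rng.random((100, 8)) < 0.2).astype(float)  # Poisson-like input
w = rng.random((8, 4)) * 0.5
spikes_out = simulate_lif(spikes_in, w)
print("output firing rates:", spikes_out.mean(axis=0))
```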
Neurocomputing | 2018
Daniel Gutierrez-Galan; Juan Pedro Dominguez-Morales; Elena Cerezuela-Escudero; Antonio Rios-Navarro; Ricardo Tapiador-Morales; Manuel Rivas-Perez; M. Domínguez-Morales; Angel Jiménez-Fernandez; Alejandro Linares-Barranco
Recent biological studies have focused on understanding animal interactions and welfare. To help biologists obtain information on animal behavior, resources like wireless sensor networks are needed. Moreover, large amounts of collected data have to be processed off-line in order to classify the different behaviors. Recent research projects have designed monitoring systems capable of measuring some animal parameters in order to recognize and monitor their gaits or behaviors. However, network unreliability and high power consumption have limited their applicability. In this work, we present an animal behavior recognition, classification and monitoring system based on a wireless sensor network and a smart collar device, equipped with inertial sensors and an embedded multi-layer perceptron-based feed-forward neural network, that classifies the different gaits or behaviors from the collected information. In similar works, the classification mechanism is implemented on a server (or base station). The main novelty of this work is the full implementation of a reconfigurable neural network embedded in the animal's collar, which allows real-time behavior classification and local storage on an SD memory card. Moreover, this approach reduces the amount of data transmitted to the base station (and its periodicity), significantly improving battery life. The system has been simulated and tested in a real scenario for three different horse gaits, using different heuristics and sensors to improve the accuracy of behavior recognition, achieving a maximum accuracy of 81%.
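Since the novelty is running the network on the collar itself, the sketch below shows what an integer-only (fixed-point) MLP forward pass might look like on a small microcontroller, transmitting only the winning class index. The Q8.8 format and layer sizes are assumptions, not the paper's reported implementation.

```python
# Sketch of an integer-only (Q8.8 fixed-point) MLP forward pass of the
# kind that could run on a collar MCU. Quantization scheme and sizes are
# assumptions for illustration.
FRAC_BITS = 8
SCALE = 1 << FRAC_BITS

def to_fixed(x):
    return int(round(x * SCALE))

def fixed_forward(x_q, W1_q, W2_q):
    # hidden = relu(x @ W1); Q8.8 * Q8.8 products are Q16.16, shift back.
    hidden = []
    for j in range(len(W1_q[0])):
        acc = sum(x_q[i] * W1_q[i][j] for i in range(len(x_q))) >> FRAC_BITS
        hidden.append(max(0, acc))
    out = []
    for k in range(len(W2_q[0])):
        acc = sum(hidden[j] * W2_q[j][k] for j in range(len(hidden))) >> FRAC_BITS
        out.append(acc)
    return out.index(max(out))   # transmit only the class index

x_q = [to_fixed(v) for v in (0.5, -0.2, 1.0)]
W1_q = [[to_fixed(0.1)] * 4 for _ in range(3)]   # 3 inputs -> 4 hidden
W2_q = [[to_fixed(0.2)] * 3 for _ in range(4)]   # 4 hidden -> 3 classes
print("predicted class:", fixed_forward(x_q, W1_q, W2_q))
```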
International Symposium on Circuits and Systems | 2016
Diederik Paul Moeys; Tobias Delbrück; Antonio Rios-Navarro; Alejandro Linares-Barranco
This paper describes the software and FPGA implementations of a Retinal Ganglion Cell model that detects moving objects. It is shown how this processing, with a Dynamic Vision Sensor as its input, can be used to extract information about object position. In software, a system based on an array of these RGCs has been developed to obtain up to two trackers, which can track objects in a scene from a still observer and are inhibited when saccadic camera motion occurs. The entire processing takes 1000 ns/event on average. A simplified version of this mechanism, with a mean latency of 330 ns/event at 50 MHz, has also been implemented on a Spartan-6 FPGA.
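A toy, per-event version of the tracking idea: each DVS event nudges the nearest tracker toward the event address. The update rate, capture radius and two-tracker setup are illustrative assumptions, not the cell model from the paper.

```python
# Toy event-driven tracker in the spirit of the RGC-array approach: each
# DVS event pulls the nearest tracker toward the event address.
from dataclasses import dataclass

@dataclass
class Tracker:
    x: float
    y: float

def process_event(trackers, ex, ey, rate=0.05, radius=30.0):
    """Move the closest tracker a small step toward event (ex, ey),
    but only if the event falls within the tracker's capture radius."""
    best = min(trackers, key=lambda t: (t.x - ex) ** 2 + (t.y - ey) ** 2)
    if (best.x - ex) ** 2 + (best.y - ey) ** 2 <= radius ** 2:
        best.x += rate * (ex - best.x)
        best.y += rate * (ey - best.y)

trackers = [Tracker(40, 40), Tracker(90, 90)]
events = [(42, 41), (44, 43), (88, 92), (46, 45)]  # (x, y) DVS addresses
for ex, ey in events:
    process_event(trackers, ex, ey)
print([(round(t.x, 1), round(t.y, 1)) for t in trackers])
```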
International Conference on Artificial Neural Networks | 2016
Antonio Rios-Navarro; Juan Pedro Dominguez-Morales; Ricardo Tapiador-Morales; M. Domínguez-Morales; Angel Jiménez-Fernandez; Alejandro Linares-Barranco
The study and monitoring of wildlife behavior has always been a subject of great interest. Although many systems can track animal positions using GPS, behavior classification is not a common task. For this work, a multi-sensory wearable device has been designed and implemented to monitor wild and semi-wild animals in the Doñana National Park. The data obtained with these sensors is processed using a Spiking Neural Network (SNN), with Address-Event-Representation (AER) coding, and classified into a set of fixed behaviors. This work presents the full infrastructure deployed in Doñana to collect the data, the wearable device, the SNN implementation on SpiNNaker and the classification results.
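For readers unfamiliar with AER coding, the sketch below decodes an event word into an x/y address and polarity; the bit layout follows a common DVS128-style packing and is an assumption, not this project's exact format.

```python
# Sketch of Address-Event-Representation (AER) decoding: each event is an
# (address, timestamp) pair. The bit layout below is a common DVS128-style
# packing, assumed here for illustration.
def decode_aer(word):
    polarity = word & 0x1          # bit 0: ON/OFF polarity
    x = (word >> 1) & 0x7F         # bits 1-7: x address (0-127)
    y = (word >> 8) & 0x7F         # bits 8-14: y address (0-127)
    return x, y, polarity

# A spike train becomes a stream of (word, timestamp_us) tuples.
stream = [(0b0101010_0110011_1, 1000), (0b0000001_0000010_0, 1010)]
for word, ts in stream:
    x, y, pol = decode_aer(word)
    print(f"t={ts} us  x={x:3d} y={y:3d} pol={pol}")
```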
International Conference on Event-Based Control, Communication and Signal Processing | 2015
Antonio Rios-Navarro; Elena Cerezuela-Escudero; M. Domínguez-Morales; Angel Jiménez-Fernandez; Gabriel Jiménez-Moreno; Alejandro Linares-Barranco
Multisensory integration is commonly used in various robotic areas to collect more environmental information using different and complementary types of sensors. Neuromorphic engineers mimic the behavior of biological systems to improve performance in solving engineering problems with low power consumption. This work presents a completely event-based neuromorphic sensory integration scenario for measuring the rotation frequency of a motor using an AER DVS128 retina chip (Dynamic Vision Sensor) and a stereo auditory system on an FPGA. Both sensors transmit information using Address-Event-Representation (AER). The integration system uses a new AER monitor hardware interface, based on a Spartan-6 FPGA, that allows two operational modes: real-time (up to 5 Mevps through USB 2.0) and data-logger mode (up to 20 Mevps, for 33.5 Mev stored in onboard DDR RAM). The sensory integration reduces the prediction error of the motor's rotation speed, since the audio processing provides a coarse range of rpm while the DVS can be much more accurate.
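One way to picture the fusion: the auditory system yields a coarse rpm band, and the DVS refines it from the interval between event bursts produced by, say, a marker on the rotating shaft. All numbers below are made up for illustration; this is not the paper's estimation pipeline.

```python
# Illustrative fusion of a coarse audio-derived rpm range with a fine
# DVS-derived estimate from inter-burst intervals.
def rpm_from_dvs(burst_timestamps_us):
    """Fine estimate, assuming one event burst per shaft revolution."""
    intervals = [b - a for a, b in zip(burst_timestamps_us,
                                       burst_timestamps_us[1:])]
    mean_period_s = (sum(intervals) / len(intervals)) * 1e-6
    return 60.0 / mean_period_s

def fuse(rpm_fine, audio_range):
    """Accept the DVS estimate only if it agrees with the audio band."""
    lo, hi = audio_range
    return rpm_fine if lo <= rpm_fine <= hi else (lo + hi) / 2.0

bursts = [0, 20_000, 40_100, 60_050]   # ~20 ms per revolution, in us
audio_range = (2800, 3200)             # coarse rpm band from the cochlea
print(f"fused rpm: {fuse(rpm_from_dvs(bursts), audio_range):.0f}")
```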
International Conference on Event-Based Control, Communication and Signal Processing | 2016
Antonio Rios-Navarro; Juan Pedro Dominguez-Morales; Ricardo Tapiador-Morales; Daniel Gutierrez-Galan; Angel Jiménez-Fernandez; Alejandro Linares-Barranco
Neuromorphic systems are engineering solutions that take inspiration from biological neural systems. They use a spike- or event-based representation and codification of information. This codification allows complex computations, filters, classifications and learning to be performed in a pseudo-simultaneous way: a small incremental processing step is done per event, which yields useful results with very low latencies. Developing this kind of system therefore requires specialized tools for debugging and testing the flows of events. This paper presents a set of FPGA logic implementations that assist in the development and debugging of event-based systems. Address-Event-Representation (AER) is a communication protocol for transferring events/spikes between bio-inspired chips/systems. Real-time monitoring and sequencing; logging and playing back long sequences of events/spikes to and from memory; and several merging and splitting ports are the main requirements when developing these systems. These functionalities and their implementations are explained and tested in this work. The logic has been evaluated on an Opal Kelly XEM6010 acting as a daughter board for the AER-Node platform. It achieves a peak rate of 20 Mevps when logging, with a total logging capacity of 32 Mev on DDR, when debugging an AER system on the AER-Node or a set of them connected in a daisy chain.
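As a software analogue of the merger functionality described above, the sketch below combines two timestamp-sorted AER streams into one time-ordered stream; the stream contents and addresses are illustrative, not the FPGA block's interface.

```python
# Software analogue of an AER merger block: combine two timestamp-sorted
# event streams into one time-ordered stream.
import heapq

def merge_aer(stream_a, stream_b):
    """Each stream is an iterable of (timestamp_us, address) tuples,
    already sorted by timestamp; yield a single time-ordered stream."""
    yield from heapq.merge(stream_a, stream_b)

retina = [(100, 0x12), (250, 0x13), (400, 0x12)]
cochlea = [(150, 0x80), (300, 0x81)]
for ts, addr in merge_aer(retina, cochlea):
    print(f"t={ts:4d} us  addr=0x{addr:02X}")
```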
International Symposium on Circuits and Systems | 2015
Antonio Rios-Navarro; Elena Cerezuela-Escudero; M. Domínguez-Morales; Angel Jiménez-Fernandez; Gabriel Jiménez-Moreno; Alejandro Linares-Barranco
Multisensory integration is commonly used in various robotic areas to collect more information from an environment using different and complementary types of sensors. This demonstration presents a scenario where the rotation frequency of a motor is obtained using an AER DVS128 retina chip (Dynamic Vision Sensor) and a frequency-decomposing auditory system on an FPGA that mimics a biological cochlea. Both are spike-based sensors with Address-Event-Representation (AER) outputs. A new AER monitor hardware interface, based on a Spartan-6 FPGA, allows two operational modes: real-time (up to 5 Mevps through USB 2.0) and off-line mode (up to 20 Mevps, with 33.5 Mev stored in DDR RAM). The sensory integration allows the bio-inspired cochlea to provide a coarse range of rpm values, which is then refined using the silicon retina.
Entropy | 2018
Alejandro Linares-Barranco; Hongjie Liu; Antonio Rios-Navarro; Francisco Gomez-Rodriguez; Diederik Paul Moeys; Tobi Delbruck
Taking inspiration from biology to solve engineering problems using the organizing principles of biological neural computation is the aim of the field of neuromorphic engineering. This field has demonstrated success in sensor-based applications (vision and audition) as well as in cognition and actuation. This paper focuses on mimicking the approach-detection functionality of the retina, which is computed by one type of Retinal Ganglion Cell (RGC), and its application to robotics. These RGCs transmit action potentials when an expanding object is detected. In this work we compare software and hardware (FPGA logic) implementations of this approach-detection function, and measure the hardware latency when applied to robots as an attention/reaction mechanism. The visual input for these cells comes from an asynchronous event-driven Dynamic Vision Sensor, which leads to an end-to-end event-based processing system. The software model has been developed in Java and runs with an average processing time of 370 ns per event on a NUC embedded computer. The output firing rate for an approaching object depends on the cell parameters that represent the number of input events needed to reach the firing threshold. For the hardware implementation, on a Spartan-6 FPGA, the processing time is reduced to 160 ns/event with the clock running at 50 MHz. The entropy has been calculated to demonstrate that, because of several bio-inspired characteristics, the system's response to approaching objects is not totally deterministic. A Summit XL mobile robot has been measured to react to an approaching object in 90 ms, which can be used as an attentional mechanism. This is faster than similar event-based approaches in robotics and comparable to human reaction latencies to visual stimuli.
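A toy version of the approach-sensitive cell: ON events (as an expanding, darkening edge would produce) charge a membrane variable, OFF events inhibit it, and the cell fires on reaching threshold. The gains, leak and threshold are illustrative assumptions, not the paper's fitted cell parameters.

```python
# Toy approach-sensitive RGC: ON events excite, OFF events inhibit, and
# the cell fires when the membrane variable crosses a threshold.
def rgc_approach(events, w_on=1.0, w_off=0.5, threshold=20.0, leak=0.99):
    """events: iterable of (timestamp_us, polarity), polarity 1=ON, 0=OFF.
    Returns the timestamps at which the cell fires."""
    v, spikes = 0.0, []
    for ts, pol in events:
        v = v * leak + (w_on if pol else -w_off)
        if v >= threshold:
            spikes.append(ts)
            v = 0.0                     # reset after firing
    return spikes

# Mostly-ON burst, as an expanding object would produce:
burst = [(t, 1 if t % 5 else 0) for t in range(100)]
print("fire times:", rgc_approach(burst))
```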