Publications


Featured research published by Ricardo Ron-Angevin.


Neuroscience Letters | 2009

Brain–computer interface: Changes in performance using virtual reality techniques

Ricardo Ron-Angevin; A. Diaz-Estrella

The ability to control electroencephalographic (EEG) signals when different mental tasks are carried out would provide a method of communication for people with serious motor function problems. Such a system is known as a brain-computer interface (BCI). Because controlling one's own EEG signals is difficult, a suitable training protocol is required to motivate subjects, and some type of visual feedback must be provided so that subjects can see their progress. Conventional feedback systems are based on simple visual presentations, such as the extension of a horizontal bar. Virtual reality, however, is a powerful tool whose graphical possibilities can improve BCI feedback presentation. The objective of this study is to explore the advantages of feedback based on virtual reality techniques compared to conventional feedback. Sixteen untrained subjects, divided into two groups, participated in the experiment. One group was trained with a BCI system that uses conventional feedback (bar extension), and the other with a BCI system that places subjects in a more familiar environment, namely controlling a car to avoid obstacles. The results suggest that EEG behaviour can be modified via feedback presentation. Significant differences in classification error rates between the two interfaces were obtained during the feedback period, confirming that an interface based on virtual reality techniques can improve feedback control, particularly for untrained subjects.


Neurocomputing | 2013

Audio-cued motor imagery-based brain-computer interface: Navigation through virtual and real environments

Francisco Velasco-Álvarez; Ricardo Ron-Angevin; Leandro da Silva-Sauer; Salvador Sancha-Ros

The aim of this work is to provide a navigation paradigm that could be used to control a wheelchair through a brain-computer interface (BCI). In such a case, it is desirable to control the system without a graphical interface so that it remains usable for people without gaze control. Thus, an audio-cued paradigm with several navigation commands is proposed. In order to reduce the probability of misclassification, the BCI operates with only two mental tasks: a relaxed state versus imagination of right-hand movements; the use of motor imagery for navigation control is not yet widespread among auditory BCIs. Two experiments are described: in the first, users practice the switch from a graphical to an audio-cued interface with a virtual wheelchair; in the second, they move from virtual to real environments. The results support the use of the proposed interface to control a real wheelchair without the need for a screen to provide visual stimuli or feedback.
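
The two-class control scheme described in this abstract can be illustrated with a short sketch. The snippet below is a minimal, hypothetical illustration rather than the authors' implementation: it assumes an external EEG classifier that returns one of two labels per epoch and a placeholder audio routine, and it announces each candidate navigation command with an audio cue, selecting it only when right-hand motor imagery is detected.

```python
# Minimal sketch of an audio-cued, two-class selection loop in the spirit of
# the paradigm above. classify_epoch() and play_audio_cue() are hypothetical
# placeholders: the first stands in for the EEG classifier (relaxed state vs.
# right-hand motor imagery), the second for the auditory interface.
from typing import Callable

NAV_COMMANDS = ["move_forward", "turn_right", "turn_left"]  # assumed command set

def select_command(classify_epoch: Callable[[], str],
                   play_audio_cue: Callable[[str], None]) -> str:
    """Cycle through audio cues until right-hand motor imagery is detected."""
    while True:
        for command in NAV_COMMANDS:
            play_audio_cue(command)       # announce the candidate command
            label = classify_epoch()      # returns "relax" or "right_hand_mi"
            if label == "right_hand_mi":  # imagery accepts the cued command
                return command
            # a "relax" result skips to the next cued command
```

Keeping the classifier binary, as the abstract argues, trades command throughput for a lower probability of misclassification.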


Presence: Teleoperators & Virtual Environments | 2010

Free virtual navigation using motor imagery through an asynchronous brain-computer interface

Francisco Velasco-Álvarez; Ricardo Ron-Angevin; María José Blanca-Mena

In this paper, an asynchronous brain-computer interface is presented that enables the control of a wheelchair in virtual environments using only one motor imagery task. Control is achieved through a graphical intentional control interface with three navigation commands (move forward, turn right, and turn left) displayed around a circle. A bar rotates in the center of the circle, pointing successively at the three possible commands. By performing motor imagery, the user can extend the bar to select the command it is pointing at. Once a command is selected, the virtual wheelchair moves continuously, so the user controls the distance advanced or the amplitude of the turns. Users can voluntarily switch from this interface to a non-control interface (and vice versa) when they do not want to generate any command. After cue-based feedback training, three subjects carried out an experiment in which they had to navigate the same fixed path to reach an objective. The results obtained support the viability of the system.
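
As a rough illustration of the rotating-bar intentional control interface described above, the following sketch (an assumption-laden simplification, not the published implementation) cycles a pointer over the three commands and uses a sustained motor-imagery level, here a hypothetical get_mi_score() returning a value in [0, 1], to "extend the bar", select the pointed command, and keep it executing for as long as the imagery is maintained.

```python
# Minimal sketch of the rotating-bar selection mechanism: a pointer dwells on
# each command in turn; a sustained motor-imagery score selects the pointed
# command and keeps it running while the score stays high. get_mi_score() is
# a hypothetical stand-in for the normalized output of the EEG classifier.
import time

COMMANDS = ["move_forward", "turn_right", "turn_left"]
DWELL_S = 2.0           # assumed time the pointer rests on each command
SELECT_THRESHOLD = 0.7  # assumed motor-imagery level that "extends the bar"

def get_mi_score() -> float:
    """Hypothetical placeholder for the normalized motor-imagery output."""
    raise NotImplementedError

def intentional_control_loop(send_command):
    idx = 0
    while True:
        pointed = COMMANDS[idx]
        t0 = time.time()
        selected = False
        while time.time() - t0 < DWELL_S:
            if get_mi_score() > SELECT_THRESHOLD:
                selected = True
                break
            time.sleep(0.05)
        if selected:
            # Continuous execution: the command is applied for as long as the
            # motor-imagery level stays high, so the user controls the length
            # of the advance or the amplitude of the turn.
            while get_mi_score() > SELECT_THRESHOLD:
                send_command(pointed)
                time.sleep(0.05)
        idx = (idx + 1) % len(COMMANDS)  # pointer rotates to the next command
```

The dwell time and threshold here are illustrative parameters; in the paradigm described by the abstract, these roles are played by the bar's rotation over the circle and the length the bar must reach for a selection.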


Ambient Intelligence | 2009

Asynchronous Brain-Computer Interface to Navigate in Virtual Environments Using One Motor Imagery

Francisco Velasco-Álvarez; Ricardo Ron-Angevin

A brain-computer interface (BCI) application focused on the control of a wheelchair must consider the danger that a wrong command would involve in a real situation. Virtual reality is a suitable tool for giving subjects the opportunity to train with and test the application before using it under real conditions. Recent studies aimed at such control let the subject decide the timing of the interaction; these are the so-called asynchronous BCIs. One way to reduce the probability of misclassification is to achieve control with only two different mental tasks. The system presented in this paper combines these advantages in a paradigm that enables the control of a virtual wheelchair with three commands: move forward, turn left and turn right. The results obtained with three subjects support the viability of the proposed system.


International Conference on Artificial Neural Networks | 2011

Audio-cued SMR brain-computer interface to drive a virtual wheelchair

Francisco Velasco-Álvarez; Ricardo Ron-Angevin; Leandro da Silva-Sauer; Salvador Sancha-Ros; María José Blanca-Mena

In this work, an electroencephalographic analysis-based, self-paced (asynchronous) brain-computer interface (BCI) is proposed to control a virtual wheelchair using three different navigation commands: turn right, turn left and move forward. In order to reduce the probability of misclassification, the BCI is controlled with only two mental tasks (relaxed state versus imagination of right-hand movements), using an audio-cued interface. Six healthy subjects participated in the experiment. After two training sessions controlling a wheelchair in a virtual environment using both a visual and an auditory interface, all subjects successfully controlled the wheelchair in the last session, where the interface was only auditory. The results support the use of the proposed interface to control a real wheelchair without the need for a screen to provide visual stimuli or feedback.


Ambient Intelligence | 2009

The Training Issue in Brain-Computer Interface: A Multi-disciplinary Field

Ricardo Ron-Angevin; Francisco J. Pelayo

A tough question to address is who can be considered the inventor of the brain-computer interface. Many researchers have contributed to the evolution of this concept, from the basic understanding of neurons, synaptic connections and rudimentary EEG measurement systems up to the use of virtual reality environments with modern wireless devices. In between, a series of devoted people have worked with the honourable aim of providing a better quality of life to disabled patients. In the course of this evolution, a key point is the use of efficient mutual training techniques for both the patient and the system. In this paper we introduce the multi-disciplinary nature of brain-computer interfaces, and then focus on the training techniques used in those based on the sensorimotor rhythm.


International Conference on Artificial Neural Networks | 2011

Auditory brain-computer interfaces for complete locked-in patients

M. A. Lopez-Gordo; Ricardo Ron-Angevin; Francisco Pelayo Valle

Brain-computer interfaces (BCIs) are intended for people unable to make any muscular movement, such as complete locked-in patients. Most BCIs make use of visual interaction with the user, either in the form of stimulation or of biofeedback. However, visual BCIs work against this ultimate purpose because they require subjects to gaze, explore and coordinate their eyes using their muscles, thus ruling out complete locked-in patients. Although auditory BCIs overcome the problems of visual ones, there are not many examples of them in the BCI literature. In this paper we review the research and main contributions to auditory BCIs and compare them with visual BCIs, especially with regard to communication with complete locked-in patients.


IEEE International Conference on Rehabilitation Robotics | 2011

A two-class self-paced BCI to control a robot in four directions

Ricardo Ron-Angevin; Francisco Velasco-Álvarez; Salvador Sancha-Ros; Leandro da Silva-Sauer

In this work, an electroencephalographic analysis-based, self-paced (asynchronous) brain-computer interface (BCI) is proposed to control a mobile robot using four different navigation commands: turn right, turn left, move forward and move back. In order to reduce the probability of misclassification, the BCI is controlled with only two mental tasks (relaxed state versus imagination of right-hand movements), using an audio-cued interface. Four healthy subjects participated in the experiment. After two sessions controlling a simulated robot in a virtual environment (which allowed the users to become familiar with the interface), three subjects successfully moved the robot in a real environment. The results show that the proposed interface enables control over the robot, even for subjects with low BCI performance.


Journal of Neural Engineering | 2016

Review of real brain-controlled wheelchairs

Álvaro Fernández-Rodríguez; Francisco Velasco-Álvarez; Ricardo Ron-Angevin

This paper presents a review of the state of the art regarding wheelchairs driven by a brain-computer interface. Using a brain-controlled wheelchair (BCW), disabled users could handle a wheelchair through their brain activity, granting them the autonomy to move through an experimental environment. A classification is established based on the characteristics of the BCW, such as the type of electroencephalographic signal used, the navigation system employed by the wheelchair, the task set for the participants, or the metrics used to evaluate performance. Furthermore, these factors are compared according to the type of signal used, in order to clarify the differences among them. Finally, the trend of current research in this field is discussed, as well as the challenges that should be solved in the future.


International IEEE/EMBS Conference on Neural Engineering | 2015

Wheelchair navigation with an audio-cued, two-class motor imagery-based brain-computer interface system

Sergio Varona-Moya; Francisco Velasco-Álvarez; Salvador Sancha-Ros; Álvaro Fernández-Rodríguez; María J. Blanca; Ricardo Ron-Angevin

Driving a real wheelchair by means of a brain-computer interface (BCI) system must be a reliable option for locked-in patients. Such navigation should also be autonomous, i.e., not dependent on a ground chart. In this work we test the feasibility of driving a customized robotic wheelchair with a BCI system that our group has used in previous studies with virtual and real mobile robots. The results obtained from a sample of three healthy, naïve participants suggest that it is an effective option, one which could ultimately provide locked-in patients with greater autonomy and quality of life.

Collaboration


Dive into Ricardo Ron-Angevin's collaborations.

Top Co-Authors


Jean Marc André

Centre national de la recherche scientifique
