Eduardo Iáñez
Universidad Miguel Hernández de Elche
Publications
Featured research published by Eduardo Iáñez.
Expert Systems With Applications | 2012
J. L. Sirvent Blasco; Eduardo Iáñez; Andrés Úbeda; José Maria Azorín
This paper describes a brain-computer interface (BCI) based on electroencephalography (EEG) that has been developed to assist disabled people. The BCI uses the evoked potentials paradigm (through detection of the P300 and N2PC waves), registering the EEG signals with 16 electrodes over the scalp. Three applications have been developed using this BCI paradigm. The first is an Internet browser that allows users to access the Internet and control a computer. The second allows controlling a robotic arm in order to manipulate objects. The third is a basic communication tool that allows severely disabled people to interact with other people using basic commands related to emotions and needs. All the applications are composed of visual interfaces that show different options related to the application. These options flicker pseudo-randomly on a screen. In order to select a specific command, the user must focus on the desired option. The BCI obtains the desired option by detecting the P300 and N2PC waves produced as an automatic response of the brain to attended visual stimuli, and then classifying these signals. Different experiments with volunteers have been carried out to validate the applications. The experimental results obtained, as well as the improvement achieved by using both types of evoked potentials, are shown in the paper.
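The selection step described here, detecting which flickering option elicited an evoked response, typically reduces to cutting stimulus-locked epochs, averaging them per option, and scoring the average. The sketch below illustrates that idea; the sampling rate, epoch window, and peak-amplitude scoring are assumptions for illustration, not the authors' actual pipeline (which also exploits the N2PC wave and a trained classifier).

```python
import numpy as np

FS = 256          # assumed sampling rate in Hz
EPOCH_S = 0.8     # assumed epoch length after each stimulus onset, in seconds

def extract_epochs(eeg, onsets, fs=FS, epoch_s=EPOCH_S):
    """Cut fixed-length epochs from a (channels x samples) EEG array,
    one epoch per stimulus onset (given as sample indices)."""
    n = int(fs * epoch_s)
    return np.stack([eeg[:, t:t + n] for t in onsets if t + n <= eeg.shape[1]])

def p300_scores(eeg, onsets_per_option, fs=FS):
    """Average the epochs of each flickering option and score it by the
    peak amplitude in the 250-500 ms window where a P300 is expected.
    The option the user attends to should yield the largest score."""
    lo, hi = int(0.25 * fs), int(0.50 * fs)
    scores = {}
    for option, onsets in onsets_per_option.items():
        avg = extract_epochs(eeg, onsets, fs).mean(axis=0)  # channels x samples
        scores[option] = avg[:, lo:hi].max()
    return scores

# Toy usage: random "EEG" with 16 channels, two options flashed 10 times each.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((16, 60 * FS))
onsets = {"browser": rng.integers(0, 59 * FS, 10),
          "robot": rng.integers(0, 59 * FS, 10)}
scores = p300_scores(eeg, onsets)
selected = max(scores, key=scores.get)   # the option with the strongest response
```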
IEEE-ASME Transactions on Mechatronics | 2011
Andrés Úbeda; Eduardo Iáñez; José Maria Azorín
This paper describes a new portable and wireless interface based on electrooculography (EOG) aimed at people with severe motor disorders. The interface detects the movement of the eyes by measuring the potential between the cornea and the retina, registered through five electrodes placed around the eyes of the user. A processing algorithm of the EOG signals has been developed to detect the eye movements. This interface has many advantages compared to commercial devices: it is a cheap, small-sized device with USB compatibility, and it does not need mains power since it works on batteries and the USB supply. Several experiments have been done to test the electronics of the interface. A first set of experiments was performed to obtain the eye movements of the user by processing the signals provided by the interface. In addition, the interface has been used to control a real robot arm. The accuracy and time taken were measured, showing that the user is capable of controlling the robot using only his/her eyes with satisfactory results.
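A derivative-threshold rule is one common way to turn the cornea-retina potential into discrete eye-movement events. The sketch below illustrates that idea under an assumed sampling rate and threshold; the abstract does not specify the paper's actual processing algorithm.

```python
import numpy as np

FS = 100  # assumed EOG sampling rate in Hz

def detect_eye_movements(h_eog, v_eog, fs=FS, thresh=80.0):
    """Minimal derivative-threshold saccade detection.
    h_eog / v_eog: horizontal and vertical EOG channels in microvolts
    (differential signals from the periocular electrodes).
    Returns a sorted list of (sample_index, direction) events."""
    events = []
    for pos_dir, neg_dir, sig in (("right", "left", h_eog), ("up", "down", v_eog)):
        d = np.diff(sig) * fs                       # approximate derivative, uV/s
        for i in np.flatnonzero(np.abs(d) > thresh):
            events.append((int(i), pos_dir if d[i] > 0 else neg_dir))
    return sorted(events)

# Toy usage: a synthetic rightward saccade at sample 50.
h = np.zeros(200); h[50:] = 150.0                   # step in the horizontal channel
v = np.zeros(200)
print(detect_eye_movements(h, v))                   # -> [(49, 'right')]
```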
Neurocomputing | 2015
Enrique Hortal; Daniel Planelles; Álvaro Costa; Eduardo Iáñez; Andrés Úbeda; José Maria Azorín; Eduardo Fernández
Human-Machine Interfaces can be very useful to improve the quality of life of physically impaired users. In this work, a non-invasive spontaneous Brain-Machine Interface (BMI) has been designed to control a robot arm through the mental activity of the users. This BMI classifies four mental tasks in order to manage the movements of the robot. The high accuracy in the classification of these tasks (around 70%) allows quick accomplishment of the designed experiment, even with the low signal-to-noise ratio of this kind of signal. The experiment consists of reaching four points in the workspace of an industrial robot in an established order. After a brief training, the volunteers are able to control the robot in a real-time activity. The real-time test shows that the system could be applied to more complex activities, such as pick-and-place tasks, if a supplementary system is added. These interfaces are well suited to controlling rehabilitation or assistance systems for people suffering from motor impairment.
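The abstract does not detail the feature extraction or classifier behind the four-task discrimination, so the sketch below only shows one common recipe for spontaneous BMIs: log band-power features fed to a linear discriminant classifier. The frequency bands, channel count, and trial lengths are all assumptions.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256  # assumed sampling rate in Hz

def band_power_features(trials, fs=FS, bands=((8, 12), (12, 16), (16, 24))):
    """trials: (n_trials, n_channels, n_samples). Returns log band-power
    features per channel and band, a common choice for mental-task BMIs."""
    freqs, psd = welch(trials, fs=fs, nperseg=fs, axis=-1)
    feats = [np.log(psd[..., (freqs >= lo) & (freqs < hi)].mean(axis=-1))
             for lo, hi in bands]
    return np.concatenate(feats, axis=-1).reshape(len(trials), -1)

# Toy usage with random data standing in for four mental tasks.
rng = np.random.default_rng(0)
X = band_power_features(rng.standard_normal((80, 16, 2 * FS)))
y = np.tile(np.arange(4), 20)                       # interleaved task labels
clf = LinearDiscriminantAnalysis().fit(X[:60], y[:60])
print("held-out accuracy:", clf.score(X[60:], y[60:]))  # ~chance on random data
```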
Computer Methods and Programs in Biomedicine | 2014
Enrique Hortal; Andrés Úbeda; Eduardo Iáñez; José Maria Azorín
In this paper, a non-invasive spontaneous Brain-Machine Interface (BMI) is used to control the movement of a planar robot. To that end, two mental tasks are used to manage the visual interface that controls the robot. The robot used is the PupArm, a force-controlled planar robot designed by the nBio research group at the Miguel Hernández University of Elche (Spain). Two control strategies are compared: hierarchical and directional control. The experimental test, performed by four users, consists of reaching four targets. The errors made and the time taken during the tests are compared for both control strategies. The advantages and disadvantages of each method are shown after the analysis of the results. Hierarchical control allows an accurate approach to the targets but is slower than directional control, which, on the contrary, is less precise. The results show that both strategies are useful to control this planar robot. In the future, by adding an extra device such as a gripper, this BMI could be used in assistive applications such as grasping everyday objects in a realistic environment. To compare the behavior of the system taking into account the opinion of the users, a NASA Task Load Index (TLX) questionnaire is filled out after the two sessions are completed.
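The abstract contrasts the two strategies without detailing the interface, so the sketch below only illustrates the trade-off it reports: hierarchical control spends an extra decision selecting a direction before executing it, while directional control acts immediately on each command. The cycling and turning rules here are illustrative assumptions.

```python
import math

DIRS = ("up", "right", "down", "left")

def hierarchical_step(state, cmd, step=1.0):
    """Task A cycles a highlighted direction, task B executes it.
    Slower (two decisions per move) but each move is deliberate."""
    x, y, sel = state
    if cmd == "A":
        return (x, y, (sel + 1) % len(DIRS))          # move the highlight
    dx, dy = {"up": (0, step), "right": (step, 0),
              "down": (0, -step), "left": (-step, 0)}[DIRS[sel]]
    return (x + dx, y + dy, sel)                      # execute the selection

def directional_step(state, cmd, step=1.0, turn=45.0):
    """Task A turns the heading, task B advances along it.
    Faster, but harder to stop precisely on a target."""
    x, y, heading = state
    if cmd == "A":
        return (x, y, (heading + turn) % 360.0)
    rad = math.radians(heading)
    return (x + step * math.cos(rad), y + step * math.sin(rad), heading)

s = (0.0, 0.0, 0)
for c in "ABB":                     # highlight 'right', then move twice
    s = hierarchical_step(s, c)
print(s)                            # -> (2.0, 0.0, 1)

s2 = (0.0, 0.0, 0.0)
for c in "AB":                      # turn 45 degrees, then advance one step
    s2 = directional_step(s2, c)
print(s2)                           # -> (~0.707, ~0.707, 45.0)
```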
Robotics and Autonomous Systems | 2012
Eduardo Iáñez; Andrés Úbeda; José Maria Azorín; Carlos Perez-Vidal
This paper describes an assistive robot application that combines a portable wireless interface based on electrooculography (EOG) with Radio-frequency Identification (RFID) technology. This assistive application is aimed at users who suffer from a severe motor disability. To that end, a realistic application has been designed: an environment in which users can bring a glass and a water bottle closer using only their eye movements and a real robot arm. RFID is used to support the EOG interface in a shared control architecture by storing information about the objects in tags placed on the scene. Five volunteers tested the assistive robot application. The results show that all of them were able to finish the tests in a suitable time, and that performance improved with practice and training. This proves that the assistive robot application can be a feasible way to help users with severe motor disabilities.
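One way to read the shared-control idea: the EOG command supplies a coarse direction, and the RFID tags supply exact object positions, so the controller can refine the goal on the user's behalf. The tag contents and the snapping rule below are assumptions for illustration, not the paper's architecture.

```python
import math

RFID_TAGS = {                        # tag id -> (object name, (x, y) on the table)
    "tag01": ("glass", (0.40, 0.10)),
    "tag02": ("bottle", (0.55, -0.20)),
}

def shared_control_goal(gripper_xy, eog_direction):
    """Pick the tagged object lying most in the commanded direction,
    scoring each object by the cosine between its offset and the command."""
    ux, uy = {"right": (1, 0), "left": (-1, 0),
              "up": (0, 1), "down": (0, -1)}[eog_direction]
    best, best_score = None, -math.inf
    for name, (ox, oy) in RFID_TAGS.values():
        dx, dy = ox - gripper_xy[0], oy - gripper_xy[1]
        score = (dx * ux + dy * uy) / (math.hypot(dx, dy) + 1e-9)
        if score > best_score:
            best, best_score = (name, (ox, oy)), score
    return best                      # the robot then servos toward this pose

print(shared_control_goal((0.0, 0.0), "right"))   # -> ('glass', (0.4, 0.1))
```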
International Conference of the IEEE Engineering in Medicine and Biology Society | 2011
Eduardo Iáñez; Andrés Úbeda; José Maria Azorín
This paper describes a multimodal interface that combines a Brain-Computer Interface (BCI) with an electrooculography (EOG) interface. The non-invasive spontaneous BCI registers the electrical brain activity through surface electrodes. The EOG interface detects the eye movements through electrodes placed on the face around the eyes. Both kinds of signals are registered together and processed to obtain the mental task the user is thinking of and the eye movement performed by the user. Both commands (mental task and eye movement) are combined in order to move a dot in a graphical user interface (GUI). Several experimental tests have been carried out in which the users perform a trajectory to approach some targets: the user moves the dot in a plane with the EOG interface, while the BCI changes the dot's height.
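A small sketch of how the two command streams might be fused: EOG moves the dot in the plane, and the BCI's mental task changes its height. The command vocabulary follows the abstract; the step sizes and per-cycle update rule are assumptions.

```python
def fuse_commands(dot, eog_cmd=None, mental_task=None, step=0.05):
    """dot: (x, y, z). Apply at most one EOG and one BCI command per cycle;
    None means the corresponding interface produced no command."""
    x, y, z = dot
    planar = {"left": (-step, 0), "right": (step, 0),
              "up": (0, step), "down": (0, -step), None: (0, 0)}
    dx, dy = planar[eog_cmd]
    dz = {None: 0, "task_up": step, "task_down": -step}[mental_task]
    return (x + dx, y + dy, z + dz)

dot = (0.0, 0.0, 0.0)
dot = fuse_commands(dot, eog_cmd="right", mental_task="task_up")
print(dot)   # -> (0.05, 0.0, 0.05)
```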
Robotics and Autonomous Systems | 2013
Andrés Úbeda; Eduardo Iáñez; José Maria Azorín
This paper describes a shared control architecture combining a Brain-Machine Interface (BMI) with Radio-frequency Identification (RFID) technology to control a robot arm in pick-and-place operations. A non-invasive spontaneous BMI capable of distinguishing between three different mental tasks has been designed. Using the BMI, the user can control the robot in order to perform complex actions (e.g. pick-and-place operations). RFID tags have been placed in the experimental setup to give information about the position of the objects in the scene. With this information, the user is able to pick and place the objects with a robot arm by performing simple commands: move left, move right, pick or place, with only the help of the BMI. Four volunteers have successfully controlled the robot arm, and time and accuracy have been measured.
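When three mental tasks map onto discrete robot commands, systems of this kind usually smooth the classifier output so that a single misclassified window does not trigger the arm. The majority-vote rule below is an illustrative assumption, not the authors' decision rule.

```python
from collections import Counter, deque

COMMANDS = {0: "move_left", 1: "move_right", 2: "pick_or_place"}

class SmoothedDecoder:
    """Buffer recent classifier outputs and only emit a robot command
    when one class clearly dominates the window."""
    def __init__(self, window=5, min_votes=4):
        self.buf = deque(maxlen=window)
        self.min_votes = min_votes

    def push(self, predicted_class):
        """Feed one classifier output; return a command when one class
        collects enough votes, else None (no action)."""
        self.buf.append(predicted_class)
        cls, votes = Counter(self.buf).most_common(1)[0]
        return COMMANDS[cls] if votes >= self.min_votes else None

dec = SmoothedDecoder()
for p in (1, 1, 0, 1, 1):           # one stray '0' among 'move_right' votes
    cmd = dec.push(p)
print(cmd)                           # -> 'move_right'
```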
Practical Applications of Agents and Multi-Agent Systems | 2010
José L. Sirvent; José Maria Azorín; Eduardo Iáñez; Andrés Úbeda; Eduardo Fernández
This paper describes the implementation of a Brain-Computer Interface (BCI) for controlling Internet browsing. The system uses electroencephalographic (EEG) signals to control the computer through evoked potentials, following the P300 paradigm. This way, using visual stimuli, the user is able to control Internet navigation via a virtual mouse and keyboard. The system has been developed on the BCI2000 platform. This paper also shows the experimental results obtained by different users.
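Spellers built on the P300 paradigm commonly flash rows and columns of a command matrix and select the cell where the best-scoring row and column intersect. The sketch below illustrates only that selection step; the scores would come from a trained P300 classifier, and the layout of browser commands here is hypothetical.

```python
import numpy as np

def select_cell(row_scores, col_scores, layout):
    """row_scores / col_scores: mean classifier score per flashed row or
    column. Returns the symbol at the most likely intersection."""
    r = int(np.argmax(row_scores))
    c = int(np.argmax(col_scores))
    return layout[r][c]

layout = [["A", "B", "C"],
          ["D", "E", "F"],
          ["BACK", "CLICK", "SPACE"]]     # hypothetical browser commands
print(select_cell([0.1, 0.9, 0.2], [0.3, 0.2, 0.8], layout))  # -> 'F'
```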
Robotics and Autonomous Systems | 2015
Enrique Hortal; Eduardo Iáñez; Andrés Úbeda; Carlos Perez-Vidal; José Maria Azorín
This paper presents a multimodal Human-Machine Interface (HMI) system that combines an electrooculography (EOG) interface and a Brain-Machine Interface (BMI). This multimodal interface has been used to control a robotic arm to perform pick-and-place tasks in a three-dimensional environment. Five volunteers were asked to pick two boxes and place them in different positions. By using the multimodal interface, all the volunteers (even naive users) were able to successfully move two objects within a satisfactory period of time with the help of the robotic arm. The results prove the feasibility of the system in the performance of pick-and-place tasks.
Highlights: A multimodal HMI to perform pick-and-place tasks with a robotic arm is presented. The EOG interface is applied to control planar movements and to operate the gripper. The BMI is used to control the height of the gripper through two mental tasks. The system has been tested by five healthy subjects.
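Given the division of labor the highlights describe (EOG for planar motion and the gripper, BMI for height), a pick-and-place trial can be viewed as a phase sequence in which each interface drives certain phases. The phase names and the event-to-transition mapping below are illustrative assumptions.

```python
PHASES = ("approach_xy", "descend", "grasp", "lift",
          "transport_xy", "lower", "release")

def next_phase(phase, event):
    """EOG events drive the planar and gripper phases; BMI events drive
    the vertical phases. A (source, 'done') event advances the sequence;
    events from the other interface are ignored."""
    eog_phases = {"approach_xy", "grasp", "transport_xy", "release"}
    source = "eog" if phase in eog_phases else "bmi"
    if event == (source, "done"):
        i = PHASES.index(phase)
        return PHASES[i + 1] if i + 1 < len(PHASES) else "finished"
    return phase

phase = "approach_xy"
for ev in [("eog", "done"), ("bmi", "done"), ("eog", "done")]:
    phase = next_phase(phase, ev)
print(phase)                          # -> 'lift'
```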
Neurocomputing | 2013
Andrés Úbeda; Eduardo Iáñez; José Maria Azorín; José María Sabater; Eduardo Fernández
This paper describes a new classification method for a Brain-Computer Interface (BCI) based on the normalized cross-correlation of EEG maps that represent the mental activity of the brain. An optimization protocol has been designed to choose the main parameters of the classifier in order to increase the classification accuracy. This protocol has been tested with the recordings provided by the IDIAP Research Institute for the BCI Competition 2003. Three different mental tasks based on motor imagery are performed in these sessions. The data have been processed and tested with the classifier to obtain the optimal success rate and reliability; to that end, the optimization protocol has been applied to select suitable parameters for the classification. The results are very satisfactory and prove that this kind of classification can be successfully introduced in a real-time BCI.
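The core similarity measure named in the abstract, a normalized cross-correlation between EEG maps, can be sketched as below, with classification by best-matching class template. How the maps themselves are built, and the task labels used here, are assumptions not taken from the abstract.

```python
import numpy as np

def normalized_xcorr(map_a, map_b):
    """Zero-mean, unit-norm correlation in [-1, 1] between two EEG maps
    (treated here as 2-D channel x time arrays)."""
    a = map_a - map_a.mean()
    b = map_b - map_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def classify(test_map, class_templates):
    """Assign the class whose template map correlates best with the test map."""
    return max(class_templates,
               key=lambda c: normalized_xcorr(test_map, class_templates[c]))

# Toy usage: recover a noisy copy of one class template.
rng = np.random.default_rng(0)
templates = {task: rng.standard_normal((8, 64))
             for task in ("task_a", "task_b", "task_c")}   # hypothetical labels
noisy = templates["task_b"] + 0.1 * rng.standard_normal((8, 64))
print(classify(noisy, templates))    # -> 'task_b'
```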