Publication


Featured research published by Ericka Janet Rechy-Ramirez.


Robotics and Biomimetics | 2012

Head movements based control of an intelligent wheelchair in an indoor environment

Ericka Janet Rechy-Ramirez; Huosheng Hu; Klaus D. McDonald-Maier

This paper presents a user-friendly human machine interface (HMI) for hands-free control of an electric powered wheelchair (EPW). Its two operation modes are based on head movements: Mode 1 uses only one head movement to give the commands, and Mode 2 employs four head movements. An EEG device, namely the Emotiv EPOC, has been deployed in this HMI to obtain the head movement information of users. The proposed HMI is compared with the joystick control of an EPW in an indoor environment. The experimental results show that Control Mode 2 can be operated quickly and reliably, achieving a mean time of 67.90 seconds for the two subjects. However, Control Mode 1 has inferior performance, achieving a mean time of 153.20 seconds for the two subjects, even though it requires only one head movement. It is clear that the proposed HMI can effectively replace the traditional joystick control for disabled and elderly people.
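The abstract does not give implementation details, but the idea behind Mode 2, thresholding gyroscope readings to recognize four head movements and map them to wheelchair commands, can be illustrated with a minimal sketch. The axis conventions, threshold value, and command names below are assumptions for illustration, not the paper's code or the Emotiv API:

    def classify_head_movement(gyro_x, gyro_y, threshold=40.0):
        # Map one gyroscope sample to a command; axis signs and the
        # threshold are illustrative assumptions, not the paper's values.
        if gyro_y > threshold:
            return "FORWARD"      # nod up
        if gyro_y < -threshold:
            return "STOP"         # nod down
        if gyro_x > threshold:
            return "TURN_RIGHT"   # head turned right
        if gyro_x < -threshold:
            return "TURN_LEFT"    # head turned left
        return None               # no deliberate movement detected

For example, classify_head_movement(5.0, 55.0) would return "FORWARD" under these assumed conventions.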


International Journal of Artificial Life Research | 2014

A Flexible Bio-Signal Based HMI for Hands-Free Control of an Electric Powered Wheelchair

Ericka Janet Rechy-Ramirez; Huosheng Hu

This paper presents a bio-signal based human machine interface (HMI) for hands-free control of an electric powered wheelchair. In this novel HMI, an Emotiv EPOC sensor is deployed to detect facial expressions and head movements of users, which are then recognized and converted into four uni-modal control modes and two bi-modal control modes to operate the wheelchair. Nine facial expressions and up-down head movements have been defined and tested, so that users can select some of these facial expressions and head movements to form the six control commands. The proposed HMI is user-friendly and allows users to select one of the available control modes according to their comfort. Experiments are conducted to show the feasibility and performance of the proposed HMI.
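A defining feature of this HMI is that the mapping from detected events to commands is not fixed: users bind their most comfortable expressions and head movements to the six commands. A minimal sketch of that idea, with hypothetical event and command names (not Emotiv identifiers):

    # Hypothetical user-chosen bindings from detected events to the six
    # wheelchair commands; any recognized expression or head movement
    # could be bound instead, which is what makes the interface flexible.
    user_bindings = {
        "smile": "FORWARD",
        "clench": "BACKWARD",
        "raise_brow": "STOP",
        "head_up": "FASTER",
        "look_left": "TURN_LEFT",
        "look_right": "TURN_RIGHT",
    }

    def command_for(event):
        return user_bindings.get(event)  # None if the event is unbound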


15th International Conference on Computing | 2006

A Model and Language for Bitemporal Schema Versioning in Data Warehouses

Ericka Janet Rechy-Ramirez; Edgard Benítez-Guerrero

A data warehouse (DW) is a vast collection of historical data built to support multidimensional data analysis applications. In this context, an important problem is that of evolving the implementation (multidimensional, relational) schema of a DW to incorporate new requirements. This paper introduces a conceptual evolution model based on bitemporal versioning of multidimensional schemas, which allows one to modify the DW schema (a) in an implementation-independent manner, and (b) without affecting the operation of existing applications. It also presents an SQL-like language associated with this model, which offers expressions to create and change versions of multidimensional schemas.
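The core of the bitemporal model is that every schema version carries two independent time intervals: valid time (when the schema holds in the modeled reality) and transaction time (when that version was recorded in the warehouse). A minimal sketch of storing and querying such versions, with invented field and function names (the paper's actual SQL-like language is not reproduced here):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class SchemaVersion:
        name: str          # multidimensional schema identifier
        definition: str    # cubes, dimensions and measures (opaque here)
        valid_from: date   # valid time: when the schema starts to hold
        valid_to: date     # valid time: when it stops holding
        tx_from: date      # transaction time: when the version was stored
        tx_to: date        # transaction time: when it was superseded

    def version_at(versions, valid_time, tx_time):
        # Return the version in force at `valid_time`, as the warehouse
        # knew it at `tx_time`; existing applications keep working by
        # querying with the transaction time they were built against.
        for v in versions:
            if (v.valid_from <= valid_time < v.valid_to
                    and v.tx_from <= tx_time < v.tx_to):
                return v
        return None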


International Conference on Emerging Security Technologies | 2013

Bi-modal Human Machine Interface for Controlling an Intelligent Wheelchair

Ericka Janet Rechy-Ramirez; Huosheng Hu

This paper presents a bi-modal human machine interface (HMI) alternative for hands-free control of an electric powered wheelchair (EPW) by means of head movements and facial expressions. The head movements and the facial expressions are detected by using the gyroscope and the Cognitiv suite of the Emotiv EPOC sensor, respectively. By employing the Cognitiv suite, the user can choose his/her most comfortable facial expressions. Three head movements are used to stop the wheelchair and display the turning commands in the graphical interface (GI) of the HMI, while two facial expressions are employed to move the wheelchair forward and to confirm the execution of the turning command displayed on the GI. In this way, the user can turn his/her head freely while controlling the wheelchair, without executing an undesired command. Two subjects have tested the proposed HMI by operating a wheelchair in an indoor environment. Furthermore, five facial expressions have been tested in order to verify that users can employ different facial expressions to execute the control commands on the wheelchair. The preliminary experiments reveal that our HMI is reliable for operating the wheelchair.
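The confirm-then-execute protocol described above (head movements display a turning command on the GI, one expression confirms it, the other drives forward) is essentially a small state machine. A minimal sketch with hypothetical event names, not the paper's identifiers:

    def step(displayed, event):
        # One transition of the confirm-then-execute protocol; returns
        # (command now displayed on the GI, command to execute, if any).
        if event == "head_left":
            return "TURN_LEFT", None       # display, await confirmation
        if event == "head_right":
            return "TURN_RIGHT", None
        if event == "head_up":
            return None, "STOP"            # stopping needs no confirmation
        if event == "expr_confirm" and displayed is not None:
            return None, displayed         # execute the displayed turn
        if event == "expr_forward":
            return displayed, "FORWARD"
        return displayed, None             # ignore anything else

Because turning only executes after an explicit confirmation event, stray head movements cannot trigger an undesired turn, which is the safety property the abstract emphasizes.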


International Journal of Biomechatronics and Biomedical Robotics | 2014

Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair

Ericka Janet Rechy-Ramirez; Huosheng Hu

This paper presents a human machine interface (HMI) for hands-free control of an electric powered wheelchair (EPW) based on head movements and facial expressions, detected by using the gyroscope and the 'Cognitiv suite' of an Emotiv EPOC device, respectively. The proposed HMI provides two control modes: 1) control mode 1 uses four head movements to display in its graphical user interface the control commands that the user wants to execute, and one facial expression to confirm the execution; 2) control mode 2 employs two facial expressions for turning and forward motion, and one head movement for stopping the wheelchair. Therefore, both control modes offer hands-free control of the wheelchair. Two subjects have used the two control modes to operate a wheelchair in an indoor environment. Five facial expressions have been tested in order to determine whether users can employ different facial expressions for executing the commands. The experimental results show that the proposed HMI is reliable for operating the wheelchair safely.


Ambient Intelligence | 2018

Impact of commercial sensors in human computer interaction: a review

Ericka Janet Rechy-Ramirez; Antonio Marin-Hernandez; Homero Vladimir Rios-Figueroa

Nowadays, the communication gap between humans and computers might be reduced thanks to the multimodal sensors available on the market. Therefore, it is important to know the specifications of these sensors and how they are being used in order to create human computer interfaces that tackle complex tasks. The purpose of this paper is to review recent research regarding the up-to-date application areas of the following sensors: (1) the Emotiv sensor, which identifies emotions, facial expressions, thoughts, and head movements of users through electroencephalography signals; (2) the Leap Motion controller, which recognizes hand and arm movements via vision techniques; (3) the Myo armband, which identifies hand and arm movements using electromyography signals and inertial sensors; and (4) the Oculus Rift, which provides users with immersion into virtual reality. The application areas discussed in this manuscript range from assistive technology to virtual tours. Finally, a brief discussion regarding the advantages and shortcomings of each sensor is presented.


Mexican International Conference on Artificial Intelligence | 2008

Comparing Three Simulated Strategies for Cancer Monitoring with Nanorobots

C. A. Piña-García; Ericka Janet Rechy-Ramirez; V. Angélica García-Vega

The use of nanorobots in medical applications, specifically cancer treatment, is a serious alternative for preventing this disease. Locating chemical sources and tracking them over time are tasks that nanorobots are ideal candidates to accomplish. We present a multiagent simulation of three bio-inspired strategies for finding targets in fluid environments, including diverse conditions such as noisy sensors, interference between agents, and obstacles generated by the environment itself. In addition, we present a comparative analysis of the three strategies. The results show that nanorobotics used in cancer therapy needs to explore an extensive range of blind searching techniques without communication.
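One family of bio-inspired blind-search strategies of the kind compared in this paper is run-and-tumble (chemotaxis-like) search: keep heading while the noisy chemical reading improves, turn randomly when it worsens. A minimal sketch of that strategy under assumed conditions (grid world, concentration decaying with distance, Gaussian sensor noise); this is not the paper's simulation:

    import random

    def noisy_reading(pos, source, noise=0.2):
        # Concentration decays with Manhattan distance; sensor is noisy.
        d = abs(pos[0] - source[0]) + abs(pos[1] - source[1])
        return 1.0 / (1.0 + d) + random.gauss(0.0, noise)

    def run_and_tumble(source=(20, 20), steps=500):
        # Keep moving while readings improve; tumble when they worsen.
        headings = [(0, 1), (0, -1), (1, 0), (-1, 0)]
        pos, heading = (0, 0), random.choice(headings)
        last = noisy_reading(pos, source)
        for _ in range(steps):
            pos = (pos[0] + heading[0], pos[1] + heading[1])
            if pos == source:
                return True                        # target found
            now = noisy_reading(pos, source)
            if now < last:
                heading = random.choice(headings)  # tumble
            last = now
        return False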


Advanced Robotics | 2018

Predicting collisions: time-to-contact forecasting based on probabilistic segmentation and system identification

Ángel Juan Sánchez-García; Homero Vladimir Rios-Figueroa; Hugues Garnier; Gustavo Quintana-Carapia; Ericka Janet Rechy-Ramirez; Antonio Marin-Hernandez

The time-to-contact (TTC) estimate is mainly used in robot navigation to detect potential danger from obstacles in the environment. A key aspect of a robotic system is to perform its tasks promptly. Several approaches have been proposed to estimate a reliable TTC in order to avoid collisions in real time; nevertheless, they are time-consuming because scene characteristics must be computed in every frame. This paper presents an approach to estimate TTC using monocular vision, based on the size change of the obstacles over time, so that the robotic system may react promptly to its environment. Our approach collects information from a few observations of an obstacle, then the behavior of the movement is identified through an online recursive modeling process, and finally a forecast of the upcoming positions is computed. We segment the obstacles using probabilistic hidden Markov chains. Our proposal is compared to a classical color segmentation approach using two real image sequences, each composed of 210 frames. Our results show that our proposal obtains smoother segmentations than a traditional color-based approach.
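The size-change idea rests on a classical relation: for an object approaching at roughly constant speed, TTC is approximately s / (ds/dt), where s is the object's size in the image. A minimal sketch under the assumption that segmentation yields one bounding-box width per frame; the paper's recursive system-identification and forecasting stages are not reproduced here:

    def ttc_from_sizes(sizes, fps=30.0):
        # Estimate time-to-contact from the last two image sizes of an
        # obstacle (e.g., bounding-box widths in pixels) via s / (ds/dt).
        s_prev, s_now = sizes[-2], sizes[-1]
        ds_dt = (s_now - s_prev) * fps   # size change per second
        if ds_dt <= 0:
            return float("inf")          # not approaching
        return s_now / ds_dt             # seconds until contact

    # Example: a width growing from 100 to 104 px between consecutive
    # frames at 30 fps gives TTC = 104 / (4 * 30), about 0.87 s.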


The Visual Computer | 2017

A human–computer interface for wrist rehabilitation: a pilot study using commercial sensors to detect wrist movements

Ericka Janet Rechy-Ramirez; Antonio Marin-Hernandez; Homero Vladimir Rios-Figueroa

Health conditions might cause muscle weakness and immobility in some body parts; hence, physiotherapy exercises play a key role in rehabilitation. To improve engagement during the rehabilitation process, we therefore propose a human–computer interface (serious game) in which five wrist movements (extension, flexion, pronation, supination and neutral) are detected via two commercial sensors (Leap Motion controller and Myo armband). The Leap Motion controller provides data regarding the positions of the user's finger phalanges through two infrared cameras, while the Myo armband provides electromyography signals and the inertial motion of the user's arm through its electrodes and inertial measurement unit. The main aim of this study is to explore the performance of these sensors on wrist movement recognition in terms of accuracy, sensitivity and specificity. Eight healthy participants played a proposed game 5 times with each sensor in one session. Both sensors reported over 85% average recognition accuracy on the five wrist movements. Based on the t-test and the Wilcoxon signed-rank test, early results show that there were significant differences between the Leap Motion controller and Myo armband recognitions in terms of average sensitivities on extension.
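The reported comparison uses per-movement accuracy, sensitivity, and specificity, which for a multi-class recognizer are computed one-vs-rest from the confusion counts. A minimal sketch of that computation (not the study's analysis code):

    def per_class_metrics(y_true, y_pred, label):
        # One-vs-rest sensitivity and specificity for one wrist movement,
        # given parallel lists of true and predicted movement labels.
        pairs = list(zip(y_true, y_pred))
        tp = sum(t == label and p == label for t, p in pairs)
        fn = sum(t == label and p != label for t, p in pairs)
        fp = sum(t != label and p == label for t, p in pairs)
        tn = sum(t != label and p != label for t, p in pairs)
        sensitivity = tp / (tp + fn) if tp + fn else 0.0
        specificity = tn / (tn + fp) if tn + fp else 0.0
        return sensitivity, specificity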


Archive | 2014

Flexible Bi-modal Control Modes for Hands-Free Operation of a Wheelchair by Head Movements and Facial Expressions

Ericka Janet Rechy-Ramirez; Huosheng Hu

Collaboration


Dive into Ericka Janet Rechy-Ramirez's collaborations.

Top Co-Authors

Ana Luisa Solis Gonzalez-Cosio

National Autonomous University of Mexico
