Maher Abujelala
University of Texas at Arlington
Publication
Featured research published by Maher Abujelala.
Pervasive Technologies Related to Assistive Environments | 2016
Maher Abujelala; Cheryl Abellanoza; Aayush Sharma; Fillia Makedon
Previous studies that involve measuring EEG (electroencephalography) have mainly been experimentally driven projects; for instance, EEG has long been used in research to help identify and elucidate our understanding of many neuroscientific, cognitive, and clinical issues (e.g., sleep, seizures, memory). However, advances in technology have made EEG more accessible to the general population, opening up opportunities for EEG to provide information about brain activity in everyday life rather than only in a laboratory setting. To take advantage of these advances, we introduce the Brain-EE system, a method for evaluating a user's engaged enjoyment that uses a commercially available EEG tool (Muse). During testing, fifteen participants engaged in two tasks (playing two different video games via tablet), and their EEG data were recorded. The Brain-EE system supported much of the previous literature on enjoyment: increases in frontal theta activity strongly and reliably predicted which game each individual participant preferred. We hope to develop the Brain-EE system further in order to contribute to a wide variety of applications (e.g., usability testing, clinical or experimental applications, and evaluation methods).
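The paper's key signal, higher frontal theta predicting the preferred game, reduces to a comparison of mean band power. A minimal sketch with hypothetical per-epoch theta power values (the function name and data are illustrative, not the authors' actual pipeline):

```python
def preferred_game(theta_a, theta_b):
    """Pick the game with higher mean frontal theta power.

    theta_a, theta_b: per-epoch frontal theta band powers (arbitrary
    units) recorded while playing game A and game B respectively.
    Higher frontal theta serves as a proxy for engaged enjoyment,
    following the Brain-EE finding.
    """
    mean_a = sum(theta_a) / len(theta_a)
    mean_b = sum(theta_b) / len(theta_b)
    return "A" if mean_a > mean_b else "B"

# Hypothetical values: game B elicits higher frontal theta.
print(preferred_game([4.1, 3.9, 4.3], [5.2, 5.0, 5.5]))  # prints "B"
```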
Pervasive Technologies Related to Assistive Environments | 2015
Maher Abujelala; Alexandros Lioulemes; Paul Sassaman; Fillia Makedon
The estimation of human arm forces is an important factor in physical therapy, especially in robot-aided physical therapy. Force measurements reveal the rehabilitation progress of patients with poor upper-extremity motor function. In this work, we recorded and analysed the upper-arm forces of patients while they executed upper-extremity exercises. Our analysis of these forces allows the robot to identify the patient's motion capability and apply active motion control to the patient's arm when the arm forces deviate from the desired motion set up by the therapist.
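The control decision described, switching to active assistance when measured arm forces deviate from the therapist's desired profile, can be sketched as a threshold check. The function name and tolerance are placeholders, not the paper's controller parameters:

```python
def needs_active_assist(measured_force, desired_force, tolerance=5.0):
    """Return True when the patient's arm force (in newtons) deviates
    from the therapist-defined desired force by more than `tolerance`.
    The 5 N default is illustrative, not a clinical value."""
    return abs(measured_force - desired_force) > tolerance

print(needs_active_assist(12.0, 20.0))  # deviation 8 N > 5 N -> True
print(needs_active_assist(18.0, 20.0))  # deviation 2 N -> False
```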
International Conference on Virtual, Augmented and Mixed Reality | 2017
Ashwin Ramesh Babu; Akilesh Rajavenkatanarayanan; Maher Abujelala; Fillia Makedon
Extensive research has been carried out in using computer-based techniques to train and prepare workers for various industry positions. Most of this research focuses on how to best enable the workers to perform a type of task safely and efficiently. In fact, many of the accidents in manufacturing and construction environments are due to the lack of proper training needed for employees. In this study, we compare the impact of three types of training approaches on the planning and problem-solving abilities of a trainee while he/she performs the Towers of Hanoi (TOH) task. The three approaches are (a) traditional (with a human trainer), (b) gamification (game-based training simulation), and (c) computer-aided training. The aim of this study is to evaluate a worker’s level of functioning and problem-solving skills based on a specific training approach. Exact assessment of functional capacities is an important prerequisite to ensure effective and personalized training. The study uses workplace simulation to collect different types of performance data and assess the impact of these training approaches.
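For reference, the Towers of Hanoi task the trainees perform has a standard recursive solution requiring 2^n - 1 moves for n disks. This sketch is independent of the paper's training software:

```python
def hanoi(n, source, target, spare, moves=None):
    """Return the optimal move sequence for n disks as (from, to) pairs."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, source, spare, target, moves)  # clear the way
        moves.append((source, target))              # move the largest disk
        hanoi(n - 1, spare, target, source, moves)  # restack on top of it
    return moves

solution = hanoi(3, "A", "C", "B")
print(len(solution))  # 2**3 - 1 = 7 moves
```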
IEEE Symposium Series on Computational Intelligence | 2016
Konstantinos Tsiakas; Maher Abujelala; Alexandros Lioulemes; Fillia Makedon
In this paper, we propose an Interactive Learning and Adaptation framework for Human-Robot Interaction in a vocational setting. We show how Interactive Reinforcement Learning (RL) techniques can be applied to such HRI applications in order to promote effective interaction. We present the framework through two different use cases in a vocational setting. In the first use case, the robot acts as a trainer, assisting the user while the user solves the Towers of Hanoi problem. In the second use case, a robot and a human operator collaborate towards solving a synergistic construction or assembly task. We show how RL is used in the proposed framework and discuss its effectiveness in the two vocational use cases: Robot Assisted Training and Human-Robot Collaboration.
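As an illustration of the kind of tabular update such a framework could build on, here is generic Q-learning, not the authors' exact interactive variant; the states and actions below are hypothetical:

```python
# Hypothetical training states and robot actions for the trainer use case.
Q = {"struggling":  {"give_hint": 0.0, "wait": 0.0},
     "progressing": {"give_hint": 0.0, "wait": 0.0}}

def q_update(Q, s, a, reward, s_next, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step: move Q[s][a] toward the observed
    reward plus the discounted best value of the next state."""
    best_next = max(Q[s_next].values())
    Q[s][a] += alpha * (reward + gamma * best_next - Q[s][a])

# A hint that helped the user earns a positive reward.
q_update(Q, "struggling", "give_hint", 1.0, "progressing")
print(round(Q["struggling"]["give_hint"], 3))  # 0.1
```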
International Conference on Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management | 2017
Varun Kanal; Maher Abujelala; Srujana Gattupalli; Vassilis Athitsos; Fillia Makedon
This paper describes the APSEN system, a pre-screening tool for detecting sleep apnea in a home environment. The system was designed and evaluated in two parts: apnea detection using SpO2 and posture detection using IR images. The two parts can work together or independently. During the preliminary study, the apnea detection algorithm was evaluated using an online database, and the appropriate algorithms for detecting sleep posture were determined. In the overnight study, both subsystems were tested on 10 subjects. The average accuracy of the apnea detection algorithm was 71.51% for apnea conditions and 98.68% for normal conditions. For the posture detection algorithms, the average accuracies during the overnight study were 74.91% and 89.71% for SVM and CNN, respectively. The results presented in the paper indicate that the APSEN system could be used to detect apnea and postural apnea in a home environment.
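SpO2-based apnea screening typically counts oxygen-desaturation events. A minimal sketch of that idea, with an illustrative baseline and threshold rather than APSEN's published algorithm:

```python
def count_desaturations(spo2, baseline=96.0, drop=3.0):
    """Count contiguous runs where SpO2 (%) falls at least `drop`
    points below `baseline` -- a common screening heuristic."""
    events, in_event = 0, False
    for v in spo2:
        if v <= baseline - drop:
            if not in_event:
                events, in_event = events + 1, True
        else:
            in_event = False
    return events

# Two separate dips to 93% or below in this hypothetical trace.
print(count_desaturations([96, 95, 92, 91, 95, 96, 92, 96]))  # 2
```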
Pervasive Technologies Related to Assistive Environments | 2016
Alexandros Lioulemes; Michalis Papakostas; Shawn N. Gieser; Theodora Toutountzi; Maher Abujelala; Sanika Gupta; Christopher Collander; Christopher McMurrough; Fillia Makedon
In this paper, we present a survey of emerging technologies for non-invasive human activity, behavior, and physiological sensing. The survey focuses on technologies that are close to entering the commercial market, or have only recently become available. We intend for this survey to give researchers in any field relevant to human data collection an overview of currently accessible devices and sensing modalities, their capabilities, and how the technologies will mature with time.
Pervasive Technologies Related to Assistive Environments | 2018
Michalis Papakostas; Konstantinos Tsiakas; Maher Abujelala; Morris D. Bell; Fillia Makedon
Recent research has shown that hundreds of millions of workers worldwide may lose their jobs to robots and automation by 2030, impacting over 40 developed and emerging countries and affecting more than 800 types of jobs. While automation promises to increase productivity and relieve workers from tedious or heavy-duty tasks, it can also widen the gap, leaving behind workers who lack automation training. In this project, we propose to build a technologically based, personalized vocational cyberlearning training system in which the user is assessed while immersed in a simulated workplace/factory task environment, while the system collects and analyzes multisensory cognitive, behavioral, and physiological data. Such a system will produce recommendations to support targeted vocational training decision-making. The focus is on specific neurocognitive functions that include working memory, attention, cognitive overload, and cognitive flexibility. Collected data are analyzed to reveal, in an iterative fashion, relationships between physiological and cognitive performance metrics, and how these relate to work-related behavioral patterns that require special vocational training.
Pervasive Technologies Related to Assistive Environments | 2018
Maher Abujelala; Sanika Gupta; Fillia Makedon
In recent years, robots have been replacing human workers in many industries. As robot manufacturing becomes more optimized and cost-effective, the use of robots will expand into various fields. Thus, it is essential to prepare workers to operate robots and work collaboratively with them side by side. In this paper, we propose a human-robot collaborative task to assess human performance in an experimental industrial setup. The task is designed in both virtual and physical environments.
International Conference on Learning and Collaboration Technologies | 2018
Konstantinos Tsiakas; Maher Abujelala; Akilesh Rajavenkatanarayanan; Fillia Makedon
Robot Assisted Training (RAT) systems have been successfully deployed to provide assistance during a training task, promoting an efficient interaction with the user. Personalization can improve the efficiency of the interaction and thus enhance the effects of the training session. Personalization can be achieved through user skill assessment in order to choose an appropriate robot behavior that matches user abilities and needs. Graphical User Interfaces have been used to enable human supervisors to control robots and guide the interaction in RAT-based systems. This work focuses on how such interfaces can be used to enable human supervisor users (e.g., therapists) to assess user skills during a robot-based cognitive task. In this study, we investigate how different visualization features affect decision making and efficiency, towards the design of an intelligent and informative interface.
Pervasive Technologies Related to Assistive Environments | 2017
Karthikeyan Rajamani; Adhavann Ramalingam; Srinivas Bavisetti; Maher Abujelala
Brain waves indicate the cognitive state of a person. Relatively high right frontal alpha activity compared to the left indicates an active cognitive state that is relaxed and alert, conducive to cognition. A person's engagement in an activity can be measured by the ratio of beta power to alpha and theta power. In our experiment, we aimed to increase certain cognitive activities by combining audio-visual brain entrainment, neurofeedback gaming, and an asymmetry protocol. Our hypothesis is that brain entrainment and neurofeedback can enhance a person's cognitive abilities. We conducted the experiment with six participants. During the experiment, the brain activity of the users was logged with an EEG headband and used to measure engagement and to provide feedback to the game.
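The engagement ratio the abstract cites, beta over alpha plus theta, is a widely used EEG engagement index. A minimal computation over hypothetical band powers (the inputs are made up for illustration):

```python
def engagement_index(beta, alpha, theta):
    """EEG engagement index: beta power divided by the sum of alpha
    and theta power. Inputs are band powers in arbitrary units."""
    return beta / (alpha + theta)

# Hypothetical band powers from a headband during gameplay.
print(engagement_index(10.0, 4.0, 6.0))  # 1.0
```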