Publication


Featured research published by Mohamed Abdur Rahman.


International Conference on Multimedia Retrieval | 2013

Multimedia interactive therapy environment for children having physical disabilities

Mohamed Abdur Rahman; Ahmad M. Qamar; Mohamed Ahmed; M. Ataur Rahman; Saleh M. Basalamah

In this paper, we present an interactive multimedia environment that can be used to effectively complement the role of a therapist in the process of rehabilitation for disabled children. We use a Microsoft Kinect 3D depth-sensing camera with the online Second Life virtual world to record rehabilitation exercises performed by a physiotherapist or a disabled child. The exercise session can be played synchronously in Second Life, and the physical activities of the users are synchronized with their virtual counterparts in Second Life. The exercises can also be recorded and made available for download to facilitate offline playback. A disabled child can follow the exercise at home in the absence of the therapist, since the system can provide visual guidance for performing the exercise in the right manner. Using the proposed system, parents at home can also assist the disabled child in performing therapy sessions in the absence of a therapist. Following the suggestions of therapists, the developed prototype can track several gestures of children who have mobility problems. Using a single Kinect device, we can capture high-resolution joint movements of the body without the need for any complicated hardware setup. The initial joint-based angular measurements show the promising potential of our prototype for deployment in real physiotherapy sessions.
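
As an illustration of the joint-based angular measurements mentioned above, the sketch below computes the angle at a joint from three 3D positions, as a Kinect-style skeleton stream provides per frame. The function and the example coordinates are hypothetical, not taken from the paper.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle (degrees) at `joint` formed by the segments to `parent` and `child`.

    Each argument is a 3D position, e.g. shoulder, elbow and wrist coordinates
    taken from one frame of a Kinect skeleton stream.
    """
    u = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Example frame: shoulder, elbow, wrist positions in metres.
print(joint_angle((0.0, 1.4, 0.0), (0.0, 1.1, 0.1), (0.2, 1.0, 0.3)))
```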


International Conference on Multimedia Retrieval | 2015

A Multi-Sensory Gesture-Based Occupational Therapy Environment for Controlling Home Appliances

Ahmad M. Qamar; Ahmed Riaz Khan; Syed Osama Husain; Mohamed Abdur Rahman; Saleh Basalamah

The proliferation of networked home appliances, coupled with the popularity of low-cost gesture detection sensors, has made it possible to create smart home environments where users can manipulate devices of daily use through gestures. In this paper, we present a multi-sensory environment that allows a disabled person to control actual devices around the house, presented to her in a mixed reality environment, as part of occupational therapy exercises. An analytical engine processes the motion data collected by gesture recognition sensors, measures recovery metrics such as joint range of motion and speed of movement, and presents the results to the therapist or the patient in the form of graphs and exercise statistics. We have incorporated intuitive 2D and 3D user interfaces into the environment to make the therapy experience more engaging and immersive for disabled people. From the multimedia occupational therapy exercise data analysis, a therapist can determine the ability of a patient to perform daily life activities without the help of others.
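
The recovery metrics the analytical engine reports can be illustrated with a minimal sketch. It assumes the motion data has already been reduced to per-frame joint angles with timestamps; the input format and field names are assumptions, not the paper's.

```python
def recovery_metrics(timestamps, angles):
    """Compute simple recovery indicators from one exercise session.

    `timestamps` are in seconds and `angles` are joint angles in degrees,
    one sample per captured frame (a hypothetical input format).
    """
    rom = max(angles) - min(angles)                      # range of motion covered
    duration = timestamps[-1] - timestamps[0]            # session length
    speeds = [abs(angles[i + 1] - angles[i]) / (timestamps[i + 1] - timestamps[i])
              for i in range(len(angles) - 1)]           # per-frame angular speed
    return {"rom_deg": rom,
            "mean_speed_deg_per_s": sum(speeds) / len(speeds),
            "duration_s": duration}

# One short elbow-flexion repetition, sampled four times.
print(recovery_metrics([0.0, 0.5, 1.0, 1.5], [10.0, 45.0, 80.0, 60.0]))
```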


International Conference on Intelligent Systems, Modelling and Simulation | 2014

Adding Inverse Kinematics for Providing Live Feedback in a Serious Game-Based Rehabilitation System

Ahmad M. Qamar; Mohamed Abdur Rahman; Saleh M. Basalamah

In this paper, we present a serious game-based framework for providing live feedback to a patient performing rehabilitation therapy, with the aim of guiding the patient through the correct therapeutic steps and obtaining high-quality therapy data for further analysis. The game environment uses forward kinematics to receive the live sensory data from two 3D motion-tracking sensors and uses inverse kinematics to analyze the sensory data stream in real time. A subject performs a rehabilitation therapy prescribed by the physician, and using both forward and inverse kinematics, the system validates the angular and rotational positions of the joints with respect to the correct therapeutic posture and provides live feedback to the subject. As a proof of concept, we have developed an open-source web-based framework that can be easily adopted for in-home therapy without the assistance of a therapist. Finally, we share our initial test results, which are encouraging.
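
A minimal sketch of the live-feedback step, assuming the inverse-kinematics stage has already produced per-joint angles: measured angles are compared against the prescribed posture within a tolerance. The joint names, the tolerance value, and the feedback strings are illustrative only.

```python
def posture_feedback(measured, reference, tolerance_deg=10.0):
    """Compare measured joint angles against the prescribed therapeutic posture.

    `measured` and `reference` are dicts of joint name -> angle in degrees
    (a hypothetical representation of the inverse-kinematics output).
    Returns per-joint feedback that could drive on-screen guidance.
    """
    feedback = {}
    for joint, target in reference.items():
        error = measured.get(joint, 0.0) - target
        if abs(error) <= tolerance_deg:
            feedback[joint] = "ok"
        else:
            feedback[joint] = f"adjust by {-error:+.1f} deg"
    return feedback

print(posture_feedback({"elbow": 95.0, "shoulder": 60.0},
                       {"elbow": 90.0, "shoulder": 80.0}))
```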


Proceedings of the Third ACM SIGSPATIAL International Workshop on the Use of GIS in Public Health | 2014

A GIS-based serious game recommender for online physical therapy

Imad Afyouni; Faizan Ur Rehman; Ahmad M. Qamar; Akhlaq Ahmad; Mohamed Abdur Rahman; Saleh M. Basalamah

As human-centered interactive technologies, serious games are gaining popularity in a variety of fields such as training simulations, health, national defense, and education. To build the best learning experience when designing a serious game, a system requires the integration of accurate spatio-temporal information. There is also an increasing need for intelligent medical technologies that enable patients to live independently at home. This paper introduces a novel e-Health framework that leverages GIS-based serious games for people with disabilities. The framework consists of a spatial map-browsing environment augmented with our newly introduced multi-sensory Natural User Interface. We propose a comprehensive architecture that includes a sensory data manager, a storage layer, an information processing and computational intelligence layer, and a user interface layer. Detailed mathematical modeling as well as a mapping methodology to convert different therapy-based hand gestures into navigational movements within the serious game environment are also presented. Moreover, an Intelligent Game Recommender has been developed for generating optimized navigational routes based on therapeutic gestures. Motion data is stored in a repository throughout the different sessions for offline replaying and advanced analysis, and different indicators are displayed live. The framework has been tested with Nokia, Google, ESRI, and other maps, whereby a subject can visualize and browse 2D and 3D maps of the world through therapy-based gestures. To the best of our knowledge, this is the first GIS-based game recommender framework for online physical therapy. The prototype has been deployed at a disability center. The obtained results and feedback from therapists and patients are very encouraging.
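
The mapping from therapy-based hand gestures to navigational movements could look like the following sketch. The gesture vocabulary and the pan/zoom commands are hypothetical placeholders, not the paper's actual mapping.

```python
# Hypothetical mapping of therapy-based hand gestures to map navigation
# commands; the gesture names and commands are illustrative only.
GESTURE_TO_COMMAND = {
    "wrist_flexion":    ("pan", (0, -1)),   # move the viewport down
    "wrist_extension":  ("pan", (0, +1)),   # move the viewport up
    "radial_deviation": ("pan", (-1, 0)),   # move the viewport left
    "ulnar_deviation":  ("pan", (+1, 0)),   # move the viewport right
    "fist":             ("zoom", -1),       # zoom out
    "open_palm":        ("zoom", +1),       # zoom in
}

def to_navigation(gesture_stream):
    """Translate a recognised gesture sequence into map navigation commands."""
    return [GESTURE_TO_COMMAND[g] for g in gesture_stream if g in GESTURE_TO_COMMAND]

print(to_navigation(["open_palm", "wrist_flexion", "fist"]))
```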


User Modeling and User-Adapted Interaction | 2017

A therapy-driven gamification framework for hand rehabilitation

Imad Afyouni; Faizan Ur Rehman; Ahmad M. Qamar; Sohaib Ghani; Syed Osama Hussain; Bilal Sadiq; Mohamed Abdur Rahman; Abdullah Murad; Saleh M. Basalamah

Rehabilitative therapy is usually very expensive and confined to specialized rehabilitation centers or hospitals, leading to slower recovery times for the corresponding patients. There is therefore a high demand for technology-based personalized solutions that guide and encourage patients to perform an online rehabilitation program and help them live independently at home. This paper introduces an innovative e-health framework that develops adaptive serious games for people with hand disabilities. The aim of this work is to provide a patient-adaptive environment for the gamification of hand therapies in order to facilitate and encourage rehabilitation. Theoretical foundations (i.e., therapy and patient models) and algorithms to match therapy-based hand gestures to navigational movements in 3D space within the serious game environment have been developed. A novel game generation module is introduced, which translates those movements into a 3D therapy-driven route on a real-world map with different levels of difficulty based on the patient's profile and capabilities. To enrich the user navigation experience, a 3D spatio-temporal validation region is also generated, which tracks and adjusts the patient's movements throughout the session. The gaming environment also creates and adds semantics to different types of attractive and repellent objects in space depending on the difficulty level of the game. Relevant benchmarks to assess the patient's interaction with the environment, along with usability and performance testing of our framework, are introduced to ensure quantitative as well as qualitative improvements. Trial tests were conducted in one disability center with a total of five subjects having hand motor control problems, who used our gamified physiotherapy solution to help us measure usability and user satisfaction levels. The obtained results and feedback from therapists and patients are very encouraging.
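
A minimal sketch of how the 3D spatio-temporal validation region might be checked during a session: a tracked position counts as valid if it is close enough to the active route waypoint within the allotted time window. All names and thresholds are assumptions for illustration.

```python
import math

def inside_validation_region(position, waypoint, radius_m, t, window_s):
    """Check whether the patient's tracked hand position is within the
    spatio-temporal validation region around the next route waypoint.

    `position` and `waypoint` are (x, y, z) tuples; `t` is seconds since
    the waypoint became active. The thresholds are illustrative values.
    """
    in_space = math.dist(position, waypoint) <= radius_m
    in_time = 0.0 <= t <= window_s
    return in_space and in_time

print(inside_validation_region((1.0, 0.2, 0.0), (1.1, 0.0, 0.0),
                               radius_m=0.3, t=2.5, window_s=5.0))
```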


International Conference on Cyber-Physical Systems | 2016

Demo Abstract: Gesture-Based Cyber-Physical In-Home Therapy System in a Big Data Environment

Mohamed Abdur Rahman

This demo provides an overview of a gesture-based cyber-physical therapy system, which integrates entities in the physical as well as the cyber world for therapy sensing, therapeutic data computation, interaction between the cyber and physical worlds, and holistic in-home therapy support through a cloud-based big data architecture. To provide appropriate therapeutic services and environments, the CPS uses a multi-modal multimedia sensory framework to support recording and playback of a therapy session and visualization of the effectiveness of an assigned therapy. The physical world's interaction with the cyber world is stored as rich gesture semantics with the help of multiple media streams, which are then uploaded to a tightly synchronized cyber-physical cloud environment for deducing real-time and historical whole-body Range of Motion (ROM) kinematic data.


International Conference on Cyber-Physical Systems | 2016

Demo Abstract: HajjCPS - A Cyber Physical Environment for Providing Location-Aware Services to a Very Large Crowd

Mohamed Abdur Rahman; Akhlaq Ahmad

In this demo we present a suite of smartphone applications that support location-aware, personalized multimedia services to a very large crowd with the help of a cyber computing infrastructure. As a proof of concept, we target Hajj, where more than 5 million pilgrims perform spatio-temporal activities. The cyber-physical system creates an ad-hoc social network among pilgrims, their family members, Hajj authorities, vehicles that provide services to the pilgrims, medical doctors and hospitals, and city administrators. The crowdsourced multimedia data from the suite of smartphone applications are captured by the cyber environment, processed to extract context information, sent to a cloud environment for real-time query processing, and finally shared with the very large crowd. The framework has been deployed since 2014.
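
A hypothetical sketch of the kind of crowdsourced record such a smartphone application might upload for real-time query processing; the field names and values are illustrative, not the deployed system's schema.

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json
import time

@dataclass
class PilgrimReport:
    """One crowdsourced report sent from a pilgrim's smartphone to the cloud.

    The field names are illustrative; the paper does not publish its schema.
    """
    pilgrim_id: str
    latitude: float
    longitude: float
    timestamp: float                 # seconds since the epoch
    activity: str                    # e.g. "walking", "resting", "needs help"
    media_url: Optional[str] = None  # optional photo or video captured on site

report = PilgrimReport("p-001", 21.4225, 39.8262, time.time(), "walking")
print(json.dumps(asdict(report)))    # payload as it might be queued for upload
```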


ACM Multimedia | 2015

A Multi-sensory Gesture-Based Login Environment

Ahmad M. Qamar; Abdullah Murad; Mohamed Abdur Rahman; Faizan Ur Rehman; Akhlaq Ahmad; Bilal Sadiq; Saleh M. Basalamah

Logging on to a system using a conventional keyboard may not be feasible in certain environments, such as a surgical operating theatre or an industrial manufacturing facility. We have developed a multi-sensory, gesture-based login system that allows a user to access secure information using body gestures. The system can be configured to use different types of gestures according to the type of sensors available to the user. We propose a simple scheme to represent all alphanumeric characters required for password entry as gestures within the multi-sensory environment. Our scheme scales from sensors that detect a large number of gestures to those that can only accept a few. This allows the system to be used in a variety of situations, such as use by disabled persons with limited ability to perform gestures. We are in the midst of deploying the developed system in a clinical environment.
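
One way such a scalable character-to-gesture scheme could work is sketched below: with a capable sensor each password character maps to a single gesture, while a sensor with a small gesture vocabulary expresses the character as a short gesture sequence. This encoding is an illustration, not the paper's exact scheme; in practice a separator gesture or fixed-length sequences would keep characters unambiguous.

```python
import string

ALPHABET = string.ascii_lowercase + string.digits  # 36 password characters

def char_to_gestures(ch, num_gestures):
    """Encode one password character as a sequence of gesture indices.

    With a rich sensor (num_gestures >= 36) each character maps to a single
    gesture; with a limited sensor the character index is written in base
    `num_gestures`, so a short gesture sequence stands for one character.
    """
    index = ALPHABET.index(ch)
    if num_gestures >= len(ALPHABET):
        return [index]
    digits = []
    while True:
        index, rem = divmod(index, num_gestures)
        digits.append(rem)
        if index == 0:
            break
    return list(reversed(digits))

print(char_to_gestures("k", 36))  # one gesture on a capable sensor
print(char_to_gestures("k", 4))   # a short sequence on a 4-gesture sensor
```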


International Conference on Multimedia and Expo | 2016

A synchronized multimedia in-home therapy framework in big data environment

Mohamed Abdur Rahman; Abdulhameed Alelaiwi

Synchronizing multimedia data such as audio and video with the 3D depth skeletal data of a patient performing therapy at home is a challenging task, because the media characteristics of video, audio, and skeletal stream data are different. To keep the rich session information and therapeutic semantics, the multiple media streams need to be synchronized temporally in two tiers. At the end of the first-tier synchronization, all the media are synchronized with respect to a global timestamp, which represents a user's in-home therapy session. Once synchronized at the stream level, the media streams have to be further synchronized with respect to a model therapy skeletal stream to allow semantic annotations or markers on top of the media streams. These two-tier synchronized multimedia streams represent a patient's in-home therapy session that can be saved to a big data repository for further analysis. The big data repository uses MapReduce functions to extract key quality-of-improvement metrics from the user session, such as whether a patient could successfully follow the therapist's instructions, how much of the user session was done correctly, and how many gestures were done wrongly. We design a therapy recorder that can perform the two-tier synchronization process and create the synchronized multimedia therapy session file. We also propose a therapy player that can unpack the complex session file and separate the media files while keeping the synchronization among the media. A therapist can use the player to observe the user session, browse the synchronized media both spatially and temporally, and add comments in the form of audio, video, or text notes at any particular temporal position. A patient can observe the annotations made by the therapist using the playback mode of the player and visualize the multimedia annotated notes to improve future sessions. The query interface provides features to view the statistics or graph plots of any individual session of a patient, a summary of a patient's historical session data, or complex relative statistics across a patient group.
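
The first-tier synchronization step can be sketched as re-basing each media stream onto the global session timestamp. The stream structure below is a hypothetical simplification of what a therapy recorder might hold in memory, not the paper's actual data format.

```python
def first_tier_sync(streams, session_start):
    """First-tier synchronization: re-base every media stream onto the global
    session timestamp so audio, video and skeletal frames line up in time.

    `streams` maps a stream name to a list of (local_timestamp_s, frame)
    samples; `session_start` maps the same names to each stream's local time
    at the moment the global session clock started.
    """
    synced = {}
    for name, samples in streams.items():
        offset = session_start[name]
        synced[name] = [(t - offset, frame) for t, frame in samples]
    return synced

streams = {
    "video":    [(12.00, "v0"), (12.04, "v1")],
    "skeleton": [(3.50, "s0"), (3.53, "s1")],
}
print(first_tier_sync(streams, {"video": 12.00, "skeleton": 3.50}))
```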


SIGSPATIAL Special | 2016

Gamifying hand physical therapy with intelligent 3D navigation

Imad Afyouni; Faizan Ur Rehman; Ahmad M. Qamar; Akhlaq Ahmad; Mohamed Abdur Rahman; Sohaib Ghani; Saleh M. Basalamah

As human-centered interactive technologies, serious games are gaining popularity in a variety of fields such as training simulations, health, national defense, and education. To build the best learning experience when designing a serious game, a system requires the integration of accurate spatio-temporal information. There is also an increasing need for intelligent medical technologies that enable patients to live independently at home. This paper introduces a novel e-Health framework that leverages GIS-based serious games for people with disabilities. The framework consists of a spatio-temporal map-browsing environment augmented with our newly introduced multi-sensory natural user interface. We propose a comprehensive architecture that includes a sensory data manager, a storage layer, an information processing and computational intelligence layer, and a user interface layer. Detailed mathematical modeling as well as a mapping methodology to convert different therapy-based hand gestures into navigational movements within the serious game environment are also presented. Moreover, an Intelligent Game Recommender has been developed for generating optimized navigational routes based on therapeutic gestures; those routes are tailored to the patient's preferences and capabilities. Motion data is stored in a repository throughout the different sessions for offline replaying and advanced analysis, and different indicators are displayed live. To the best of our knowledge, this is the first GIS-based game recommender framework for online physical therapy. The prototype has been deployed at a disability center. The obtained results and feedback from therapists and patients are encouraging.

Collaboration


Dive into Mohamed Abdur Rahman's collaboration.

Top Co-Authors

Bilal Sadiq (Umm al-Qura University)
Mohamed Ahmed (National Research Council)