Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Karan Khokar is active.

Publication


Featured research published by Karan Khokar.


International Conference on Advanced Intelligent Mechatronics | 2013

A vision based P300 Brain Computer Interface for grasping using a wheelchair-mounted robotic arm

Indika Upashantha Pathirage; Karan Khokar; Elijah Klay; Redwan Alqasemi; Rajiv V. Dubey

In this paper, we present a novel vision-based interface for selecting an object using a Brain Computer Interface (BCI) and grasping it using a robotic arm mounted to a powered wheelchair. As issuing commands through a BCI is slow, the system was designed to let a user complete an entire task with the robotic system while issuing as few BCI commands as possible and without losing concentration on the stimuli or the task. A scene image is captured by a camera mounted on the wheelchair, from which a dynamically sized, non-uniform stimulus grid is created using edge information; dynamically sized grids improve object selection efficiency. The oddball paradigm and P300 event-related potentials (ERP) are used to select stimuli, each cell in the grid being one stimulus. Once a cell is selected, object segmentation and matching are used to identify the object. The user then chooses, via the BCI, an action to be performed on the object by the wheelchair-mounted robotic arm (WMRA). Tests on 6 healthy human subjects validated the functionality of the system. An average stimulus-selection accuracy of 85.56% was achieved across all subjects, and users needed an average of 5 commands to grasp an object. The system will eventually be useful to completely paralyzed or locked-in patients for performing activities of daily living (ADL) tasks.
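
A minimal sketch, in Python, of one way the dynamically sized, non-uniform stimulus grid described above could be derived from edge information: regions with strong edges (likely objects) are recursively subdivided into smaller cells, while flat regions remain as large cells. The edge measure, thresholds, and function names are illustrative assumptions, not the implementation from the paper.

    import numpy as np

    def edge_map(gray):
        """Crude edge magnitude via finite differences (stand-in for a real edge detector)."""
        gy, gx = np.gradient(gray.astype(float))
        return np.hypot(gx, gy)

    def split_cells(edges, x, y, w, h, min_size=40, thresh=5.0):
        """Recursively quadrisect regions whose mean edge strength is high, so
        cluttered areas end up with smaller stimulus cells than empty ones."""
        region = edges[y:y + h, x:x + w]
        if w <= min_size or h <= min_size or region.mean() < thresh:
            return [(x, y, w, h)]  # keep this region as a single stimulus cell
        hw, hh = w // 2, h // 2
        cells = []
        for (cx, cy, cw, ch) in [(x, y, hw, hh), (x + hw, y, w - hw, hh),
                                 (x, y + hh, hw, h - hh), (x + hw, y + hh, w - hw, h - hh)]:
            cells += split_cells(edges, cx, cy, cw, ch, min_size, thresh)
        return cells

    # Usage on a synthetic scene: one textured patch in an otherwise flat image.
    img = np.zeros((240, 320))
    img[60:140, 100:220] = np.random.rand(80, 120) * 255
    edges = edge_map(img)
    grid = split_cells(edges, 0, 0, 320, 240, thresh=0.5 * edges.mean())
    print(len(grid), "stimulus cells, e.g.", grid[:3])

Keeping cells small only where edges are dense is what makes selection efficient: a single P300 choice then isolates a small, object-sized patch instead of a large uniform tile.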


International Conference on Intelligent Robots and Systems | 2010

Laser-assisted telerobotic control for enhancing manipulation capabilities of persons with disabilities

Karan Khokar; Kyle B. Reed; Redwan Alqasemi; Rajiv V. Dubey

In this paper, we demonstrate the use of range information from a laser sensor mounted on the end-effector of a remote robot manipulator to assist persons with limited upper body strength in carrying out Activities of Daily Living in unstructured environments. Laser range data is used to determine goals and to identify targets, obstacles, and via points, enabling autonomous execution of trajectories. The human operator is primarily involved in higher-level decision making and performs minimal teleoperation to identify critical points in the workspace with the laser pointer. Tests on ten healthy human subjects executing a pick-and-place task showed that laser-based assistance not only increased the speed of task execution by an average of 26.9% while decreasing physical effort by an average of 85.4%, but also made the task cognitively easier for the user to execute.
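
As a rough illustration of how an end-effector-mounted laser can turn a single pointing action into a goal and a via point, here is a minimal sketch assuming the range reading is taken along the end-effector z-axis; the frame conventions, approach offset, and function name are assumptions for illustration, not the paper's code.

    import numpy as np

    def laser_point_to_goal(T_world_ee, laser_range, approach_offset=0.10):
        """Project a laser range reading (taken along the end-effector z-axis)
        into the world frame and place a via point above the hit point so the
        arm can approach the target from above.

        T_world_ee  : 4x4 homogeneous pose of the end-effector in the world frame
        laser_range : distance (m) reported by the end-effector-mounted laser
        """
        hit_ee = np.array([0.0, 0.0, laser_range, 1.0])  # hit point in the EE frame
        hit_world = T_world_ee @ hit_ee
        via_world = hit_world.copy()
        via_world[2] += approach_offset                  # hover above the target
        return hit_world[:3], via_world[:3]

    # Usage: end-effector 0.5 m above the table, pointing straight down.
    T = np.array([[1.0,  0.0,  0.0, 0.4],
                  [0.0, -1.0,  0.0, 0.2],
                  [0.0,  0.0, -1.0, 0.5],
                  [0.0,  0.0,  0.0, 1.0]])
    goal, via = laser_point_to_goal(T, laser_range=0.5)
    print("goal:", goal, "via:", via)

Once the goal and via point are known in the world frame, the remote arm can execute the approach autonomously while the operator stays at the level of decision making.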


Journal of The Franklin Institute - Engineering and Applied Mathematics | 2012

Scaled telerobotic control of a manipulator in real time with laser assistance for ADL tasks

Eduardo Veras; Karan Khokar; Redwan Alqasemi; Rajiv V. Dubey

In this paper, we present a novel concept of shared autonomous and teleoperation control of a remote manipulator with laser-based assistance in a hard real-time environment for people with disabilities to perform activities of daily living (ADL). The laser pointer enables the user to make high-level decisions, such as target object selection, and it enables the system to generate trajectories and virtual constraints to be used for autonomous motion or scaled teleoperation. Autonomous, position-teleoperation, and velocity-teleoperation control methods have been implemented in the control code. Scaling and virtual fixtures have been used in the teleoperation-based control depending on user preference. A real-time QNX operating system has been used to control a PUMA 560 robotic arm through a Phantom Omni master over a TCP/IP port, and a SICK laser range finder was used for the telerobotic control. The system was implemented with different control modes, and three healthy human subjects were trained to use it for a pick-and-place task. Data were collected and presented for the different control modes, along with a comparison between the modes based on the time to complete the task.
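
The scaled-teleoperation and virtual-fixture idea can be sketched compactly. The snippet below implements a generic line fixture that amplifies commanded motion along a laser-designated approach direction and attenuates off-axis motion; the gains and the function name are illustrative assumptions, not values from the PUMA 560 system.

    import numpy as np

    def apply_line_fixture(v_master, fixture_dir, scale_along=1.5, scale_across=0.2):
        """Scaled teleoperation with a line virtual fixture: motion along the
        desired approach line is amplified and off-axis motion is attenuated,
        which makes target locking faster and steadier.

        v_master    : Cartesian velocity commanded by the master device, shape (3,)
        fixture_dir : direction of the desired approach line, shape (3,)
        """
        d = fixture_dir / np.linalg.norm(fixture_dir)
        v_along = np.dot(v_master, d) * d   # component along the fixture
        v_across = v_master - v_along       # off-axis component
        return scale_along * v_along + scale_across * v_across

    # Usage: the operator pushes mostly toward the target with a little sideways drift.
    v_cmd = apply_line_fixture(np.array([0.10, 0.02, 0.0]), np.array([1.0, 0.0, 0.0]))
    print(v_cmd)  # along-axis motion amplified, drift suppressed

In position teleoperation the same scaling can be applied to incremental displacements rather than velocities.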


Proceedings of SPIE | 2009

Laser-assisted real-time and scaled telerobotic control of a manipulator for defense and security applications

Eduardo Veras; Karan Khokar; Redwan Alqasemi; Rajiv V. Dubey

In this paper, we present a novel concept of shared autonomous and teleoperation control of a remote manipulator with laser-based assistance in a hard real-time environment for defense and security applications. The laser pointer enables the user to make high-level decisions, such as target object selection, and it enables the system to generate trajectories and virtual constraints to be used for autonomous motion or scaled teleoperation. Autonomous, position-teleoperation, and velocity-teleoperation control methods have been implemented in the control code. Scaling and virtual fixtures have been used in the teleoperation-based control, depending on user preference, for faster and easier target locking and task execution. A real-time QNX operating system has been used to remotely control a PUMA 560 robotic arm using a Phantom Omni haptic device as the master through a TCP/IP port. The system was implemented with different control modes, and human subjects were trained to use it to execute several tasks. Examples of defense and security applications were explored and presented.


IEEE International Conference on Rehabilitation Robotics | 2013

Human motion intention based scaled teleoperation for orientation assistance in preshaping for grasping

Karan Khokar; Redwan Alqasemi; Sudeep Sarkar; Rajiv V. Dubey

In this paper, we present an algorithm that provides human-motion-intention-based assistance to users teleoperating a remote gripper to preshape it over an object in order to grasp it. Human motion data from the remote arm is used to train a Hidden Markov Model (HMM) offline. During the execution of a grasping task, the motion data is processed in real time through the HMM to determine the user's intended preshape configuration. Based on this intention, the motion of the remote arm is scaled up in those orientation directions that lead to the desired configuration, providing the assistance the user needs to preshape for grasping. Tests on healthy human subjects validated the hypothesis that users are able to preshape more quickly and with greater ease. Average time savings of 36% were obtained.
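
A minimal sketch of the assistance side of this idea, assuming the recognizer outputs a confidence value p_intent for the inferred preshape together with a corrective rotation axis: the components of the operator's angular-velocity command that reduce the remaining orientation error are scaled up in proportion to that confidence. The names and gains are illustrative, not taken from the paper.

    import numpy as np

    def assist_orientation(w_master, err_axis, p_intent, max_gain=2.0):
        """Intention-based orientation assistance: scale up the part of the
        commanded angular velocity that rotates the gripper toward the inferred
        preshape orientation; leave the rest of the command untouched.

        w_master : angular velocity commanded by the operator, shape (3,)
        err_axis : rotation axis that would correct the remaining orientation error, shape (3,)
        p_intent : recognizer confidence in the inferred preshape, in [0, 1]
        """
        a = err_axis / np.linalg.norm(err_axis)
        w_toward = max(np.dot(w_master, a), 0.0) * a   # helpful part of the command
        w_rest = w_master - w_toward
        gain = 1.0 + (max_gain - 1.0) * p_intent       # 1x when unsure, max_gain when sure
        return gain * w_toward + w_rest

    # Usage: the recognizer is 80% confident about the intended preshape.
    print(assist_orientation(np.array([0.3, 0.1, 0.0]), np.array([1.0, 0.0, 0.0]), p_intent=0.8))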


International Conference on Robotics and Automation | 2014

A novel telerobotic method for human-in-the-loop assisted grasping based on intention recognition

Karan Khokar; Redwan Alqasemi; Sudeep Sarkar; Kyle B. Reed; Rajiv V. Dubey

In this work, we present a methodology for enabling a robot to identify an object and a grasp configuration of interest and to assist the human teleoperating the robot in grasping that object. The identification is carried out in real time by detecting the motion intention of the human as they teleoperate the remote robotic arm towards the object and the grasp configuration. Simultaneously, depending on the detected object and grasp configuration, the user is assisted in translating and orienting the remote-arm gripper to preshape over and grasp the object. The complete process occurs while the human teleoperates the arm, without them having to interact with any other interface. Motion intention recognition is carried out using Hidden Markov Models (HMMs), trained offline on preshape trials performed by a skilled teleoperator. The environment is unstructured and comprises a number of objects, each with multiple grasp configurations. Experimental tests on healthy human subjects have validated our intention-recognition-based assistance method; they show that the method allows objects to be grasped and placed 48% faster, and with greater ease, compared to unassisted teleoperation. Moreover, we have shown that the intention-recognition model, trained by a skilled teleoperator, can be used by novice users to efficiently execute a grasping task in teleoperation.
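
The recognition step can be illustrated with a small, self-contained example. The sketch below scores a discretized motion sequence against one HMM per candidate grasp configuration using the forward algorithm and picks the best-scoring configuration; the toy models, motion symbols, and names are illustrative assumptions (the actual models are trained on a skilled teleoperator's preshape trials).

    import numpy as np

    def log_likelihood(obs, pi, A, B):
        """Forward algorithm (log domain) for a discrete-observation HMM.
        obs: sequence of observation symbol indices
        pi : (S,) initial state probabilities
        A  : (S, S) state transition matrix
        B  : (S, V) emission matrix
        """
        log_alpha = np.log(pi) + np.log(B[:, obs[0]])
        for o in obs[1:]:
            log_alpha = np.log(B[:, o]) + \
                np.logaddexp.reduce(log_alpha[:, None] + np.log(A), axis=0)
        return np.logaddexp.reduce(log_alpha)

    def recognize_intention(obs, models):
        """Pick the grasp configuration whose HMM best explains the motion so far."""
        scores = {name: log_likelihood(obs, *m) for name, m in models.items()}
        return max(scores, key=scores.get), scores

    # Two toy 2-state HMMs over 3 motion symbols, standing in for models trained offline.
    pi = np.array([0.6, 0.4])
    A = np.array([[0.7, 0.3], [0.2, 0.8]])
    models = {
        "top_grasp":  (pi, A, np.array([[0.7, 0.2, 0.1], [0.3, 0.5, 0.2]])),
        "side_grasp": (pi, A, np.array([[0.1, 0.2, 0.7], [0.2, 0.3, 0.5]])),
    }
    best, scores = recognize_intention([0, 0, 1, 0], models)
    print(best, scores)

Scoring all candidate models on the growing observation sequence at each control cycle is what lets the assistance switch on as soon as one configuration becomes clearly more likely than the others.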


ASME 2008 Summer Bioengineering Conference, Parts A and B | 2008

Implementation of a Real-Time Telerobotic System for Generating an Assistive Force Feedback for Rehabilitation Applications

Eduardo Veras; Karan Khokar; Kathryn J. De Laurentis; Rajiv V. Dubey

In this paper, we describe the implementation of a system that provides assistive force feedback to the remote user in a teleoperation-based environment. The force feedback helps the user in trajectory-following exercises for stroke rehabilitation. Exercises can be performed in virtual environments on a PC as well as in the real world with a remote robotic arm that follows a trajectory in real time as the user moves the master device. Such real-world exercises, augmented with real-time force feedback, can make the exercises more effective because the user receives force assistance along the desired path. Moreover, the system can be applied to remote therapy, where the therapist is away from the user, since it can be teleoperated and uses internet-based protocols. The assistive force feedback has been implemented using simple sensors, namely a camera and a laser, together with a PC-based real-time multithreaded control system. Real-time force feedback from the remote robot to the master device was made possible by effective multithreading strategies in the control system design and novel sensor integration. The system also supports autonomous and supervisory control of the remote robot, and it is modular with respect to integrating different master devices for stroke rehabilitation exercises.
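
As a rough sketch of the assistive force idea, the snippet below computes a saturated, spring-like force that pulls the master device toward the closest point on a piecewise-linear desired trajectory; the stiffness, force limit, and function name are illustrative assumptions rather than values from the system.

    import numpy as np

    def assistive_force(p_master, waypoints, k=150.0, f_max=3.0):
        """Spring-like assistance: pull the master device toward the closest
        point on a piecewise-linear desired path, saturated at f_max newtons.

        p_master  : current master position, shape (3,)
        waypoints : (N, 3) array of path waypoints
        """
        best_d, best_target = np.inf, None
        for a, b in zip(waypoints[:-1], waypoints[1:]):
            ab = b - a
            t = np.clip(np.dot(p_master - a, ab) / np.dot(ab, ab), 0.0, 1.0)
            proj = a + t * ab                       # closest point on this segment
            d = np.linalg.norm(p_master - proj)
            if d < best_d:
                best_d, best_target = d, proj
        f = k * (best_target - p_master)            # spring force toward the path
        n = np.linalg.norm(f)
        return f if n <= f_max else f * (f_max / n)

    # Usage: the user has drifted slightly off a straight-line exercise path.
    path = np.array([[0.0, 0.0, 0.0], [0.3, 0.0, 0.0]])
    print(assistive_force(np.array([0.15, 0.02, 0.0]), path))

In a real-time loop this force would be recomputed every control cycle and sent to the haptic master, which is where the multithreaded design described above matters.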


Archive | 2014

Vision based brain-computer interface systems for performing activities of daily living

Indika Upashantha Pathirage; Redwan Alqasemi; Rajiv V. Dubey; Karan Khokar; Elijah Klay


Archive | 2011

Laser Assisted Combined Teleoperation and Autonomous Control

Karan Khokar; Kyle B. Reed; Rajiv V. Dubey


International Conference on Informatics in Control, Automation and Robotics | 2010

Laser Based Telerobotic Control for Assisting Persons with Disabilities Perform Activities of Daily Living

Karan Khokar; Redwan Alqasemi; Rajiv V. Dubey

Collaboration


Dive into Karan Khokar's collaborations.

Top Co-Authors

Rajiv V. Dubey (University of South Florida)

Redwan Alqasemi (University of South Florida)

Eduardo Veras (University of South Florida)

Kyle B. Reed (University of South Florida)

Elijah Klay (University of South Florida)

Sudeep Sarkar (University of South Florida)