Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Ker-Jiun Wang is active.

Publication


Featured research published by Ker-Jiun Wang.


International Conference on Industrial Technology | 2016

Human-robot symbiosis framework on exoskeleton devices

Ker-Jiun Wang; Mingui Sun; Ruiping Xia; Zhi-Hong Mao

In the near future, human-robot coexistence and symbiosis will be a common scenario in our society. Especially with the increasing number of patients with stroke or other neurological disorders and the gradually aging population, people may need wearable exoskeletons that actively assist human movements. In designing these robots, physical human-robot interaction (pHRI) plays an important role. How to let the human and robot cooperatively perform motor tasks and help each other is a grand challenge. Our research establishes a human-robot physical symbiosis framework that biomimics human behavior when performing interactive motor skills. The human and robot are modeled as two adaptive controllers in parallel with the plant (the system under control). As a result, we have two feedback controllers working together, constantly adapting to each other's behavior and optimally stabilizing the plant to achieve a common goal. In addition, we propose an inverse optimal control method to estimate the human control strategy. This information enables the robot to predict future consensus interactive behaviors in order to cooperate with the human effectively. Experimental verifications have been carried out using a double inverted pendulum to simulate a human-robot cooperative balance task in the MATLAB environment.
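The parallel-controller idea can be illustrated with a minimal sketch: two adaptive proportional controllers (standing in for the human and the robot) jointly stabilize one unstable scalar plant, each raising its own gain in response to the shared error. This is a hypothetical stand-in for the paper's double-inverted-pendulum MATLAB study, not the authors' code; the plant, gains, and adaptation law are illustrative.

```python
# Two adaptive controllers in parallel stabilizing one unstable plant
# (hypothetical scalar stand-in for the double-inverted-pendulum task).

def simulate(steps=2000, dt=0.01):
    x = 1.0               # plant state (e.g., balance error)
    a = 0.5               # unstable open-loop pole: x' = a*x + u_h + u_r
    k_h, k_r = 0.0, 0.0   # "human" and "robot" feedback gains
    eta = 0.5             # adaptation rate
    for _ in range(steps):
        u_h = -k_h * x    # human control effort
        u_r = -k_r * x    # robot control effort
        x += dt * (a * x + u_h + u_r)
        # each controller adapts its gain to reduce the shared error
        k_h += eta * dt * x * x
        k_r += eta * dt * x * x
    return x, k_h, k_r

x, k_h, k_r = simulate()
print(f"final error {x:.2e}, gains {k_h:.2f}, {k_r:.2f}")
```

Once the combined gain exceeds the unstable pole, the error decays, while both controllers keep adapting to each other through the plant, which is the core of the two-feedback-controllers-in-parallel structure described above.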


Human-Robot Interaction | 2018

Development of Seamless Telepresence Robot Control Methods to Interact with the Environment Using Physiological Signals

Ker-Jiun Wang; Kaiwen You; Fangyi Chen; Prakash C. Thakur; Michael Urich; Soumya Vhasure; Zhi-Hong Mao

Current assistive devices that help disabled people interact with the environment are complicated and cumbersome. Our approach aims to solve these problems by developing a compact and non-obtrusive wearable device to measure signals associated with human physiological gestures, and thereby generate useful commands to interact with the smart environment. Our innovation uses machine learning and non-invasive biosensors on top of the ears to identify eye movements and facial expressions. With these identified signals, users can control different applications, such as a cell phone, powered wheelchair, smart home, or other IoT (Internet of Things) devices, with simple and easy operations. Combined with a VR headset, the user can apply our technology to control a camera-mounted telepresence robot and navigate around the environment in a first-person view (FPV) using eye movements and facial expressions. This enables a very intuitive, totally hands-free and touch-free way of interaction.
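The signal-to-command pipeline can be sketched as a toy classifier over windowed biosensor samples: extract simple features per window, learn per-gesture centroids, and label new windows by nearest centroid. The feature choice, gesture labels, and classifier are illustrative assumptions, not the paper's actual machine-learning pipeline.

```python
import math
import random

# Toy nearest-centroid gesture classifier over windowed signal features
# (hypothetical stand-in for the wearable device's ML pipeline).

def features(window):
    """Mean and variance of one signal window."""
    mean = sum(window) / len(window)
    var = sum((v - mean) ** 2 for v in window) / len(window)
    return (mean, var)

def train(labelled_windows):
    """Average the features of each label's training windows."""
    centroids = {}
    for label, wins in labelled_windows.items():
        feats = [features(w) for w in wins]
        centroids[label] = tuple(sum(f[i] for f in feats) / len(feats)
                                 for i in range(2))
    return centroids

def classify(window, centroids):
    f = features(window)
    return min(centroids, key=lambda lbl: math.dist(f, centroids[lbl]))

random.seed(0)
blink = [[random.gauss(1.0, 0.1) for _ in range(50)] for _ in range(20)]
rest  = [[random.gauss(0.0, 0.1) for _ in range(50)] for _ in range(20)]
model = train({"blink": blink, "rest": rest})
print(classify([1.0] * 50, model))   # → blink
```

Each decoded label would then be mapped to a device command (wheelchair motion, smart-home toggle, and so on), which is the interaction layer the abstract describes.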


Archive | 2017

Human-Robot Mutual Force Borrowing and Seamless Leader-Follower Role Switching by Learning and Coordination of Interactive Impedance

Ker-Jiun Wang; Mingui Sun; Zhi-Hong Mao

This research developed a bilateral human-robot mutual adaptive impedance control strategy. The developed interactive impedance coordination methods let the human and robot seamlessly switch roles between leader and follower at will. Also, by iteratively increasing the impedance in the intended direction of motion, the human and robot can mutually borrow force from each other to facilitate task execution.
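The iterative impedance increase can be sketched as a scalar update rule: when the sensed partner force aligns with the intended motion direction, stiffness is raised (taking the lead and "borrowing" force); when it does not, stiffness is relaxed (yielding to the partner). The gains, signs, and clamping are illustrative assumptions, not the paper's control law.

```python
# Hypothetical sketch of iterative impedance adaptation along the
# intended motion direction for leader-follower role switching.

def adapt_impedance(k, partner_force, intent_dir, gain=0.2, k_max=100.0):
    """Raise stiffness k when the partner pushes along the intended
    direction; relax it otherwise (gains and clamps are illustrative)."""
    aligned = partner_force * intent_dir > 0
    k += gain if aligned else -gain
    return min(max(k, 0.0), k_max)

k = 1.0
for f in [0.5, 0.8, 0.7, -0.1, 0.9]:   # sensed partner forces
    k = adapt_impedance(k, f, intent_dir=+1.0)
print(round(k, 2))   # four aligned pushes raise k, one misaligned lowers it
```

Running the same rule on both sides, with opposite readings of who is pushing whom, yields the seamless leader-follower hand-off: whichever agent's impedance grows dominates the motion until the other starts pushing along its own intent.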


9th International Conference on Service Science (ICSS) | 2016

Internet of Brain: Decoding Human Intention and Coupling EEG Signals with Internet Services

Lan Zhang; Ker-Jiun Wang; Huan Chen; Zhi-Hong Mao

Films such as Robocop, The Matrix, and Pacific Rim have explored the possibilities of using Brain-Computer Interfaces (BCIs) to control machines with thought alone. In this paper, we enhance the power of thought through a novel method of combining the brain wave, one of the most influential signals in the human body, with the wide range of information found on the internet in today's digital age. Because brain wave signals are affected by human thought, certain patterns of EEG (Electroencephalogram) signals can be associated with specific actions or ideas. By analyzing these characteristics of brain wave signals with powerful EEG capturing devices, we were able to create a user interface through which a human can access a variety of internet services, including search results, social media, and IFTTT, by merely thinking of or focusing on a word or phrase on a computer screen.
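The coupling between decoded EEG patterns and internet services amounts to a dispatch table: each recognized pattern label routes the focused word or phrase to a service action. The labels and service handlers below are hypothetical placeholders; the paper's actual decoding pipeline and service integrations are not reproduced here.

```python
# Hypothetical dispatch from decoded EEG pattern labels to internet
# actions (labels and handlers are illustrative placeholders).

ACTIONS = {
    "focus_search": lambda phrase: f"search:{phrase}",
    "focus_post":   lambda phrase: f"social_post:{phrase}",
    "focus_ifttt":  lambda phrase: f"ifttt_trigger:{phrase}",
}

def handle_decoded_label(label, phrase):
    """Route a decoded EEG label plus the focused phrase to a service."""
    action = ACTIONS.get(label)
    return action(phrase) if action else "ignored"

print(handle_decoded_label("focus_search", "weather"))   # → search:weather
```

The hard part, of course, is producing reliable labels from raw EEG; once that classification exists, wiring it to search, social media, or IFTTT-style triggers is a thin routing layer like this one.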


Joint IEEE International Conference on Development and Learning and Epigenetic Robotics | 2015

Mastering human-robot interaction control techniques using Chinese Tai Chi Chuan: Mutual learning, intention detection, impedance adaptation, and force borrowing

Ker-Jiun Wang; Mingui Sun; Lan Zhang; Zhi-Hong Mao

Physical human-robot interaction (pHRI) has become an important research topic in recent years. It involves close physical body contact between humans and robots, and it is a critical technology for enabling human-robot symbiosis in our future society. An illustrative example is the development of wearable robots [1], where the wearer can extend or enhance the functionalities of their limbs. Since wearable robots are worn in parallel and move synchronously with the human body, human-in-the-loop control plays an important role. The human and robot are no longer two separate entities that make their own decisions. Instead, they jointly react to the world according to their mutual behaviors and control strategies. Any intelligent decision-making by each controller has to consider the other one's changing dynamics as part of its feedback loop, forming a two-way bilateral control structure.


International Conference on Rehabilitation Robotics | 2017

Brain-computer interface combining eye saccade two-electrode EEG signals and voice cues to improve the maneuverability of wheelchair

Ker-Jiun Wang; Lan Zhang; Bo Luan; Hsiao-Wei Tung; Quanfeng Liu; Jiacheng Wei; Mingui Sun; Zhi-Hong Mao


International Conference on Consumer Electronics | 2018

Ergonomic and Human-Centered Design of Wearable Gaming Controller Using Eye Movements and Facial Expressions

Ker-Jiun Wang; Anna Zhang; Kaiwen You; Fangyi Chen; Quanbo Liu; Yu Liu; Zaiwang Li; Hsiao-Wei Tung; Zhi-Hong Mao


Human-Robot Interaction | 2018

EXGbuds: Universal Wearable Assistive Device for Disabled People to Interact with the Environment Seamlessly

Ker-Jiun Wang; Hsiao-Wei Tung; Zihang Huang; Prakash C. Thakur; Zhi-Hong Mao; Ming-Xian You


Conference on the Future of the Internet | 2018

Design and Realization of Alzheimer's Artificial Intelligence Technologies (AAIT) System

Anna Zhang; Ker-Jiun Wang; Zhi-Hong Mao


2017 International Symposium on Wearable Robotics and Rehabilitation (WeRob) | 2017

Human-machine interface using eye saccade and facial expression physiological signals to improve the maneuverability of wearable robots

Ker-Jiun Wang; Kaiwen You; Fangyi Chen; Zihang Huang; Zhi-Hong Mao

Collaboration


Dive into Ker-Jiun Wang's collaborations.

Top Co-Authors

Zhi-Hong Mao, University of Pittsburgh
Mingui Sun, University of Pittsburgh
Fangyi Chen, University of Pittsburgh
Hsiao-Wei Tung, University of Pittsburgh
Kaiwen You, University of Pittsburgh
Lan Zhang, University of Pittsburgh
Zihang Huang, University of Pittsburgh
Bo Luan, University of Pittsburgh