Ker-Jiun Wang
University of Pittsburgh
Publication
Featured research published by Ker-Jiun Wang.
International Conference on Industrial Technology | 2016
Ker-Jiun Wang; Mingui Sun; Ruiping Xia; Zhi-Hong Mao
In the near future, human-robot coexistence and symbiosis will be a common scenario in our society. Especially with the increasing number of patients with stroke or other neurological disorders and the gradually aging population, people may need wearable exoskeletons that actively assist human movements. In designing these robots, physical human-robot interaction (pHRI) plays an important role. How to let the human and robot cooperatively perform motor tasks and help each other is a grand challenge. Our research establishes a human-robot physical symbiosis framework that biomimics human behavior when performing interactive motor skills. The human and robot are modeled as two adaptive controllers in parallel with the plant (the system under control). As a result, we have two feedback controllers working together, constantly adapting to each other's behavior and optimally stabilizing the plant to achieve a common goal. In addition, we propose an inverse optimal control method to estimate the human control strategy. This information enables the robot to predict future consensus interactive behaviors in order to cooperate with the human effectively. Experimental verifications have been carried out using a double inverted pendulum to simulate a human-robot cooperative balance task in the MATLAB environment.
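The two-controllers-in-parallel idea above can be sketched in a few lines: a human model and a robot each apply state feedback to one shared unstable plant, and their summed control stabilizes it. This is a minimal illustration only; the 2-state plant and the gains below are hypothetical stand-ins for the paper's linearized double-inverted-pendulum model, and the abstract's actual experiments used MATLAB.

```python
# Sketch: two feedback controllers (human model + robot) acting in parallel
# on one unstable plant, as in the cooperative balance task. The 2-state
# plant is a hypothetical stand-in for the linearized pendulum dynamics.

def simulate(K_h, K_r, x0, dt=0.01, steps=1000):
    """Euler-integrate x' = A x + B (u_h + u_r), with u_h = -K_h x, u_r = -K_r x."""
    # A models an upright-pendulum-like instability (theta'' = theta + u).
    x = list(x0)
    for _ in range(steps):
        u = -(K_h[0] + K_r[0]) * x[0] - (K_h[1] + K_r[1]) * x[1]
        dx0 = x[1]           # position derivative
        dx1 = x[0] + u       # unstable dynamics plus combined control
        x = [x[0] + dt * dx0, x[1] + dt * dx1]
    return x

# Each gain alone only marginally stabilizes this plant; their sum is
# asymptotically stable -- the "working together" of the two controllers.
x_final = simulate(K_h=[1.0, 1.0], K_r=[1.0, 1.0], x0=[0.1, 0.0])
print(x_final)  # state has decayed toward the upright equilibrium
```

With both gains active, the closed-loop eigenvalues move into the left half-plane, so the simulated state converges to zero from the initial perturbation.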
Human-Robot Interaction | 2018
Ker-Jiun Wang; Kaiwen You; Fangyi Chen; Prakash C. Thakur; Michael Urich; Soumya Vhasure; Zhi-Hong Mao
Current assistive devices to help disabled people interact with the environment are complicated and cumbersome. Our approach aims to solve these problems by developing a compact and non-obtrusive wearable device that measures signals associated with human physiological gestures, and therefore generates useful commands to interact with the smart environment. Our innovation uses machine learning and non-invasive biosensors on top of the ears to identify eye movements and facial expressions. With these identified signals, users can control different applications, such as a cell phone, powered wheelchair, smart home, or other IoT (Internet of Things) devices, with simple and easy operations. Combined with a VR headset, the user can use our technology to control a camera-mounted telepresence robot and navigate the environment from a first-person view (FPV) using eye movements and facial expressions. This enables a very intuitive, totally hands-free and touch-free way of interacting.
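Downstream of the classifier, the gesture-to-command layer described above can be a simple lookup from an identified gesture to a device action. The gesture labels and command strings below are hypothetical examples, not the paper's actual vocabulary:

```python
# Sketch of the gesture-to-command layer: once the biosensor classifier has
# labeled a window as an eye movement or facial expression, a mapping turns
# it into a command for the wheelchair, phone, or other IoT device.
# All labels and command names here are illustrative assumptions.

GESTURE_COMMANDS = {
    ("eye", "look_left"):    "wheelchair.turn_left",
    ("eye", "look_right"):   "wheelchair.turn_right",
    ("eye", "double_blink"): "phone.answer_call",
    ("face", "jaw_clench"):  "wheelchair.stop",
}

def dispatch(gesture_type, gesture_label):
    """Return the command for an identified gesture, or None if unmapped."""
    return GESTURE_COMMANDS.get((gesture_type, gesture_label))

print(dispatch("eye", "look_left"))   # wheelchair.turn_left
print(dispatch("face", "smile"))      # None (unmapped gesture is ignored)
```

Keeping the mapping as data rather than code makes it easy to reassign gestures per application (wheelchair vs. telepresence robot) without retraining the classifier.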
Archive | 2017
Ker-Jiun Wang; Mingui Sun; Zhi-Hong Mao
This research developed a bilateral human-robot mutual adaptive impedance control strategy. The developed interactive impedance coordination methods let the human and robot seamlessly switch roles between leader and follower. In addition, by iteratively increasing the impedance in the intended moving direction, the human and robot can mutually borrow force from each other to facilitate task execution.
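The "iteratively increasing the impedance in the intended moving direction" step can be sketched as a per-iteration stiffness update along the estimated intent direction. The update rule, gains, and cap below are illustrative assumptions, not the paper's exact adaptation law:

```python
# Sketch of iterative impedance adaptation: each iteration raises the
# stiffness component along the human's intended motion direction, so the
# robot contributes more restoring force there. Gains are illustrative.

def adapt_stiffness(K, direction, gain=0.1, k_max=5.0):
    """Increase diagonal stiffness along a 2-D intent direction, capped at k_max."""
    return [min(k_max, k + gain * abs(d)) for k, d in zip(K, direction)]

K = [1.0, 1.0]        # diagonal stiffness [Kx, Ky]
intent = [1.0, 0.0]   # estimated human intent: motion along +x
for _ in range(10):
    K = adapt_stiffness(K, intent)
print(K)              # stiffness grew along x only
```

Capping the stiffness keeps the interaction compliant enough for the roles to switch back: lowering the gain again would return the robot to follower behavior.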
2016 9th International Conference on Service Science (ICSS) | 2016
Lan Zhang; Ker-Jiun Wang; Huan Chen; Zhi-Hong Mao
Films such as Robocop, The Matrix, and Pacific Rim have explored the possibilities of using Brain-Computer Interfaces (BCIs) to control machines with thought alone. In this paper, we enhance the power of thought through a novel method that combines the power of brain waves, among the most influential signals in the human body, with the wide range of information found on the internet in today's digital age. Because brain wave signals are affected by human thought, certain patterns of EEG (Electroencephalogram) signals can be associated with specific actions or ideas. By analyzing these characteristics of brain wave signals captured with powerful EEG devices, we were able to create a user interface through which a human can access a variety of internet services, including search results, social media, and IFTTT, by merely thinking of or focusing on a word or phrase on a computer screen.
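One plausible way to realize "focusing on a word or phrase" as a trigger is dwell-based selection: an attention score derived upstream from the EEG signal must stay above a threshold for several consecutive samples before the on-screen item fires its internet action. The scoring, threshold, and dwell length below are assumptions for illustration, not the paper's method:

```python
# Sketch of a dwell-based selection mechanism: a hypothetical attention
# score (computed upstream from EEG band power) must exceed a threshold
# for `dwell` consecutive samples to select the focused on-screen item,
# which then launches an internet service (search, social media, IFTTT).

def detect_selection(scores, threshold=0.7, dwell=5):
    """Return the sample index at which attention has stayed above
    `threshold` for `dwell` consecutive samples, or None."""
    run = 0
    for i, s in enumerate(scores):
        run = run + 1 if s >= threshold else 0
        if run >= dwell:
            return i
    return None

scores = [0.3, 0.5, 0.8, 0.9, 0.75, 0.82, 0.71, 0.6]
print(detect_selection(scores))   # 6: fifth consecutive sample >= 0.7
```

Requiring a sustained run rather than a single high sample guards against the momentary spikes that are common in raw EEG-derived measures.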
Joint IEEE International Conference on Development and Learning and Epigenetic Robotics | 2015
Ker-Jiun Wang; Mingui Sun; Lan Zhang; Zhi-Hong Mao
Physical human-robot interaction (pHRI) has become an important research topic in recent years. It involves close physical body contact between humans and robots, a critical technology for enabling human-robot symbiosis in our future society. An illustrative example is the development of wearable robots [1], where the wearer can extend or enhance the functionality of their limbs. Since wearable robots are worn in parallel with and move synchronously with the human body, human-in-the-loop control plays an important role. The human and robot are no longer two separate entities that make their own decisions. In contrast, they jointly react to the world according to their mutual behaviors and control strategies. Any intelligent decision making by each controller has to consider the other one's changing dynamics as part of its feedback loop, forming a two-way bilateral control structure.
International Conference on Rehabilitation Robotics | 2017
Ker-Jiun Wang; Lan Zhang; Bo Luan; Hsiao-Wei Tung; Quanfeng Liu; Jiacheng Wei; Mingui Sun; Zhi-Hong Mao
International Conference on Consumer Electronics | 2018
Ker-Jiun Wang; Anna Zhang; Kaiwen You; Fangyi Chen; Quanbo Liu; Yu Liu; Zaiwang Li; Hsiao-Wei Tung; Zhi-Hong Mao
Human-Robot Interaction | 2018
Ker-Jiun Wang; Hsiao-Wei Tung; Zihang Huang; Prakash C. Thakur; Zhi-Hong Mao; Ming-Xian You
Conference on the Future of the Internet | 2018
Anna Zhang; Ker-Jiun Wang; Zhi-Hong Mao
2017 International Symposium on Wearable Robotics and Rehabilitation (WeRob) | 2017
Ker-Jiun Wang; Kaiwen You; Fangyi Chen; Zihang Huang; Zhi-Hong Mao