Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jingsheng Tang is active.

Publication


Featured research published by Jingsheng Tang.


Computers in Biology and Medicine | 2016

Toward brain-actuated car applications

Yang Yu; Zongtan Zhou; Erwei Yin; Jun Jiang; Jingsheng Tang; Yadong Liu; Dewen Hu

This study presents a paradigm for controlling a car with an asynchronous electroencephalogram (EEG)-based brain-computer interface (BCI) and reports the results of a simulation performed in an experimental environment outside the laboratory. The paradigm uses two distinct motor imagery (MI) tasks, imagined left- and right-hand movements, to generate a multi-task car control strategy consisting of starting the engine, moving forward, turning left, turning right, moving backward, and stopping the engine. Five healthy subjects participated in the online car control experiment, and all successfully controlled the car along a previously outlined route. Subject S1 exhibited the best BCI-based performance, which was comparable to manual control. We hypothesize that the proposed self-paced, EEG-based car control paradigm could be used in car control applications, offering a complementary or alternative means of mobility for individuals with locked-in disorders as well as a supplementary driving strategy to assist healthy drivers.
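The strategy above derives several car commands from only two MI classes. A minimal sketch of one way such a mapping could work, assuming (hypothetically) that pairs of consecutive left/right MI detections select a command; the actual coding used in the study is not specified here:

```python
class MICarController:
    """Illustrative two-class MI car controller: pairs of consecutive
    left/right motor-imagery detections are mapped to car commands.
    The pair-to-command mapping below is an assumption, not the
    study's actual coding."""

    COMMAND_MAP = {
        ("L", "L"): "turn_left",
        ("R", "R"): "turn_right",
        ("L", "R"): "move_forward",
        ("R", "L"): "move_backward",
    }

    def __init__(self):
        self.engine_on = False
        self.buffer = []

    def on_detection(self, label):
        """Feed one classified MI epoch ('L' or 'R'); return a command or None."""
        self.buffer.append(label)
        if len(self.buffer) < 2:
            return None          # wait for a complete pair
        pair = tuple(self.buffer[-2:])
        self.buffer.clear()
        if not self.engine_on:   # first completed pair starts the engine
            self.engine_on = True
            return "start_engine"
        return self.COMMAND_MAP.get(pair)
```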


International Conference on Artificial Intelligence | 2017

A 3D Visual Stimuli Based P300 Brain-computer Interface: for a Robotic Arm Control

Jingsheng Tang; Zongtan Zhou; Yadong Liu

The brain-computer interface (BCI) is a technology that allows the human brain to control devices directly. Among BCI-based systems, the brain-actuated robotic arm is of interest to many fields. Because a robotic arm has multiple degrees of freedom, evoked potentials are usually adopted to build the BCI, with a computer display serving as the stimulation interface. However, introducing a display tends to distract the user. To alleviate this problem, we designed an LED stimulator mounted on the end-effector of the robotic arm so that the stimulator and the controlled device are integrated. The stimulator comprises six LEDs, each positioned in the corresponding movement direction of the robotic arm. In this way, the visual stimulus is overlaid on the device itself, and distraction during robotic arm control is effectively reduced. Additionally, a sliding-window-based P300 classifier is designed to identify human intent quickly, allowing the user to control the robotic arm in real time. An online experiment was performed with three subjects, who were instructed to control the robotic arm to grasp a bottle on a desk and place it in a target box. Each subject performed 10 trials, and all of them completed more than seven tasks. The results demonstrate the effectiveness of our system.
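The sliding-window P300 classifier described above can be sketched as averaging a linear classifier's scores over the most recent flashes of each LED and picking the best-scoring stimulus. The feature vectors, weight vector `w`, and window length below are illustrative assumptions, not the paper's actual decoder:

```python
import numpy as np

def p300_sliding_window_score(epochs, w, window=5):
    """Pick the attended stimulus from post-stimulus EEG epochs.
    epochs: dict mapping LED index -> list of feature vectors (one per flash).
    w: linear classifier weight vector (assumed pre-trained).
    Scores each LED by the mean classifier output over its `window`
    most recent flashes (the sliding window) and returns the argmax."""
    scores = {}
    for led, feats in epochs.items():
        recent = feats[-window:]                      # slide over latest flashes
        scores[led] = float(np.mean([w @ f for f in recent]))
    return max(scores, key=scores.get)
```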


International Symposium on Computational Intelligence and Design | 2013

Balancing an Inverted Pendulum with an EEG-Based BCI

Jingsheng Tang; Erwei Yin; Jun Jiang; Zongtan Zhou; Dewen Hu

To investigate brain-computer interfaces (BCIs) for the control of dynamic objects, we constructed a BCI paradigm for balancing a virtual inverted pendulum on a cart (IPC). In this paradigm, subjects balanced the pendulum by imagining left/right movements. Both the direction and the strength of motor imagery were estimated simultaneously from the EEG signals to generate a suitable control force for the IPC. Additionally, to address the inconsistency between offline training and online control, a special online training experiment was designed to obtain more robust BCI parameters. Three graduate students participated in this study; two of them quickly mastered the skill of balancing the IPC, achieving balancing times of about 20 seconds. The results show that the paradigm is feasible and efficient for dynamic object control.
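The decoding step above turns an estimated MI direction and strength into a cart force. A minimal sketch under assumed scaling, together with one Euler step of a simplified cart-pole model; the study's actual simulator, gains, and decoder output ranges are not specified here:

```python
import math

def mi_to_force(direction, strength, f_max=10.0):
    """Map a decoded MI direction ('L'/'R') and a normalized strength
    in [0, 1] to a signed cart force. The linear scaling and f_max
    are illustrative assumptions."""
    sign = -1.0 if direction == "L" else 1.0
    return sign * max(0.0, min(strength, 1.0)) * f_max

def ipc_step(theta, omega, force, dt=0.02, g=9.81, l=0.5, m_c=1.0):
    """One Euler step of a simplified inverted pendulum on a cart:
    theta is the pole angle from vertical, omega its angular velocity;
    the cart force tilts the pole through the -force*cos(theta) term."""
    alpha = (g / l) * math.sin(theta) - force * math.cos(theta) / (m_c * l)
    omega += alpha * dt
    theta += omega * dt
    return theta, omega
```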


International Journal of Human-Computer Interaction | 2018

Toward Brain-Actuated Mobile Platform

Jingsheng Tang; Yadong Liu; Jun Jiang; Yang Yu; Dewen Hu; Zongtan Zhou

This study presents a brain-computer interface (BCI) system aimed at providing disabled patients with a mobile solution for practical use. The proposed system employs an omnidirectional chassis and a bionic robotic arm to construct a multi-functional mobile platform. In addition, the system is equipped with a Kinect and 12 ultrasonic sensors to capture environmental information. Using artificial intelligence techniques, the mobile system can understand its environment and complete certain tasks autonomously. A hybrid BCI combining a motor imagery paradigm and an asynchronous P300 paradigm is designed to translate human intent into computer commands. Users interact with the system in a flexible way: on the one hand, the user issues commands to drive the system directly; on the other hand, the system searches for predefined operable targets and reports the results to the user. Once the user confirms a target, the system automatically completes the associated operation. To evaluate the system's performance, a testing environment with a small room, an aisle, and an elevator was built to simulate mobile tasks in a daily setting. Participants were instructed to operate the mobile system in the room and the aisle and to use the elevator to go outdoors. Four subjects participated in the test, and all of them completed the task.
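The flexible interaction described above alternates between direct driving commands and P300-based confirmation of targets the system proposes. A hedged sketch of one cycle of such an arbitration loop; the function name, labels, and priority order are hypothetical:

```python
def hybrid_step(mi_command, detected_targets, p300_choice):
    """One cycle of the assumed hybrid interaction:
    mi_command: a direct driving command decoded from MI (e.g. 'forward'),
                or None if no MI command was detected;
    detected_targets: operable targets found by the perception modules;
    p300_choice: index of the target the user confirmed via P300, or None.
    A confirmed target is assumed to take priority over direct driving."""
    if p300_choice is not None and detected_targets:
        # user confirmed a target -> autonomous operation takes over
        return ("auto_operate", detected_targets[p300_choice])
    if mi_command is not None:
        return ("drive", mi_command)   # direct low-level driving
    return ("idle", None)
```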


Biomedical Engineering Online | 2018

Towards BCI-actuated smart wheelchair system

Jingsheng Tang; Yadong Liu; Dewen Hu; Zongtan Zhou

Background: Electroencephalogram-based brain-computer interfaces (BCIs) are a novel human-machine interaction technology that allows people to communicate and interact with the external world without relying on their peripheral muscles and nerves. Among BCI systems, brain-actuated wheelchairs are promising for the rehabilitation of severely motor-disabled individuals who are unable to control a wheelchair through conventional interfaces. Previous studies realized easy-to-use brain-actuated wheelchairs that let people navigate through simple commands, but these systems rely on offline calibration of the environment. Other systems do not rely on prior knowledge, but controlling them is time consuming. In this paper, we propose an improved mobile platform equipped with an omnidirectional wheelchair, a lightweight robotic arm, a target recognition module, and an auto-control module. Based on the you only look once (YOLO) algorithm, our system can recognize and locate targets in the environment in real time, and the user confirms one target through a P300-based BCI. An expert system plans a proper solution for the specific target; for example, the planned solution for a door is opening the door and then passing through it. The auto-control system then jointly controls the wheelchair and robotic arm to complete the operation. During task execution, the target is also tracked using an image tracking technique. The result is an easy-to-use system that provides accurate services to satisfy user requirements and can accommodate different environments.

Results: To validate and evaluate our system, an experiment simulating daily application was performed. The tasks included driving the system closer to a walking man and having a conversation with him; going to another room through a door; and picking up a bottle of water on a desk and drinking from it. Three patients (cerebral infarction, spinal injury, and stroke) and four healthy subjects participated in the test, and all completed the tasks.

Conclusion: This article presents a brain-actuated smart wheelchair system that provides efficient and considerate services for users. The results demonstrate that the system works intelligently and efficiently; users need only issue brief commands to receive considerate services. This system is significant for accelerating the application of BCIs in practical environments, especially for patients who will use a BCI for rehabilitation.
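The expert-system step described above maps each confirmed target class to a planned action sequence (the abstract gives the door example: open it, then pass through). A minimal table-lookup sketch; only the door plan comes from the abstract, and the other entries and action names are assumed analogues:

```python
# Illustrative expert-system lookup: each recognized target class maps
# to a planned action sequence for the wheelchair and robotic arm.
PLANS = {
    "door":   ["approach", "open_door", "pass_through"],   # from the abstract
    "bottle": ["approach", "grasp", "bring_to_user"],      # assumed analogue
    "person": ["approach", "stop_for_conversation"],       # assumed analogue
}

def plan_for(target_class):
    """Return the planned action sequence for a confirmed target,
    or None if the target class is unknown to the expert system."""
    return PLANS.get(target_class)
```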


International Conference on Information Technology in Medicine and Education | 2016

A Hybrid Computer Interface for Robot Arm Control

Jingsheng Tang; Zongtan Zhou; Yang Yu

A brain-computer interface (BCI) directly translates human thought into machine commands and provides a new and promising method for the rehabilitation of persons with disabilities. A BCI-actuated robotic arm is an effective rehabilitation aid for patients with upper limb disabilities. Building on existing brain-controlled robotic arms, this paper proposes a method that combines electromyography (EMG) and electroencephalography (EEG) to control a manipulator. Specifically, we collect EMG signals from the leg and use leg movements to quickly and reliably select the currently active joint, while the joints are precisely controlled by a motor imagery (MI) brain-computer interface. Using two non-homologous signals distributes the control burden and therefore reduces mental workload. In addition, the scheme allows the two kinds of operation to occur at the same time, making it flexible and efficient. An offline experiment was designed to construct the classifiers and select optimal parameters. In the online experiment, subjects were instructed to control the robotic arm to move an object from one location to another. Three subjects participated; the offline classification accuracies exceeded 95%, and all subjects completed the online control task.
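In the scheme above, EMG-detected leg movements select the active joint while MI moves it, and both operations may happen in the same cycle. A minimal sketch under assumed conventions (joint selection cycles on each EMG event, and each MI detection moves the active joint by a fixed angular step):

```python
def hybrid_arm_step(state, emg_event, mi_direction, step=2.0):
    """One control cycle of the assumed EMG+MI scheme.
    state: {'active_joint': int, 'angles': list of joint angles in degrees}.
    emg_event: True if a leg movement was detected this cycle
               (assumed to cycle to the next joint).
    mi_direction: 'L'/'R' from the MI classifier, or None;
                  moves the active joint by -/+ step degrees."""
    n = len(state["angles"])
    if emg_event:                                  # leg movement -> next joint
        state["active_joint"] = (state["active_joint"] + 1) % n
    if mi_direction in ("L", "R"):                 # MI -> move active joint
        delta = -step if mi_direction == "L" else step
        state["angles"][state["active_joint"]] += delta
    return state
```

Because joint selection and joint motion come from non-homologous signals, both branches can fire in one cycle, matching the simultaneous-operation property claimed in the abstract.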


International Conference on Intelligent Science and Big Data Engineering | 2013

A Novel Multi-class Brain-Computer Interface (BCI) Paradigm Based on Motor Imagery Sequential Coding (MISC) Protocol

Jun Jiang; Erwei Yin; Yang Yu; Jingsheng Tang; Zongtan Zhou; Dewen Hu

In this study, we present a novel multi-class BCI paradigm based on a motor imagery sequential coding (MISC) protocol, which can generate multiple commands from just two kinds of motor imagery (MI) tasks. In the MISC protocol, each mental task is divided into several continuous epochs of equal duration, and one of the two MI tasks is executed during each epoch. With this protocol, multiple mental states can be coded by the two MI tasks. The difficulty of classifier design is also reduced, since only two MI tasks need to be classified. Three subjects participated in our experiments, achieving an average accuracy of 85.7% and an ITR of 16.5 bits/min. The results confirm that the MISC protocol can generate more commands in a BCI system with the same number of MI tasks.
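The sequential-coding idea above can be sketched directly: a sequence of n two-class MI epochs yields 2^n distinct commands. The epoch labels 'L'/'R' and the codebook ordering are illustrative assumptions:

```python
from itertools import product

def misc_codebook(n_epochs):
    """All commands expressible as a sequence of n_epochs two-class MI
    epochs ('L'/'R'): 2**n_epochs distinct codes, as in the MISC idea."""
    return ["".join(seq) for seq in product("LR", repeat=n_epochs)]

def decode(sequence, codebook):
    """Map a classified epoch sequence to its command index, or None
    if the sequence is not a valid code."""
    return codebook.index(sequence) if sequence in codebook else None
```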


International Conference on Intelligent Science and Big Data Engineering | 2013

A Subarea-Location Joint Spelling Paradigm for the BCI Control

Erwei Yin; Jun Jiang; Yang Yu; Jingsheng Tang; Zongtan Zhou; Dewen Hu

The brain-computer interface (BCI) speller is an important topic in BCI research. In this paper, we propose a novel spelling paradigm for enhancing speller performance. In our approach, the target character is detected by combining the P300 potential and the steady-state visual evoked potential (SSVEP). Specifically, P300 detection and SSVEP detection are employed as two sub-spellers that identify, respectively and simultaneously, the subarea containing the target character and its location within that subarea. The experimental results show that the information transfer rate (ITR) of our BCI system was significantly improved compared with traditional approaches, i.e., the P300 speller and the SSVEP speller.
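The joint decoding described above combines a P300-derived subarea index with an SSVEP-derived position within that subarea to select one character. A one-line sketch under an assumed flat layout of the character matrix; the paper's actual matrix geometry and subarea size are not specified here:

```python
def joint_spell(matrix, subarea, location, subarea_size=6):
    """Decode one character in the assumed subarea-location scheme:
    `subarea` is the index from the P300 sub-speller, `location` the
    index from the SSVEP sub-speller within that subarea.
    matrix: all characters as a flat string, grouped into consecutive
    subareas of `subarea_size` characters each (illustrative layout)."""
    return matrix[subarea * subarea_size + location]
```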


International Conference on Instrumentation and Measurement, Computer, Communication and Control | 2014

A Novel Steady-State Visually Evoked Potential-Based Brain-Computer-Interface Paradigm to Steer a Humanoid Robot

Nannan Zhang; Jun Jiang; Jingsheng Tang; Zongtan Zhou; Dewen Hu


Chinese Control Conference | 2018

An Optimizational Tactile P300 Brain-Computer Interface Paradigm

Boyan Cao; Xing Han; Jingsheng Tang; Zongtan Zhou; Yadong Liu

Collaboration


Dive into Jingsheng Tang's collaborations.

Top Co-Authors

Zongtan Zhou (National University of Defense Technology)
Dewen Hu (National University of Defense Technology)
Yadong Liu (National University of Defense Technology)
Jun Jiang (National University of Defense Technology)
Yang Yu (National University of Defense Technology)
Erwei Yin (National University of Defense Technology)
Nannan Zhang (National University of Defense Technology)
Xing Han (National University of Defense Technology)
Yaru Liu (National University of Defense Technology)