Publication


Featured research published by Kongqiao Wang.


Systems, Man and Cybernetics | 2011

A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors

Xu Zhang; Xiang Chen; Yun Li; Vuokko Lantz; Kongqiao Wang; Ji-hai Yang

This paper presents a framework for hand gesture recognition based on the information fusion of a three-axis accelerometer (ACC) and multichannel electromyography (EMG) sensors. In our framework, the start and end points of meaningful gesture segments are detected automatically from the intensity of the EMG signals. A decision tree and multistream hidden Markov models are utilized for decision-level fusion to obtain the final results. For sign language recognition (SLR), experimental results on the classification of 72 Chinese Sign Language (CSL) words demonstrate the complementary functionality of the ACC and EMG sensors and the effectiveness of our framework. Additionally, the recognition of 40 CSL sentences is implemented to evaluate our framework for continuous SLR. For gesture-based control, a real-time interactive system is built as a virtual Rubik's Cube game using 18 kinds of hand gestures as control commands. While ten subjects play the game, performance is also examined in user-specific and user-independent classification. Our proposed framework facilitates intelligent and natural control in gesture-based interaction.
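
As a rough illustration of the EMG-intensity-based segmentation step described in this abstract, the following Python sketch detects the start and end points of a gesture from the smoothed, rectified EMG intensity. The sampling rate, window length, and threshold fractions are illustrative assumptions, not values taken from the paper.

import numpy as np

def segment_by_emg_intensity(emg, fs=1000, win_ms=128, on_frac=0.10, off_frac=0.05):
    """Detect the start and end points of a gesture segment from multichannel
    sEMG intensity. emg has shape (n_samples, n_channels); fs, win_ms, and the
    threshold fractions are illustrative assumptions."""
    win = max(1, int(fs * win_ms / 1000))
    # Rectify, average across channels, and smooth with a moving-average window.
    intensity = np.convolve(np.abs(emg).mean(axis=1), np.ones(win) / win, mode="same")
    peak = intensity.max()
    above_on = np.flatnonzero(intensity > on_frac * peak)
    above_off = np.flatnonzero(intensity > off_frac * peak)
    if above_on.size == 0:
        return None  # no meaningful gesture activity detected
    # Start at the first sample above the onset threshold, end at the last
    # sample above the (lower) offset threshold.
    return int(above_on[0]), int(above_off[-1])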


Intelligent User Interfaces | 2009

Hand gesture recognition and virtual game control based on 3D accelerometer and EMG sensors

Xu Zhang; Xiang Chen; Wen-hui Wang; Ji-hai Yang; Vuokko Lantz; Kongqiao Wang

This paper describes a novel hand gesture recognition system that utilizes both multi-channel surface electromyogram (EMG) sensors and a 3D accelerometer (ACC) to realize user-friendly interaction between humans and computers. Signal segments of meaningful gestures are determined from the continuous EMG signal inputs. Multi-stream hidden Markov models consisting of EMG and ACC streams are utilized as the decision-level fusion method to recognize hand gestures. This paper also presents a virtual Rubik's Cube game that is controlled by the hand gestures and is used for evaluating the performance of our hand gesture recognition system. For a set of 18 kinds of gestures, each trained with 10 repetitions, the average recognition accuracy was about 91.7% in real application. The proposed method facilitates intelligent and natural control based on gesture interaction.
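
The multi-stream HMM decision fusion described in this abstract can be sketched roughly as follows: one HMM is trained per gesture class and per sensor stream, and the per-class log-likelihoods of the EMG and ACC streams are combined with stream weights. The use of hmmlearn, the number of states, and the equal stream weights are assumptions for illustration only.

import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed dependency for the stream models

def train_stream_models(train_data, n_states=5):
    """Train one HMM per gesture class for a single sensor stream.
    train_data maps class label -> list of (T_i, n_features) observation arrays."""
    models = {}
    for label, seqs in train_data.items():
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        models[label] = GaussianHMM(n_components=n_states, covariance_type="diag",
                                    n_iter=20).fit(X, lengths)
    return models

def classify_gesture(emg_seq, acc_seq, emg_models, acc_models, w_emg=0.5, w_acc=0.5):
    """Decision-level fusion: weight and sum the per-class log-likelihoods of the
    EMG stream and the ACC stream, then return the highest-scoring gesture label."""
    labels = sorted(emg_models)
    scores = [w_emg * emg_models[c].score(emg_seq) + w_acc * acc_models[c].score(acc_seq)
              for c in labels]
    return labels[int(np.argmax(scores))]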


International Symposium on Wearable Computers | 2007

Hand Gesture Recognition Research Based on Surface EMG Sensors and 2D-accelerometers

Xiang Chen; Xu Zhang; Zhangyan Zhao; Ji-Hai Yang; Vuokko Lantz; Kongqiao Wang

To realize multi-DOF interfaces in wearable computer systems, accelerometers and surface EMG sensors are used synchronously to detect hand movement information for multiple hand gesture recognition. Experiments were designed to collect gesture data with both sensing techniques to compare their performance in the recognition of various wrist and finger gestures. Recognition tests were run using different subsets of information: accelerometer and sEMG data separately, and combined sensor data. Experimental results show that the combination of sEMG sensors and accelerometers achieved a 5-10% improvement in hand gesture recognition accuracy compared to using sEMG sensors alone.


International Conference on Bioinformatics and Biomedical Engineering | 2007

Multiple Hand Gesture Recognition Based on Surface EMG Signal

Xiang Chen; Xu Zhang; Zhangyan Zhao; Ji-Hai Yang; Vuokko Lantz; Kongqiao Wang

To realize a multi-DOF myoelectric control system with a minimal number of sensors, we carried out research on the recognition of twenty-four hand gestures based on a two-channel surface EMG signal measured from human forearm muscles. Third-order AR model coefficients, the mean absolute value (MAV), and the MAV ratio of the sEMG signal segments were used as features, and the recognition of gestures was performed with a linear Bayesian classifier. Our experimental results show that the proposed two-sensor setup and the sEMG signal processing and recognition methods are well suited for distinguishing hand gestures consisting of various wrist motions and single-finger extensions.
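
A minimal sketch of the feature extraction and classification pipeline named in this abstract might look as follows. Scikit-learn's linear discriminant analysis is used here as a stand-in for the linear Bayesian classifier (it is the Bayes-optimal linear rule under Gaussian class models with a shared covariance), and the exact feature layout is an assumption.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def ar_coefficients(x, order=3):
    """Least-squares estimate of AR(order) coefficients for one sEMG channel."""
    cols = [x[order - k - 1 : len(x) - k - 1] for k in range(order)]
    a, *_ = np.linalg.lstsq(np.column_stack(cols), x[order:], rcond=None)
    return a

def two_channel_features(segment):
    """Feature vector for a two-channel sEMG segment of shape (n_samples, 2):
    AR(3) coefficients per channel, the MAV of each channel, and the MAV ratio."""
    mav = np.abs(segment).mean(axis=0)
    ar = np.concatenate([ar_coefficients(segment[:, ch]) for ch in range(2)])
    return np.concatenate([ar, mav, [mav[0] / (mav[1] + 1e-12)]])

# LDA as a stand-in for the linear Bayesian classifier; X_train holds one
# two_channel_features vector per gesture segment and y_train the gesture labels.
# clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
# predictions = clf.predict(X_test)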


International Conference on Multimodal Interfaces | 2010

Automatic recognition of sign language subwords based on portable accelerometer and EMG sensors

Yun Li; Xiang Chen; Jianxun Tian; Xu Zhang; Kongqiao Wang; Ji-hai Yang

Sign language recognition (SLR) not only facilitates communication between the deaf and the hearing society, but also serves as a good basis for the development of gesture-based human-computer interaction (HCI). In this paper, portable input devices based on accelerometers and surface electromyography (EMG) sensors worn on the forearm are presented, and an effective fusion strategy for combining multi-sensor and multi-channel information is proposed to automatically recognize sign language at the subword classification level. Experimental results on the recognition of 121 frequently used Chinese sign language subwords demonstrate the feasibility of developing an SLR system based on the presented portable input devices and show that our proposed information fusion method is effective for automatic SLR. Our study will promote the realization of practical sign language recognizers and multimodal human-computer interfaces.


IEEE Transactions on Biomedical Engineering | 2012

A Sign-Component-Based Framework for Chinese Sign Language Recognition Using Accelerometer and sEMG Data

Yun Li; Xiang Chen; Xu Zhang; Kongqiao Wang; Z.J. Wang

Identification of the constituent components of each sign gesture can benefit the performance of sign language recognition (SLR), especially for large-vocabulary SLR systems. Aiming at developing such a system using portable accelerometer (ACC) and surface electromyographic (sEMG) sensors, we propose a framework for automatic Chinese SLR at the component level. In the proposed framework, data segmentation, as an important preprocessing operation, is performed to divide a continuous sign language sentence into subword segments. Based on the features extracted from the ACC and sEMG data, three basic components of sign subwords, namely the hand shape, orientation, and movement, are modeled, and the corresponding component classifiers are learned. At the decision level, a sequence of subwords can be recognized by fusing the likelihoods at the component level. Overall classification accuracies of 96.5% for a vocabulary of 120 signs and 86.7% for 200 sentences demonstrate the feasibility of interpreting sign components from ACC and sEMG data, and clearly show the superior recognition performance of the proposed method compared with the previous subword-level SLR method. The proposed method seems promising for implementing large-vocabulary portable SLR systems.
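
As an illustration of the component-level decision fusion this abstract describes, the sketch below sums, for each subword segment of a sentence, the log-likelihoods assigned by the hand shape, orientation, and movement classifiers and outputs the best-matching subword. The classifier interface (a log_likelihoods method) is an assumption for illustration, not the paper's API.

import numpy as np

def recognize_sentence(segments, shape_clf, orient_clf, move_clf, subwords):
    """Decision-level fusion over sign components for one segmented sentence.
    segments: list of per-segment feature dicts with keys 'shape', 'orient', 'move';
    each classifier returns a (n_subwords,) array of log-likelihoods aligned with
    the subwords vocabulary list."""
    recognized = []
    for seg in segments:
        total = (np.asarray(shape_clf.log_likelihoods(seg["shape"]))
                 + np.asarray(orient_clf.log_likelihoods(seg["orient"]))
                 + np.asarray(move_clf.log_likelihoods(seg["move"])))
        recognized.append(subwords[int(np.argmax(total))])
    return recognized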


International Conference on Medical Biometrics | 2008

Research on gesture definition and electrode placement in pattern recognition of hand gesture action sEMG

Xu Zhang; Xiang Chen; Zhangyan Zhao; Youqiang Tu; Ji-hai Yang; Vuokko Lantz; Kongqiao Wang

The goal of this study is to explore the effects of electrode placement on hand gesture pattern recognition performance. We conducted experiments with surface EMG sensors using two detecting electrode channels. In total, 25 different hand gestures and 10 different electrode positions for measuring muscle activities were evaluated. Based on the experimental results, dependencies between surface EMG signal detection positions and hand gesture recognition performance were analyzed and summarized as suggestions on how to define hand gestures and select suitable electrode positions for a myoelectric control system. This work provides useful insight for the development of medical rehabilitation systems based on EMG techniques.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2011

Interpreting sign components from accelerometer and sEMG data for automatic sign language recognition

Yun Li; Xiang Chen; Xu Zhang; Kongqiao Wang; Ji-hai Yang

The identification of the constituent components of each sign gesture is a practical way of establishing a large-vocabulary sign language recognition (SLR) system. Aiming at developing such a system using portable accelerometer (ACC) and surface electromyographic (sEMG) sensors, this work proposes a method for automatic SLR at the component level. Preliminary experimental results demonstrate the effectiveness of the proposed method and the feasibility of interpreting sign components from ACC and sEMG data. Our study improves the performance of SLR based on ACC and sEMG sensors and will promote the realization of a large-vocabulary portable SLR system.


International Conference on Bioinformatics and Biomedical Engineering | 2010

A Method of Hand Gesture Recognition Based on Multiple Sensors

Wei Fan; Xiang Chen; Wen-hui Wang; Xu Zhang; Ji-hai Yang; Vuokko Lantz; Kongqiao Wang

This paper presents a new method of gesture recognition based on a multiple-sensor fusion technique. Three kinds of sensors, namely surface electromyography (sEMG) sensors, a 3-axis accelerometer (ACC), and a camera, are first used together to capture the dynamic hand gesture. Then four types of features are extracted from the three kinds of sensory data to describe the static hand posture and the dynamic trajectory characteristics of the gesture. Finally, a decision-level multi-classifier fusion method is implemented for hand gesture pattern classification. Experimental results from 4 subjects demonstrate that each kind of sensor data has its advantages and disadvantages in representing hand gestures, and that the proposed method can effectively fuse the complementary information from these three types of sensors for dynamic hand gesture recognition.
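
One simple way to realize the decision-level multi-classifier fusion described in this abstract is to average the class scores of the sEMG, ACC, and camera classifiers with per-sensor weights, as sketched below; the weighting scheme is an assumption, not the paper's exact fusion rule.

import numpy as np

def fuse_decisions(p_emg, p_acc, p_cam, weights=(1.0, 1.0, 1.0)):
    """Decision-level fusion of three per-sensor classifiers: compute a weighted
    average of their class-posterior (or score) vectors and return the winning
    class index. p_emg, p_acc, p_cam are (n_classes,) arrays; the weights are
    illustrative and would normally be tuned on validation data."""
    w = np.asarray(weights, dtype=float)
    stacked = np.vstack([p_emg, p_acc, p_cam])
    fused = (w[:, None] * stacked).sum(axis=0) / w.sum()
    return int(np.argmax(fused))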


International Conference on Intelligent Computing | 2007

Study on online gesture sEMG recognition

Zhangyan Zhao; Xiang Chen; Xu Zhang; Ji-Hai Yang; Youqiang Tu; Vuokko Lantz; Kongqiao Wang

We have realized an online recognition platform for hand gestures using 2-channel surface EMG signals acquired from the forearm. Several features, such as the AMV, AMV ratio, and fourth-order AR model coefficients, are extracted from the sEMG signal, and the gesture segments are recognized with a weighted Euclidean distance classifier. A recognition rate above 90% has been achieved with a recognition time of only 400 µs. The methods developed in this study are intended to be applied in a fast-response sEMG control system and transplanted into an embedded microprocessor system.
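
A weighted Euclidean distance classifier of the kind mentioned in this abstract can be sketched as follows; the choice of inverse-variance weights suggested in the comment is an assumption for illustration, not a detail stated in the abstract.

import numpy as np

def weighted_euclidean_classify(feature_vec, class_means, class_weights):
    """Assign the gesture class whose template is nearest under per-feature weights.
    class_means and class_weights have shape (n_classes, n_features); a common
    choice for the weights is the inverse per-class feature variance estimated
    on training data (an assumption here)."""
    diffs = feature_vec[None, :] - class_means
    distances = np.sqrt((class_weights * diffs ** 2).sum(axis=1))
    return int(np.argmin(distances))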

Collaboration


Dive into Kongqiao Wang's collaborations.

Top Co-Authors

Xiang Chen, University of Science and Technology of China
Xu Zhang, University of Science and Technology of China
Ji-hai Yang, University of Science and Technology of China
Zhangyan Zhao, University of Science and Technology of China
Yun Li, University of Science and Technology of China
Wen-hui Wang, University of Science and Technology of China
Youqiang Tu, University of Science and Technology of China