2021 Moratuwa Engineering Research Conference (MERCon)

Vision-EMG Fusion Method for Real-time Grasping Pattern Classification System

 
 

Abstract


Although recently developed electromyography-based (EMG) prosthetic hands can classify a considerable number of wrist motions, classifying 5-6 grasping patterns in real time remains challenging. Combining EMG and vision has addressed this problem to some extent but has not achieved significant real-time performance. In this paper, we propose a fusion method that improves the real-time prediction accuracy of the EMG system by merging a probability matrix representing the usage of the six grasping patterns for the targeted object. The YOLO object detection system retrieves the probability matrix for the identified object, which is then used to correct errors in the EMG classification system. The experiments revealed that the optimized ANN model outperformed KNN, LDA, NB, and DT, achieving the highest mean true positive rate (mTPR) of 69.34% (21.54) in real time across all six grasping patterns. Furthermore, incorporating the proposed feature set (the user's age, gender, and handedness) increased the mTPR of the ANN by 16.05% (2.70). The proposed system takes 393.89 ms (178.23 ms) to produce a prediction, so the user does not perceive a delay between intention and execution. The system also allows users to apply multiple grasping patterns to an object.
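The abstract does not specify the exact fusion rule, only that a vision-derived probability matrix corrects the EMG classifier's output. As a minimal illustrative sketch (not the authors' implementation), assuming the vision branch supplies a per-object prior over the six grasps and the two probability vectors are combined by a weighted geometric mean, late fusion could look like this; the grasp labels, `fuse` function, and `alpha` weight are all hypothetical:

```python
import numpy as np

# Illustrative grasp labels; the paper's exact six patterns may differ.
GRASPS = ["power", "precision", "tripod", "lateral", "hook", "spherical"]

def fuse(emg_probs, vision_prior, alpha=0.5):
    """Weighted geometric-mean fusion of EMG classifier probabilities
    with a vision-derived grasp prior, followed by renormalization.
    alpha weights the EMG branch; (1 - alpha) weights the vision prior."""
    emg = np.asarray(emg_probs, dtype=float)
    prior = np.asarray(vision_prior, dtype=float)
    fused = (emg ** alpha) * (prior ** (1.0 - alpha))
    return fused / fused.sum()

# Example: the EMG model is ambiguous between two grasps, while the
# detected object (say, a mug) strongly favors a power grasp.
emg_probs    = [0.35, 0.30, 0.10, 0.10, 0.10, 0.05]
vision_prior = [0.60, 0.05, 0.05, 0.10, 0.15, 0.05]

fused = fuse(emg_probs, vision_prior)
print(GRASPS[int(np.argmax(fused))])  # → power
```

Here the object prior resolves the ambiguity left by the EMG branch, which mirrors the paper's idea of using the detected object to correct EMG classification errors.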

Pages 585-590
DOI 10.1109/MERCon52712.2021.9525702
Language English
Journal 2021 Moratuwa Engineering Research Conference (MERCon)
