Development of a real-time grasping pattern classification system by fusing EMG-vision for hand prostheses

dc.contributor.advisorPunchihewa HKG
dc.contributor.advisorMadusanka DGK
dc.contributor.authorPerera GDM
dc.date.accept2021
dc.date.accessioned2021
dc.date.available2021
dc.date.issued2021
dc.description.abstractElectromyography (EMG) based trans-radial prostheses have revolutionized the prosthetic industry due to their ability to control the robotic hand using human intention. Although recently developed EMG-based prosthetic hands can classify a significant number of wrist motions, classifying grasping patterns in real-time is challenging. However, wrist motions alone cannot enable a prosthetic hand to grasp objects properly without performing an appropriate grasping pattern. The combination of EMG and vision has addressed this problem to a certain extent; however, such systems have not achieved significant performance in real-time. This study proposed a vision-EMG fusion method that improves the real-time prediction accuracy of the EMG classification system by merging a probability matrix representing the usage of six grasping patterns for the targeted object. The You Only Look Once (YOLO) object detection algorithm was utilized to retrieve the probability matrix of the identified object, which was then used to correct the classification error of the EMG classification system by applying Bayesian fusion. Experiments were carried out to collect EMG data from six muscles of 15 subjects during grasping actions for classifier development. In addition, an online survey was conducted to collect data to calculate the respective conditional probability matrix for the selected objects. Finally, five optimized supervised learning EMG classifiers, Artificial Neural Network (ANN), K-Nearest Neighbor (KNN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), and Decision Tree (DT), were compared to select the best classifier for fusion. The real-time experiment results revealed that the ANN outperformed the other selected classifiers by achieving the highest mean True Positive Rate (mTPR) of M = 72.86% (SD = 17.89%) across all six grasping patterns.
Furthermore, the feature set identified in the experiment (age, gender, and handedness of the user) was shown to increase the mTPR of the ANN by M = 16.05% (SD = 2.70%). The proposed system takes M = 393.89 ms (SD = 178.23 ms) to produce a prediction; therefore, the user did not perceive a delay between intention and execution. Furthermore, the proposed system enabled the user to apply multiple suitable grasping patterns to a single object, as in real life. In future research, the functionalities of the system should be expanded to include wrist motions, and the system should be evaluated on amputees.en_US
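The abstract describes correcting the EMG classifier's output with a vision-derived conditional probability matrix via Bayesian fusion. The thesis itself does not publish code here, but the core idea can be sketched as an elementwise product of the EMG posterior and the vision prior, renormalized; the grasp-pattern names and example probabilities below are purely illustrative assumptions, not values from the study.

```python
import numpy as np

# Hypothetical labels for the six grasping patterns (not taken from the thesis).
GRASPS = ["cylindrical", "spherical", "tripod", "lateral", "hook", "pinch"]

def bayesian_fusion(emg_posterior, vision_prior):
    """Fuse EMG class probabilities with a vision-based prior.

    Elementwise product of the two distributions, renormalized so the
    fused vector is again a valid probability distribution.
    """
    fused = np.asarray(emg_posterior, dtype=float) * np.asarray(vision_prior, dtype=float)
    return fused / fused.sum()

# Illustrative example: the EMG classifier is uncertain, but the detected
# object (e.g. a bottle) strongly suggests a cylindrical grasp.
emg = np.array([0.30, 0.25, 0.20, 0.10, 0.10, 0.05])
vision = np.array([0.60, 0.05, 0.20, 0.05, 0.05, 0.05])
fused = bayesian_fusion(emg, vision)
print(GRASPS[int(np.argmax(fused))])  # the vision prior tips the decision
```

In a setup like this, a weak EMG preference can be overridden or reinforced by the object-conditional prior, which is how the fusion reduces real-time classification error.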
dc.identifier.accnoTH5086en_US
dc.identifier.citationPerera, G.D.M. (2021). Development of a real-time grasping pattern classification system by fusing EMG-vision for hand prostheses [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. http://dl.lib.uom.lk/handle/123/22460
dc.identifier.degreeMSc in Mechanical Engineering by researchen_US
dc.identifier.departmentDepartment of Mechanical Engineeringen_US
dc.identifier.facultyEngineeringen_US
dc.identifier.urihttp://dl.lib.uom.lk/handle/123/22460
dc.language.isoenen_US
dc.subjectSURFACE ELECTROMYOGRAPHYen_US
dc.subjectREAL-TIME CLASSIFICATIONen_US
dc.subjectGRASPING PATTERNen_US
dc.subjectSENSOR FUSIONen_US
dc.subjectVISION FEEDBACKen_US
dc.subjectMECHANICAL ENGINEERING- Dissertationen_US
dc.titleDevelopment of a real-time grasping pattern classification system by fusing EMG-vision for hand prosthesesen_US
dc.typeThesis-Abstracten_US