Hybrid Vision Based Reach-to-Grasp Task Planning Method for Trans-Humeral Prostheses

dc.contributor.author: Madusanka, DGK
dc.contributor.author: Gopura, RARC
dc.contributor.author: Amarasinghe, YWR
dc.contributor.author: Mann, GKI
dc.date.accessioned: 2023-03-17T03:48:16Z
dc.date.available: 2023-03-17T03:48:16Z
dc.date.issued: 2017
dc.description.abstract: This paper proposes a hybrid vision-based reach-to-grasp task planning method for trans-humeral prostheses that exploits both vision and electromyography (EMG) signals. The hybrid method mainly consists of a 2-1/2D visual servoing module and an EMG-based module. The visual servoing aligns the object with the center of the palm while correcting its orientation. EMG signals extracted from the muscles remaining in the arm after amputation are used to control elbow flexion/extension (FE). While the 2-1/2D visual servoing module is active, the object-reaching algorithm changes the elbow FE angle to bring the palm towards the object of interest. Initially, the EMG-based module controls the elbow FE. Once an object is detected, the EMG signals from the arm muscles generate a reach request, which activates the visual servoing module to bring the palm towards the object. Since both the EMG-based module and the visual servoing module produce elbow FE angles while reaching towards an object, the two modules are integrated to obtain a resultant elbow FE angle. Experiments are conducted in a simulation environment and on a prosthesis to validate the proposed task planning method. The EMG-based module is capable of following the natural elbow FE motion, and the task planning method is capable of driving the prosthesis towards the object with proper orientation.
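
The abstract states that both modules produce elbow FE angles and that these are integrated into a resultant angle, but it does not give the fusion rule. The following is a minimal sketch assuming a simple weighted blend; the function name, the blend weight, and the switching on a reach request are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): blending an EMG-derived
# elbow flexion/extension (FE) angle with a visual-servoing-derived FE angle
# into one resultant command. All names and the weighting scheme are
# hypothetical assumptions for illustration.

def blend_elbow_fe(theta_emg_deg: float,
                   theta_vs_deg: float,
                   reach_requested: bool,
                   weight_vs: float = 0.7) -> float:
    """Return a resultant elbow FE angle in degrees.

    theta_emg_deg   -- elbow FE angle estimated by the EMG-based module
    theta_vs_deg    -- elbow FE angle commanded by the 2-1/2D visual servoing module
    reach_requested -- True once an object is detected and a reach request is issued
    weight_vs       -- hypothetical blend weight given to the visual servoing output
    """
    if not reach_requested:
        # Before a reach request, the EMG-based module alone drives elbow FE.
        return theta_emg_deg
    # After the reach request, combine both modules into one resultant angle.
    return weight_vs * theta_vs_deg + (1.0 - weight_vs) * theta_emg_deg


if __name__ == "__main__":
    # Example: EMG suggests 40 deg, visual servoing suggests 55 deg.
    print(blend_elbow_fe(40.0, 55.0, reach_requested=False))  # 40.0
    print(blend_elbow_fe(40.0, 55.0, reach_requested=True))   # 50.5
```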
dc.identifier.citation: Kanishka Madusanka, D. G., Gopura, R. A. R. C., Amarasinghe, Y. W. R., & Mann, G. K. I. (2017). Hybrid Vision Based Reach-to-Grasp Task Planning Method for Trans-Humeral Prostheses. IEEE Access, 5, 16149–16161. https://doi.org/10.1109/ACCESS.2017.2727502
dc.identifier.database: IEEE Xplore
dc.identifier.doi: 10.1109/ACCESS.2017.2727502
dc.identifier.issn: 2169-3536 (Online)
dc.identifier.journal: IEEE Access
dc.identifier.pgnos: 16149-16161
dc.identifier.uri: http://dl.lib.uom.lk/handle/123/20755
dc.identifier.volume: 5
dc.identifier.year: 2017
dc.language.iso: en
dc.publisher: IEEE
dc.subject: Prosthesis
dc.subject: Electromyography
dc.subject: 2-1/2D visual servoing
dc.title: Hybrid Vision Based Reach-to-Grasp Task Planning Method for Trans-Humeral Prostheses
dc.type: Article-Full-text