
Prosthetic hand based on human hand anatomy controlled by surface electromyography and artificial neural network


dc.contributor.author DUNAI, Larisa
dc.contributor.author VERDÚ, Isabel Seguí
dc.contributor.author TURCANU, Dinu
dc.contributor.author BOSTAN, Viorel
dc.date.accessioned 2025-07-21T10:29:12Z
dc.date.available 2025-07-21T10:29:12Z
dc.date.issued 2025
dc.identifier.citation DUNAI, Larisa; VERDÚ, Isabel Seguí; TURCANU, Dinu; BOSTAN, Viorel. Prosthetic hand based on human hand anatomy controlled by surface electromyography and artificial neural network. Technologies. 2025, vol. 13, no. 1, art. no. 21. ISSN 2227-7080. en_US
dc.identifier.issn 2227-7080
dc.identifier.uri https://doi.org/10.3390/technologies13010021
dc.identifier.uri https://repository.utm.md/handle/5014/32899
dc.description Access full text: https://doi.org/10.3390/technologies13010021 en_US
dc.description.abstract Humans have a complex way of expressing their intuitive intentions through real gestures, which is why many gesture detection and recognition techniques have been studied and developed. There are many methods for reading human hand signals, such as those using electroencephalography, electrocorticography, and electromyography, as well as methods for gesture recognition. In this paper, we present a method for real-time hand gesture recognition based on surface electromyography (sEMG) using a multilayer neural network. For this purpose, the sEMG signals have been amplified, filtered, and sampled; then, the data have been segmented, features have been extracted, and each gesture has been classified. To validate the method, 100 signals for three gestures, with 64 samples per signal, were recorded from 2 users with OYMotion sensors, and 100 signals for three gestures were recorded from 4 users with MyoWare sensors. These signals were used for feature extraction and classification with an artificial neural network. The model converges after 10 sessions, achieving 98% accuracy. As a result, an algorithm was developed to recognize two specific gestures (handling a bottle and pointing with the index finger) in real time with 95% accuracy. en_US
dc.language.iso en en_US
dc.publisher Multidisciplinary Digital Publishing Institute (MDPI) en_US
dc.rights Attribution-NonCommercial-NoDerivs 3.0 United States
dc.rights.uri http://creativecommons.org/licenses/by-nc-nd/3.0/us/
dc.subject prosthetic hand en_US
dc.subject bionic hand en_US
dc.subject gesture recognition en_US
dc.subject feature extraction en_US
dc.subject classification en_US
dc.title Prosthetic hand based on human hand anatomy controlled by surface electromyography and artificial neural network en_US
dc.type Article en_US
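
The abstract above outlines a pipeline of amplification, filtering, sampling, segmentation, feature extraction, and classification of sEMG windows with a multilayer neural network. The sketch below is a minimal illustration of that kind of pipeline, not the authors' code: it assumes common time-domain sEMG features (mean absolute value, RMS, zero crossings, waveform length) and a small scikit-learn MLP, and it substitutes random placeholder data for the recorded OYMotion/MyoWare signals; only the 64-sample window length and the three gesture classes are taken from the abstract.

```python
# Minimal sketch (not the authors' implementation): window segmentation,
# time-domain feature extraction, and MLP classification of sEMG signals.
# The 64-sample window and three gesture classes come from the abstract;
# the feature set and network size are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

WINDOW = 64       # samples per segmented sEMG window (from the abstract)
N_GESTURES = 3    # gesture classes per user (from the abstract)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Common time-domain sEMG features for one window (assumed, not from the paper)."""
    mav = np.mean(np.abs(window))                              # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))                        # root mean square
    zc = np.sum(np.diff(np.signbit(window).astype(int)) != 0)  # zero crossings
    wl = np.sum(np.abs(np.diff(window)))                       # waveform length
    return np.array([mav, rms, zc, wl])

# Placeholder data standing in for the recorded signals:
# 100 windows per gesture, each WINDOW samples long.
rng = np.random.default_rng(seed=0)
windows = rng.standard_normal((100 * N_GESTURES, WINDOW))
labels = np.repeat(np.arange(N_GESTURES), 100)

X = np.vstack([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, stratify=labels, random_state=0)

# Multilayer perceptron classifier; the hidden-layer size is an assumption.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

In a real-time setting, the trained classifier would be applied to successive 64-sample windows streamed from the amplified and filtered sEMG channels to drive the prosthetic hand.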

