|Title:||Real-time static gesture detection using machine learning|
|Keywords:||Sign gestures;image processing;machine learning;convolutional neural network|
|Abstract:||Sign gesture recognition is an important problem in human-computer interaction with significant societal impact. It is also a difficult task, since sign gestures are naturally deformable objects. For the past two decades, gesture recognition has faced unsolved problems such as low accuracy and low speed, and despite many proposed methods, no fully satisfactory solution has emerged. In this thesis, we propose a machine learning approach to translating sign gesture language into text. We introduce a self-generated image dataset for American Sign Language (ASL) covering 36 characters: the alphabet A to Z and the digits 0 to 9. The proposed system recognizes static gestures and can learn and classify the specific sign gestures of any person. We use a convolutional neural network (CNN) to classify the images into text, achieving 99.00% accuracy on the alphabet gestures and 100% accuracy on the digits.|
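The abstract summarizes a CNN classifier over 36 static gesture classes. As a rough illustration only: the record above does not publish the thesis architecture, so the layer shapes, the 64x64 input resolution, and the random (untrained) weights below are all placeholder assumptions, sketched with plain NumPy rather than the authors' actual framework.

```python
import numpy as np

NUM_CLASSES = 36  # letters A-Z plus digits 0-9, as stated in the abstract

rng = np.random.default_rng(0)

def conv2d(image, kernels):
    """Valid 2-D convolution of one grayscale image with a bank of kernels."""
    kh, kw = kernels.shape[1:]
    h, w = image.shape
    out = np.empty((kernels.shape[0], h - kh + 1, w - kw + 1))
    for i in range(out.shape[1]):
        for j in range(out.shape[2]):
            patch = image[i:i + kh, j:j + kw]
            out[:, i, j] = np.tensordot(kernels, patch, axes=((1, 2), (0, 1)))
    return out

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def classify(image, kernels, w_dense, b_dense):
    feat = np.maximum(conv2d(image, kernels), 0.0)  # convolution + ReLU
    pooled = feat.mean(axis=(1, 2))                 # global average pooling
    return softmax(pooled @ w_dense + b_dense)      # 36-way softmax output

# Dummy 64x64 grayscale gesture frame and random, untrained weights
# (a real system would train these on the ASL image dataset).
image = rng.random((64, 64))
kernels = rng.standard_normal((8, 3, 3))
w_dense = rng.standard_normal((8, NUM_CLASSES))
b_dense = np.zeros(NUM_CLASSES)

probs = classify(image, kernels, w_dense, b_dense)
```

The forward pass returns a probability over the 36 classes; with trained weights, the arg-max index would map back to a letter or digit label.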
|Appears in Collections:||Computational Sciences - Master's theses|
Items in LU|ZONE|UL are protected by copyright, with all rights reserved, unless otherwise indicated.