Please use this identifier to cite or link to this item: https://zone.biblio.laurentian.ca/handle/10219/3468
Title: Real-time static gesture detection using machine learning
Authors: Goswami, Sandipgiri
Keywords: Sign gestures; image processing; machine learning; convolutional neural network
Issue Date: 24-Apr-2019
Abstract: Sign gesture recognition is an important problem in human-computer interaction with significant societal impact. However, it is a very complicated task, since sign gestures are naturally deformable objects. Gesture recognition has presented unsolved problems for the last two decades, such as low accuracy or low speed, and despite many proposed methods, no fully satisfactory solution has been found. In this thesis, we propose a machine learning approach for translating sign gesture language into text. We introduce a self-generated image dataset for American Sign Language (ASL) covering 36 characters: the letters A to Z and the digits 0 to 9. The proposed system recognizes static gestures and can learn and classify the specific sign gestures of any person. We use a convolutional neural network (CNN) to classify the images into text. An accuracy of 99.00% was achieved on the alphabet gestures and 100% on the digit gestures.
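Illustrative example: the abstract describes a CNN that maps static gesture images to one of 36 classes (A-Z and 0-9). The following is a minimal sketch of such a classifier, assuming a Keras-style model, 64x64 grayscale inputs, and illustrative layer sizes; it is not the exact architecture or training setup used in the thesis.

# Minimal sketch of a CNN classifier for 36 static ASL gesture classes (A-Z, 0-9).
# Input resolution, layer sizes, and training settings are assumptions for illustration.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 36  # 26 letters + 10 digits

def build_model(input_shape=(64, 64, 1)):
    # Two convolution/pooling stages followed by a small dense head.
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=input_shape),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Usage sketch (assumes x_train of shape (N, 64, 64, 1) and integer labels y_train in [0, 35]):
# model = build_model()
# model.fit(x_train, y_train, epochs=10, validation_split=0.1)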
URI: https://zone.biblio.laurentian.ca/handle/10219/3468
Appears in Collections: Computational Sciences - Master's theses
Master's Theses

Files in This Item:
File: Thesis-Sandip Goswami.pdf (1.72 MB, Adobe PDF)


Items in LU|ZONE|UL are protected by copyright, with all rights reserved, unless otherwise indicated.