Please use this identifier to cite or link to this item: https://zone.biblio.laurentian.ca/handle/10219/3468
Full metadata record
DC Field                  Value                                                  Language
dc.contributor.author     Goswami, Sandipgiri                                    -
dc.date.accessioned       2020-04-02T13:35:50Z                                   -
dc.date.available         2020-04-02T13:35:50Z                                   -
dc.date.issued            2019-04-24                                             -
dc.identifier.uri         https://zone.biblio.laurentian.ca/handle/10219/3468    -
dc.description.abstract   Sign gesture recognition is an important problem in human-computer interaction with significant societal impact. It is, however, a very challenging task, since sign gestures are naturally deformable objects. Gesture recognition has posed unsolved problems for the last two decades, such as low accuracy or low speed, and despite many proposed methods, no complete solution to these problems has been found. In this thesis, we propose a machine learning approach to translating sign gesture language into text. We introduce a self-generated image dataset for American Sign Language (ASL) covering 36 characters: the alphabet A to Z and the numeric digits 0 to 9. The proposed system recognizes static gestures and can learn and classify the specific sign gestures of any person. We use a convolutional neural network (CNN) to classify the gesture images into text. An accuracy of 99.00% was achieved on the alphabet gestures and 100% on the digits.    en_US
dc.language.iso           en                                                     en_US
dc.subject                Sign gestures                                          en_US
dc.subject                image processing                                       en_US
dc.subject                machine learning                                       en_US
dc.subject                convolutional neural network                           en_US
dc.title                  Real-time static gesture detection using machine learning    en_US
dc.type                   Thesis                                                 en_US
dc.description.degree     Master of Science (MSc) in Computational Sciences      en_US
dc.publisher.grantor      Laurentian University of Sudbury                       en_US
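
The abstract describes a CNN that classifies static gesture images into 36 classes (A to Z and 0 to 9). Below is a minimal sketch of such a classifier, assuming a TensorFlow/Keras setup; the input resolution, layer widths, optimizer, and the hypothetical data path "data/asl" are illustrative assumptions, not the configuration actually used in the thesis.

    # Minimal sketch of a 36-class static-gesture CNN classifier (A-Z + 0-9).
    # All architecture choices here are assumptions for illustration only.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    NUM_CLASSES = 36          # 26 letters + 10 digits
    IMG_SIZE = (64, 64)       # assumed input resolution

    def build_model():
        model = models.Sequential([
            layers.Input(shape=(*IMG_SIZE, 1)),        # grayscale gesture image
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(64, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(128, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(128, activation="relu"),
            layers.Dropout(0.5),
            layers.Dense(NUM_CLASSES, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    if __name__ == "__main__":
        model = build_model()
        model.summary()
        # Training would load the self-collected gesture images, e.g. from a
        # directory of per-class folders (hypothetical path "data/asl"):
        # train_ds = tf.keras.utils.image_dataset_from_directory(
        #     "data/asl", color_mode="grayscale", image_size=IMG_SIZE)
        # model.fit(train_ds, epochs=20)
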
Appears in Collections:Computational Sciences - Master's theses
Master's Theses

Files in This Item:
File                         Description    Size       Format
Thesis-Sandip Goswami.pdf                   1.72 MB    Adobe PDF


Items in LU|ZONE|UL are protected by copyright, with all rights reserved, unless otherwise indicated.