Machine Learning for Gesture Recognition with Electromyography
About 70 million deaf people use sign language as their first language or mother tongue, but the lack of a common language between deaf and hearing individuals makes everyday communication difficult. This thesis explores the potential of using electromyography to improve communication for deaf people. The Myo armband, developed by Thalmic Labs, is a wearable gesture and motion control device that uses a set of electromyographic sensors, combined with a gyroscope, accelerometer, and magnetometer, to detect movements and gestures. The thesis presents a prototype-level system that uses the Myo armband's electromyographic sensors to detect sign language signs and translate them into something intelligible to hearing individuals. Building on previous work and the associated framework for gesture recognition using the Inertial Measurement Unit (IMU) and electromyography (EMG) sensors of the Myo armband, this thesis focuses on EMG feature extraction and on machine learning for gesture classification. The proposed gesture recognition framework achieved an accuracy of 85% on 10 different gestures.
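The pipeline the abstract describes (EMG feature extraction followed by machine-learning classification) can be sketched as below. This is a minimal illustration under stated assumptions, not the thesis's actual implementation: it assumes 8-channel EMG windows (the Myo armband provides 8 EMG sensors), computes four commonly used time-domain EMG features per channel (mean absolute value, root mean square, waveform length, zero crossings), and classifies the resulting feature vectors with a simple nearest-centroid rule; the features and classifier used in the thesis may differ.

```python
import numpy as np

def emg_features(window):
    """Extract time-domain features from one EMG window.

    window: array of shape (n_samples, n_channels), raw EMG.
    Returns a 1-D feature vector of length 4 * n_channels.
    """
    mav = np.mean(np.abs(window), axis=0)                   # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))             # root mean square
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)    # waveform length
    sign = np.signbit(window)                               # zero crossings:
    zc = np.sum(sign[1:] != sign[:-1], axis=0)              # sign-change count
    return np.concatenate([mav, rms, wl, zc.astype(float)])

def fit_centroids(X, y):
    """Train a nearest-centroid classifier: one mean vector per gesture label."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def predict(X, classes, centroids):
    """Assign each feature vector to the gesture with the closest centroid."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]
```

As a usage sketch, each incoming window of 8-channel Myo EMG data would be passed through `emg_features`, and the resulting 32-dimensional vector classified by `predict`; in practice a stronger classifier (e.g. an SVM or a neural network) would typically replace the nearest-centroid rule.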