Show simple item record

dc.contributor.advisor: Tufte, Gunnar
dc.contributor.advisor: Nichele, Stefano
dc.contributor.author: Chau, Tony
dc.date.accessioned: 2017-10-09T14:00:25Z
dc.date.available: 2017-10-09T14:00:25Z
dc.date.created: 2017-06-12
dc.date.issued: 2017
dc.identifier: ntnudaim:16780
dc.identifier.uri: http://hdl.handle.net/11250/2459262
dc.description.abstract: About 70 million deaf people use sign language as their first language or mother tongue, but the lack of a common language between deaf and hearing individuals makes everyday communication difficult. This thesis explores the potential of electromyography to improve communication for deaf people. The Myo armband, developed by Thalmic Labs, is a wearable gesture and motion control device that uses a set of electromyographic sensors, combined with a gyroscope, an accelerometer and a magnetometer, to detect movements and gestures. This thesis presents the development of a prototype-level system that utilizes the Myo armband's electromyographic sensors to detect sign language signs and translate them into something intelligible to hearing individuals. Building on previous work and the associated framework for gesture recognition using the Inertial Measurement Unit (IMU) and electromyography (EMG) sensors of the Myo armband, this thesis focuses on EMG feature extraction and machine learning for gesture classification. The proposed gesture recognition framework achieved an accuracy of 85% on 10 different gestures.
dc.language: eng
dc.publisher: NTNU
dc.subject: Datateknologi, Komplekse datasystemer
dc.title: Machine Learning for Gesture Recognition with Electromyography
dc.type: Master thesis
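
The abstract describes a pipeline of EMG feature extraction followed by machine-learning classification of gestures. The sketch below is purely illustrative and is not the framework developed in the thesis: it assumes hypothetical pre-segmented windows of 8-channel EMG data (the Myo armband has 8 EMG sensors), computes common time-domain EMG features, and uses scikit-learn's SVC as a stand-in classifier.

# Illustrative sketch of time-domain EMG feature extraction and gesture
# classification (assumptions noted above; not the thesis's actual framework).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def extract_features(window: np.ndarray) -> np.ndarray:
    """Compute common time-domain EMG features per channel for one
    window of shape (n_samples, n_channels)."""
    mav = np.mean(np.abs(window), axis=0)                        # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))                  # root mean square
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)         # waveform length
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings
    return np.concatenate([mav, rms, wl, zc])


# Hypothetical data: 500 windows of 8-channel EMG, 200 samples each,
# labelled with one of 10 gestures (random placeholders, not real recordings).
rng = np.random.default_rng(0)
windows = rng.standard_normal((500, 200, 8))
labels = rng.integers(0, 10, size=500)

X = np.array([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0)

# Stand-in classifier; the thesis evaluates its own choice of model.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.2f}")
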


Associated file(s)


This item appears in the following collection(s)
