Bi-Modal Deep Neural Network for Gait Emotion Recognition

dc.contributor.advisor: Gavrilova, Marina
dc.contributor.author: Bhatia, Yajurv
dc.contributor.committeemember: Jacobson, Michael
dc.contributor.committeemember: Runions, Adam
dc.date: 2023-02
dc.date.accessioned: 2022-11-25T20:31:36Z
dc.date.available: 2022-11-25T20:31:36Z
dc.date.issued: 2022-11-23
dc.description.abstract: Emotion recognition systems can support autonomous applications such as video gaming, medical diagnosis, adaptive education, and smart homes. Several biometric modalities, including the face, hands, and voice, have been successfully used for emotion recognition tasks. Gait Emotion Recognition (GER) is an emerging domain of research focused on identifying the emotional state of a person from the gait biometric, which represents the person's manner of walking. In comparison to the other modalities, gait provides a non-intrusive method to collect data remotely without an expert's supervision. Moreover, unlike facial expression-based emotion recognition, it does not require high-resolution data for inference. Early works in GER produced limited feature sets and used classical machine learning methodologies to infer emotions, but could not achieve high performance. This thesis proposes powerful deep learning architectures to accurately identify emotions from human gaits. The proposed Bi-Modal Deep Neural Network (BMDNN) architecture utilizes robust handcrafted features that are independent of dataset size and data distribution. The network is based on Long Short-Term Memory units and Multi-Layered Perceptrons to sequentially process raw gait sequences and facilitate feature fusion with the handcrafted features. Lastly, the proposed Bi-Modular Sequential Neural Network (BMSNN) has a low number of parameters and a low inference time, making it suitable for deployment in real-world applications. The proposed methodologies were evaluated on the Edinburgh Locomotive MoCap Dataset and outperformed all recent state-of-the-art methods.
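The abstract describes a two-branch design: LSTM units process the raw gait sequence while an MLP processes handcrafted features, and the two representations are fused before emotion classification. The following is a minimal PyTorch sketch of such a bi-modal fusion network for illustration only; the layer sizes, joint dimensionality, handcrafted feature dimensionality, and number of emotion classes are placeholder assumptions, not the thesis's actual BMDNN configuration.

```python
import torch
import torch.nn as nn

class BiModalGaitNet(nn.Module):
    """Illustrative two-branch gait emotion network: an LSTM over raw joint
    sequences and an MLP over handcrafted features, fused before the
    classifier. All dimensions below are assumed placeholders."""

    def __init__(self, joint_dim=63, handcrafted_dim=30, hidden=128, n_emotions=4):
        super().__init__()
        # Sequential branch: processes the raw gait sequence frame by frame
        self.lstm = nn.LSTM(input_size=joint_dim, hidden_size=hidden, batch_first=True)
        # Handcrafted branch: processes a fixed-length descriptor vector
        self.mlp = nn.Sequential(
            nn.Linear(handcrafted_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Fusion of both branches followed by the emotion classifier
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_emotions),
        )

    def forward(self, gait_seq, handcrafted):
        # gait_seq: (batch, frames, joint_dim); handcrafted: (batch, handcrafted_dim)
        _, (h_n, _) = self.lstm(gait_seq)                 # final hidden state summarizes the walk
        fused = torch.cat([h_n[-1], self.mlp(handcrafted)], dim=1)  # feature fusion
        return self.classifier(fused)                     # emotion logits

# Example: 16 walks of 120 frames (21 joints x 3 coordinates), 4 emotion classes
logits = BiModalGaitNet()(torch.randn(16, 120, 63), torch.randn(16, 30))
print(logits.shape)  # torch.Size([16, 4])
```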
dc.identifier.citation: Bhatia, Y. (2022). Bi-Modal Deep Neural Network for Gait Emotion Recognition (Master's thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca
dc.identifier.uri: http://hdl.handle.net/1880/115541
dc.identifier.uri: https://dx.doi.org/10.11575/PRISM/40498
dc.language.iso: eng
dc.publisher.faculty: Science
dc.publisher.institution: University of Calgary
dc.rights: University of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission.
dc.subject: Deep Learning
dc.subject: Cognitive Systems
dc.subject: Long Short-Term Memory (LSTM)
dc.subject: Affective Computing
dc.subject: Emotion Recognition
dc.subject: Situation Awareness
dc.subject: Gait
dc.subject: Remote Visual Technology
dc.subject: Motion Capture Sensor
dc.subject: Human Motion
dc.subject: Handcrafted Features
dc.subject: Feature Fusion
dc.subject.classification: Education--Sciences
dc.subject.classification: Artificial Intelligence
dc.subject.classification: Computer Science
dc.subject.classification: Robotics
dc.subject.classification: Psychology--Behavioral
dc.subject.classification: Psychology--Physiological
dc.title: Bi-Modal Deep Neural Network for Gait Emotion Recognition
dc.type: master thesis
thesis.degree.discipline: Computer Science
thesis.degree.grantor: University of Calgary
thesis.degree.name: Master of Science (MSc)
ucalgary.item.requestcopy: true
Files
Original bundle
  Name: ucalgary_2022_bhatia_yajurv.pdf
  Size: 7.56 MB
  Format: Adobe Portable Document Format
  Description: Main article
License bundle
  Name: license.txt
  Size: 2.62 KB
  Description: Item-specific license agreed upon at submission