Exploring Emotion Recognition of Students in Virtual Reality Classrooms Through Convolutional Neural Networks and Transfer Learning Techniques
dc.contributor.advisor | Zhao, Richard | |
dc.contributor.author | Shomoye, Michael Abidemi | |
dc.contributor.committeemember | Maleki, Farhad | |
dc.contributor.committeemember | Alim, Usman | |
dc.date | 2024-06 | |
dc.date.accessioned | 2024-01-19T15:51:15Z | |
dc.date.available | 2024-01-19T15:51:15Z | |
dc.date.issued | 2024-01-15 | |
dc.description.abstract | In contemporary educational settings, understanding and assessing student engagement through non-verbal cues, especially facial expressions, is pivotal. Such cues have long informed educators about students' cognitive and emotional states, helping them tailor their teaching methods. However, the rise of online learning platforms and advanced technologies such as Virtual Reality (VR) challenges conventional ways of gauging student engagement, especially when certain facial features become obscured or are entirely absent. This thesis explores the potential of Convolutional Neural Networks (CNNs), specifically a custom model adapted from the Residual Neural Network 50 (ResNet50) architecture through transfer learning, to recognize and distinguish subtle facial expressions in real time, such as neutrality, boredom, happiness, and confusion. The novelty of our approach is twofold: first, we harness the power of CNNs to analyze facial expressions in digital learning platforms; second, we address the occlusion challenges posed by VR headsets by focusing on the lower half of the face. Through comprehensive experimentation, we compare our model's performance with the default ResNet50 and evaluate both against full-face and VR-occluded face datasets. Ultimately, our work aims to provide educators with a sophisticated tool for real-time evaluation of student engagement in technologically advanced learning environments, thereby enriching the teaching and learning experience. | |
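The abstract describes two technical ingredients: transfer learning from a pretrained ResNet50 backbone to a small set of expression classes, and handling VR-headset occlusion by relying only on the lower half of the face. The sketch below illustrates one plausible way to set this up in PyTorch; it is not the thesis's actual implementation. The four class names, the 224x224 input size, the frozen backbone, and the "zero out the top half" masking strategy are all illustrative assumptions.

```python
# Hedged sketch: ImageNet-pretrained ResNet50 with a replaced 4-class head,
# plus a simple transform that hides the upper half of each face to
# approximate VR-headset occlusion. Class names, image size, and the
# masking scheme are assumptions, not the thesis's exact configuration.
import torch
import torch.nn as nn
from torchvision import models, transforms

EMOTIONS = ["neutral", "bored", "happy", "confused"]  # assumed label set

def build_model(num_classes: int = len(EMOTIONS)) -> nn.Module:
    """ResNet50 backbone with its classification head swapped for our classes."""
    backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    for param in backbone.parameters():      # freeze the pretrained layers
        param.requires_grad = False
    backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)  # new head
    return backbone

def occlude_upper_half(img: torch.Tensor) -> torch.Tensor:
    """Zero out the top half of a CxHxW face tensor to mimic a VR headset."""
    out = img.clone()
    out[:, : img.shape[1] // 2, :] = 0.0
    return out

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),            # standard ResNet50 input size
    transforms.ToTensor(),
    transforms.Lambda(occlude_upper_half),    # simulate headset occlusion
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

if __name__ == "__main__":
    model = build_model()
    dummy = torch.randn(1, 3, 224, 224)       # stand-in for one face image
    print(model(dummy).shape)                  # -> torch.Size([1, 4])
```

Under this setup, only the new fully connected layer is trained on the expression dataset, which is the usual transfer-learning recipe when labeled data are limited; the same skeleton would apply to the VGG19 and MobileNet backbones listed among the subject keywords by swapping the `models.resnet50` call and the name of the final layer.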
dc.identifier.citation | Shomoye, M. A. (2024). Exploring emotion recognition of students in Virtual Reality classrooms through Convolutional Neural Networks and transfer learning techniques (Master's thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca. | |
dc.identifier.uri | https://hdl.handle.net/1880/117972 | |
dc.identifier.uri | https://doi.org/10.11575/PRISM/42816 | |
dc.language.iso | en | |
dc.publisher.faculty | Graduate Studies | |
dc.publisher.institution | University of Calgary | |
dc.rights | University of Calgary graduate students retain copyright ownership and moral rights for their thesis. You may use this material in any way that is permitted by the Copyright Act or through licensing that has been assigned to the document. For uses that are not allowable under copyright legislation or licensing, you are required to seek permission. | |
dc.subject | emotion recognition | |
dc.subject | facial expression | |
dc.subject | VR | |
dc.subject | virtual reality | |
dc.subject | virtual learning environment | |
dc.subject | machine learning | |
dc.subject | deep learning | |
dc.subject | CNN | |
dc.subject | Convolutional Neural Networks | |
dc.subject | transfer learning | |
dc.subject | ResNet50 | |
dc.subject | VGG19 | |
dc.subject | MobileNet | |
dc.subject.classification | Computer Science | |
dc.subject.classification | Artificial Intelligence | |
dc.title | Exploring Emotion Recognition of Students in Virtual Reality Classrooms Through Convolutional Neural Networks and Transfer Learning Techniques | |
dc.type | master thesis | |
thesis.degree.discipline | Computer Science | |
thesis.degree.grantor | University of Calgary | |
thesis.degree.name | Master of Science (MSc) | |
ucalgary.thesis.accesssetbystudent | I do not require a thesis withhold – my thesis will have open access and can be viewed and downloaded publicly as soon as possible. |