Graduation Year

2018

Document Type

Thesis

Degree

M.S.

Degree Name

Master of Science (M.S.)

Degree Granting Department

Computer Science and Engineering

Major Professor

Shaun Canavan, Ph.D.

Committee Member

Paul A. Rosen, Ph.D.

Committee Member

Marvin Andujar, Ph.D.

Keywords

Affective Computing, BP4D+, DEAP, Deep Learning

Abstract

Classification of emotions plays an important role in affective computing and has real-world applications in fields as diverse as entertainment, medicine, defense, retail, and education. These applications include video games, virtual reality, pain recognition, lie detection, classification of Autism Spectrum Disorder (ASD), analysis of stress levels, and determination of attention levels. This vast range of applications motivated us to study automatic emotion recognition, which can be performed using facial expressions, speech, and physiological data.

A person’s physiological signals, such as heart rate and blood pressure, are deeply linked to their emotional state and can be used to identify a variety of emotions; however, they are explored less frequently for emotion recognition than audiovisual signals such as facial expression and voice. In this thesis, we investigate a multimodal approach to emotion recognition using physiological signals, showing how these signals can be combined to accurately identify a wide range of emotions such as happiness, sadness, and pain. Our investigation uses deep convolutional neural networks, the current state of the art in supervised learning, on two publicly available databases, DEAP and BP4D+, and we also detail comparisons between gender-specific models of emotion. We achieved an average emotion recognition accuracy of 98.89% on BP4D+ and, on DEAP, accuracies of 86.09% for valence, 90.61% for arousal, 90.48% for liking, and 90.95% for dominance. We also compare our results to the current state of the art, showing the superior performance of our method.
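As a rough illustration of this kind of pipeline, the sketch below shows a minimal 1D convolutional network that classifies fixed-length windows of multichannel physiological signals. It assumes PyTorch; the channel count, window length, layer sizes, and number of output classes are illustrative placeholders, not the architecture or hyperparameters used in the thesis.

# Minimal sketch (not the thesis's exact model): a 1D CNN over windows of
# multichannel physiological signals. All sizes below are assumptions.
import torch
import torch.nn as nn

class PhysioCNN(nn.Module):
    def __init__(self, in_channels=8, num_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one value per filter
            nn.Flatten(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):  # x: (batch, channels, time)
        return self.classifier(self.features(x))

# Example: a batch of 16 windows, 8 signal channels, 512 samples each.
model = PhysioCNN(in_channels=8, num_classes=4)
logits = model(torch.randn(16, 8, 512))  # -> shape (16, 4)

In practice one such network could be trained per target dimension (e.g., valence or arousal on DEAP) or per emotion class, with the physiological channels stacked along the input-channel axis.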
