Recognizing emotions in real time

Mar 19, 2024

Sagacify offers its interns and employees opportunities to experiment with different AI techniques, resulting in fun projects for the team. This stimulates creativity and a positive atmosphere in the work environment, but it also leads to new AI opportunities and applications. One of them is project SMILE, a model that is able to detect and recognize emotions.

Emotions make us human

The objective of this project was to develop a model that recognizes emotions. According to a widely used model in psychology, every person expresses six basic emotions: anger, disgust, fear, happiness, sadness, and surprise. These emotions play an essential role in how people make decisions and manage personal relationships, and they also affect well-being. Humans decipher these emotions in real time and interpret them subjectively, based on their past encounters, experiences, and feelings. If reading emotions is already a difficult task for people, imagine how hard it is for an artificial intelligence model.

How to dissect human emotions

The project required three major steps to get to the final product, namely:

  1. Detecting faces in an image
  2. Recognizing the six basic emotions on each detected face
  3. Training the combined model to recognize these emotions in real time

To complete these steps, the project relies on two models: a pre-trained face detector built from three cascaded convolutional networks, and a pre-trained deep convolutional neural network (CNN) of type VGG16 (16 layers) for emotion classification.
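
As an illustration of the first stage, the sketch below runs a cascaded CNN face detector over a single image. The post does not name the library used; facenet-pytorch's MTCNN, which chains three small convolutional networks, is one common implementation, and the image path here is made up.

```python
# Illustrative only: the post does not say which detector implementation was used.
# facenet-pytorch's MTCNN chains three small CNNs (P-Net, R-Net, O-Net).
from facenet_pytorch import MTCNN
from PIL import Image

detector = MTCNN(keep_all=True)      # keep_all: return every detected face, not just the best

img = Image.open("group_photo.jpg")  # hypothetical input image
boxes, probs = detector.detect(img)  # bounding boxes and detection confidences

if boxes is not None:
    for (x1, y1, x2, y2), p in zip(boxes, probs):
        print(f"face at ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f}), confidence {p:.2f}")
```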

Facial detection

For the first model, a dataset of around 3,000 images was created. These images formed the basis for the data exploration phase, followed by training a YOLO (You Only Look Once) model, a real-time object detection system, to detect the faces in the given images. Once these phases were completed, the results were evaluated on criteria such as execution time, precision, recall, and intersection over union (IoU).
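
To give a sense of that last criterion: intersection over union scores how well a predicted box overlaps a ground-truth annotation, from 0 (no overlap) to 1 (perfect match). A minimal version, assuming boxes are given as (x1, y1, x2, y2) corner coordinates:

```python
def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)        # overlap area (0 if disjoint)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted face box vs. a ground-truth annotation (toy numbers):
print(iou((10, 10, 110, 110), (20, 20, 120, 120)))   # ~0.68
```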

Emotion detection

The second model was fed a pre-labeled dataset of RGB images and video clips covering the six basic emotions that humans possess. A classifier based on a deep CNN was trained on this dataset, using data augmentation, to predict the emotion shown in each image.
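
The training code is not shown in the post, but a minimal Keras sketch of this kind of setup, fine-tuning a pre-trained VGG16 with on-the-fly augmentation, might look as follows. The input size, the specific augmentations, and the frozen base are assumptions on our part, not the project's actual configuration.

```python
import tensorflow as tf

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

# Pre-trained VGG16 as a frozen feature extractor (input size is an assumption).
base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),   # on-the-fly data augmentation
    tf.keras.layers.RandomRotation(0.1),
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(len(EMOTIONS), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=...) with your own datasets
```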

Together, these two models detect emotions in real time in video clips or live webcam streams. The facial detection model identifies and extracts each face in every frame of the video. The emotion detection model then scans these facial expressions and returns the emotion displayed on each face.
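
The pipeline code is not included in the post, but the real-time loop could look roughly like this OpenCV sketch, where detect_faces and classify_emotion are hypothetical wrappers around the two models described above:

```python
import cv2

# detect_faces(frame) -> list of integer (x1, y1, x2, y2) boxes  (hypothetical wrapper)
# classify_emotion(crop) -> one of the six emotion labels        (hypothetical wrapper)

cap = cv2.VideoCapture(0)                    # 0 = default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    for (x1, y1, x2, y2) in detect_faces(frame):
        crop = frame[y1:y2, x1:x2]           # extract the face region
        emotion = classify_emotion(crop)
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.putText(frame, emotion, (x1, y1 - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("SMILE", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):    # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```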

Pursuing happy faces

Along the way, the project encountered some difficulties when analyzing thousands of facial expressions. Scanning multiple faces simultaneously in real time proved especially hard, partly due to a lack of high-quality videos and images to train the model on. These problems were eventually solved, resulting in the desired model. Nevertheless, in a pursuit of perfection, some changes could still improve the results: the dataset is insufficiently balanced, and emotions such as fear and anger remain difficult to predict.
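
On the dataset-balance point, one standard mitigation (not necessarily what the team used) is to weight the training loss by inverse class frequency, which scikit-learn can compute directly. The toy labels below are illustrative only:

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

labels = np.array([0, 0, 0, 3, 3, 5])   # toy labels: class 0 common, class 5 rare
classes = np.unique(labels)
weights = compute_class_weight("balanced", classes=classes, y=labels)
class_weight = dict(zip(classes, weights))
print(class_weight)                     # rarer classes get proportionally larger weights
# e.g. model.fit(..., class_weight=class_weight) in Keras
```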

SMILE in many ways

Project SMILE offers opportunities in many fields. Emotional data could lead to advances in marketing, healthcare, and traffic safety. In marketing, the technology could be used to gauge how a crowd feels in many situations: how does an audience feel during a movie, a political debate, or a conference? Emotions reveal how people feel at the moment of a purchase, which, if positive, contributes significantly to the reputation of a product, service, or brand. They also provide companies with valuable input for market studies and customer satisfaction reviews. In healthcare, the model could monitor the emotional state of patients during medical procedures. Finally, in traffic, it could flag behavior such as tiredness or aggressiveness in drivers that might lead to high-risk situations, improving the response time of the authorities or even enabling preventive measures.

The applications are endless, and building on the great work already done, Sagacify will keep strengthening and perfecting this technology. We hope to help companies find ways to implement the emotion detection system in their business.