
Can Artificial Intelligence be used to detect Human Emotions?

Nature has provided humans with an innate ability to understand the emotions of other human beings. However, the same is not true of machines. ‘Emotion AI’, also known as ‘Artificial Emotional Intelligence’, refers to technologies that use Artificial Intelligence (AI) techniques to enable machines to learn about and interact with human emotions.

There are six widely recognized human emotions: anger, happiness, sadness, disgust, fear, and surprise. Emotion-detection techniques analyze people’s facial expressions, tracking changes in facial nodal points and movements, to identify their emotions.
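As a rough sketch of how such a pipeline fits together, the snippet below locates a face with OpenCV’s bundled Haar cascade and hands the cropped region to a trained classifier. The `emotion_model.h5` file, its 48x48 input size, and its label order are hypothetical placeholders, not a published model.

```python
# Minimal sketch of a facial-expression pipeline, assuming a pre-trained
# classifier saved as "emotion_model.h5" (hypothetical placeholder).
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["anger", "happiness", "sadness", "disgust", "fear", "surprise"]

# OpenCV ships a Haar cascade for frontal-face detection.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
model = load_model("emotion_model.h5")  # hypothetical trained CNN

def detect_emotions(image_path):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Crop and normalize the face region to the model's input size.
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        yield EMOTIONS[int(np.argmax(probs))]

print(list(detect_emotions("photo.jpg")))
```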

In the coming decade, AI-based emotion detection will be widely used in our everyday interactions with people and objects. Hence, it becomes imperative to learn more about this intriguing technology.

A lot of time, money, and energy is being invested in this field of artificial emotion detection. So the question arises: why do we need to detect emotions?

The answer lies in the applications of emotion detection. For instance, with a reliable understanding of customers’ emotions and reactions to a service, a company can modify its marketing strategies. Companies like Affectiva Inc. and Eyeris have come up with algorithms that can identify consumers’ reactions and emotions. On similar lines, Lightbulb.ai is a platform that delivers emotion and attention analytics in real time, across geographies. It can also help gaming companies identify whether their games can hold gamers’ attention for long enough. Similarly, the technology can be used in education to track boredom among students and thus identify which teaching methods are most effective. As an example, a high school in Hangzhou City, China is using facial recognition technology developed by Hikvision Digital Technology to monitor students’ attentiveness.

Artificial emotion detection is also being used to make our homes smarter. For instance, Providence 3.0 (an initiative of Infinite 8 AI) is claimed to be the world’s smartest AI assistant; a recent update enables it to identify ‘pain’ as well. Furthermore, the most prominent use of this technology is in the health industry. Companies like Brainpower are using emotion detection to teach autistic children the basics of emotions and help them interact with other people. Similarly, patients’ emotions can be detected and the concerned doctors kept informed about the progress of their health.

In addition, AI-based emotion detection can be applied to spotting situations of conflict, and to making existing assistants like Siri and Alexa more empathetic.

Emotion Detection through Facial Recognition

Scientists started working on enabling computers to recognize faces in the 1960s, and facial recognition has grown significantly since then. According to one study, the global facial recognition market will generate $7 billion in revenue by 2024. Tech giants like Google, Facebook, IBM, and Microsoft have all been sailing in this boat for more than five years now. In 2015, Google released FaceNet with a new record accuracy of 99.63% (0.9963 ± 0.0009). Evidently, we are all familiar with the ‘face recognition’ feature of Google Photos and have been using it extensively. Similarly, Facebook launched its DeepFace program in 2014, which has an accuracy rate of 97.25%.
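FaceNet’s central idea is not classification but embedding: it maps each face to a 128-dimensional vector such that the distance between two embeddings reflects whether they show the same person. A minimal sketch of the resulting verification step might look like this, with an illustrative threshold and the embedding network itself left out.

```python
import numpy as np

def same_person(emb_a: np.ndarray, emb_b: np.ndarray,
                threshold: float = 1.1) -> bool:
    """Compare two FaceNet-style face embeddings.

    FaceNet maps faces to L2-normalized 128-d vectors, so the squared
    Euclidean distance between two embeddings measures facial similarity.
    The threshold here is illustrative; in practice it is tuned on
    held-out verification pairs.
    """
    return float(np.sum((emb_a - emb_b) ** 2)) < threshold

# emb_a and emb_b would come from a trained embedding network (not shown).
```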

Currently, facial recognition is also being used in the fight against COVID-19. For example, Datakalab recently released an update to its face-mask algorithm that analyzes whether a mask is properly positioned on a person’s face. The goal of this product is to statistically quantify when and where face masks are worn correctly, in order to improve public health.

Snap Inc. has introduced the use of emotion detection to enhance productivity in corporate environments. One of its patents (Patent Number US10496947B1) introduces methods and systems to optimize workforce analytics through emotion recognition over video calls in organizations such as call centres. It explains, “Moreover, contact centres typically stress that their employees stay emotionally positive when talking to customers, smile, and respond with respect and gratitude to whatever question a customer could ask. An employee’s positive attitude can be easily affected by many factors such as tiredness, anger, emotional distress, and confusion. An angry and distressed employee may not provide high-quality service to a customer, which ultimately may cause a reduction in revenue, negative customer feedback, and loss of clients, among other factors. Accordingly, employers may want to introduce control processes to monitor the emotional status of their employees regularly to ensure they stay positive and provide high-quality service”.

Thus, this system enables employers to keep a check on their employees. An extension of this idea by Snap Inc. is included in Patent Number US10599917B1, which describes a method to recognize changes in customers’ emotions over video calls in order to provide them with better service.
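Conceptually, the kind of monitoring these patents describe can be pictured as a loop that classifies the agent’s emotion frame by frame and raises a flag when negative emotions persist. The sketch below is an illustrative reconstruction, not Snap Inc.’s implementation; `classify_emotion`, the window size, and the alert ratio are all assumptions.

```python
# Illustrative reconstruction of a per-frame emotion monitor for a call
# centre, NOT Snap Inc.'s actual implementation. `classify_emotion` is a
# hypothetical stand-in for any frame-level emotion classifier.
from collections import deque

NEGATIVE = {"anger", "sadness", "disgust", "fear"}

def monitor(frames, classify_emotion, window=30, alert_ratio=0.5):
    """Yield an alert when negative emotions dominate a sliding window."""
    recent = deque(maxlen=window)
    for frame in frames:
        recent.append(classify_emotion(frame) in NEGATIVE)
        if len(recent) == window and sum(recent) / window >= alert_ratio:
            yield "notify supervisor: sustained negative emotion detected"
            recent.clear()  # avoid repeated alerts for the same episode
```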

Emotion Detection through Smart Textiles

Who knew that a cloth band on our foreheads could be capable of detecting our emotions? Bo Zhou, Tandra Ghose, and Paul Lukowicz have come up with the Expressure system, which performs surface pressure mechanomyography on the forehead using an array of textile pressure sensors that does not depend on specific placement or attachment to the skin. They write, “We investigate how pressure-sensitive smart textiles, in the form of a headband, can detect changes in facial expressions that are indicative of emotions and cognitive activities.”


Picture credits: The Expressure System
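To make the idea concrete, a toy version of the signal processing might look as follows; the 5x16 sensor-grid shape and the change threshold are assumptions for illustration, not the authors’ parameters.

```python
# Toy sketch of processing a textile pressure-sensor grid like the one
# the Expressure headband uses. Grid shape and threshold are assumed.
import numpy as np

def expression_events(frames, threshold=0.2):
    """Flag frames where the forehead pressure map changes sharply.

    `frames` is an iterable of 2-D arrays of normalized pressure
    readings; a large mean absolute difference between consecutive maps
    suggests muscle movement, e.g. raising the eyebrows.
    """
    prev = None
    for t, frame in enumerate(frames):
        if prev is not None and np.mean(np.abs(frame - prev)) > threshold:
            yield t
        prev = frame

# Example with simulated data: 100 frames of a 5x16 sensor grid.
frames = [np.random.rand(5, 16) * 0.05 for _ in range(100)]
frames[50] += 0.5  # simulate a sudden expression change
print(list(expression_events(frames)))
```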

Speech Emotion Recognition/Detection (SER)

Specific changes are observed in our pitch, frequency, speech speed, rhythm, and voice quality as our emotions change. Recently, speech emotion recognition (SER) has attracted a great deal of interest, owing to rapid progress in affective computing and human-computer interaction. The technology is thriving not only in English but in other languages as well. For instance, researchers have come up with an Arabic speech emotion recognition system, with a database of utterances from three male and three female professionals showcasing four emotions: angry, happy, sad, and neutral.
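As an illustration of the acoustic features such systems typically start from, the sketch below extracts pitch, energy, and MFCCs with librosa and summarizes them into a fixed-size vector; the classifier that would consume this vector is omitted.

```python
# Minimal sketch of the acoustic features SER systems commonly use,
# built with librosa; the downstream classifier is omitted.
import librosa
import numpy as np

def ser_features(wav_path):
    y, sr = librosa.load(wav_path, sr=16000)
    # MFCCs capture the spectral envelope; pitch and energy track prosody.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # shape (13, T)
    f0 = librosa.yin(y, fmin=50, fmax=400, sr=sr)        # pitch track
    energy = librosa.feature.rms(y=y)[0]                 # loudness
    # Summarize each track by mean and std to get a fixed-size vector.
    parts = [mfcc.mean(axis=1), mfcc.std(axis=1),
             [f0.mean(), f0.std(), energy.mean(), energy.std()]]
    return np.concatenate([np.atleast_1d(p) for p in parts])
```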

Similarly, a recently published Chinese patent (number CN106782602B) provides a method for speech emotion recognition based on LSTM and CNN deep neural networks. On similar lines, a paper by D. Karthika Renuka et al. aims to establish an effective model-based speech emotion recognition system using deep learning techniques (e.g., an RNN with LSTM) on English and German datasets.
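A hedged sketch of the kind of CNN-plus-LSTM architecture these works describe is shown below in Keras; the layer sizes, the 300-frame input, and the six-class output are illustrative choices, not parameters from the patent or the paper.

```python
# Illustrative CNN + LSTM model for SER in Keras. The convolution picks
# up local spectral patterns; the LSTM models how they evolve over time.
import tensorflow as tf
from tensorflow.keras import layers

def build_ser_model(time_steps=300, n_features=13, n_classes=6):
    return tf.keras.Sequential([
        tf.keras.Input(shape=(time_steps, n_features)),       # MFCC frames
        layers.Conv1D(64, kernel_size=5, activation="relu"),  # local patterns
        layers.MaxPooling1D(pool_size=2),
        layers.LSTM(128),                                     # temporal dynamics
        layers.Dense(n_classes, activation="softmax"),        # emotion probs
    ])

model = build_ser_model()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```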

Towards a Flawless Emotion Detection AI

The performance of any artificial intelligence technique scales with the amount of data fed to the AI system. Collecting data to improve emotion recognition can be difficult, time-consuming, and expensive for many start-ups and small research labs, giving large, well-funded companies an advantage over others. To address this problem, synthetic data is being created with a technique called Generative Adversarial Networks (GANs). Synthetic data is artificially generated data that mimics real-life images; in layman’s terms, it provides pictures of people who do not even exist. This also prevents AI companies from infringing on people’s privacy and becoming entangled in complex lawsuits. One platform creating synthetic data to train AI is TAWNY.
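For a sense of the adversarial setup behind GANs, here is a bare-bones Keras sketch: a generator learns to turn random noise into images that a discriminator can no longer tell apart from real ones. The 28x28 image size and layer widths are illustrative; real face generators are far larger.

```python
# Bare-bones GAN setup in Keras. Sizes are illustrative (28x28 images);
# production-grade face generators use much deeper convolutional networks.
import tensorflow as tf
from tensorflow.keras import layers

latent_dim = 100

generator = tf.keras.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(28 * 28, activation="sigmoid"),
    layers.Reshape((28, 28, 1)),            # synthetic image
])

discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # real vs. fake probability
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# The generator is trained through the frozen discriminator: it learns
# to produce images the discriminator scores as "real".
discriminator.trainable = False
gan = tf.keras.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")
```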

Emotion detection and recognition is a market predicted to grow to $65 billion by 2023. Emotion recognition AI has made a lot of progress in the past decades, and the credit mostly goes to the advent of deep neural networks. On one hand, current emotion recognition programs are very accurate: some studies report up to 99 percent accuracy in detecting the six basic emotions. On the other hand, the technology falters when tested on more subtle, real-life expressions and emotions.

While the accuracy of current emotion detection AI is highly debatable, we believe that the use of emotion recognition to engage with important aspects of human life is still in its infancy. AI is currently trained rigorously on at most seven emotions, while current research shows that human emotion categories are far more numerous and fuzzier than conventional theories suggest. Therefore, our systems need to be taught a much wider range of emotions than they are today. Perhaps then we shall see flawless emotion recognition.