Emotion Detection with AI in Security
Identification technologies are increasingly used to detect specific human behavior; one example is the combination of ANPR with traffic incident detection. Most of us will also be aware that AI-equipped video surveillance cameras are capable of classification, i.e. using an algorithm to predict which category a piece of data belongs to. Camera suppliers have demonstrated, in examples ranging from the questionable to the playful, that their cameras can classify humans into ethnic and age-related categories. The next step is the detection, or even prediction, of human behavior. Automated emotion recognition, or emotional AI, is the process of identifying human emotion, most typically from facial expressions but also from verbal expressions, usually through the same cameras and microphones used for facial and voice recognition.
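Classification in this sense can be illustrated with a minimal sketch. A model assigns a score to each category, and the prediction is simply the highest-scoring category. The categories and scores below are hypothetical examples, not the output of any real surveillance product:

```python
# Minimal illustration of classification: a model emits one score per
# category, and the predicted label is the category with the highest score.
# Categories and scores are hypothetical, for illustration only.

def classify(scores):
    """Return the category with the highest score."""
    return max(scores, key=scores.get)

# Hypothetical model output for a single video frame:
frame_scores = {"person": 0.91, "vehicle": 0.06, "animal": 0.03}
print(classify(frame_scores))  # → person
```

Real classifiers of course differ in how those scores are produced, but the final decision step is this simple argmax over categories.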
The general press is currently focusing on emotion detection, or in other words affect recognition, using AI-powered video cameras, whether dedicated security cameras, webcams, or cameras built into smartphones and tablets. The Evening Standard, for example, has covered the activity of London-based start-up Sensing Feeling, which says its system “has the potential to predict people’s behavior by scanning faces and body language to assess their mood. The footage is checked in real-time against pre-programmed images for signs of anger, contempt, fear, disgust, happiness, sadness and surprise”. The article mentions that the system “is already in use at a private university outside the UK which licensed the system to replace student satisfaction feedback forms after lectures”. The company claims that its systems could also benefit safety and security applications: “help operators of safety-critical environments perform real-time detection and prediction high-risk human behaviours based upon interactions, movements and flows of teams and crowds”.
A leading company in this field is MIT spin-off Affectiva, led by co-founder and CEO Dr. Rana el Kaliouby. The company website states in capitals: “Affective human perception AI analyzes complex human states”. The company offers solutions in several verticals, such as automotive and market research. iMotions is a US-based company with several branches and an Affectiva partner. It uses multiple sensor technologies in its iMotions Platform to analyze human behavior. It also states on its website that the platform supports recognition of facial expressions using “facial coding”: “Computer-based facial expression analysis mimics our human coding skills quite impressively as it captures raw, unfiltered emotional responses towards any type of emotionally engaging content. These expressed emotional states are detected in real time using fully automated computer algorithms that record facial expressions via webcam.” Other companies also claim to have made progress in the field of emotion recognition: Swedish company Visage claims that its lightweight software, which can be integrated into all sorts of devices, is capable of collecting detailed data about people’s gender, age and emotions in order to “build engaging experiences”.
Technological innovation monitoring website Springwise reported as early as June 2018 that facial recognition cameras from Hikvision were being used in China to monitor the emotions of students in class, so that teachers could better understand what their students were experiencing.
The Financial Times reported back in November 2019 that “Emotion recognition is China’s new surveillance craze”. Copyright prevents us from quoting the article at length, but its essence is that, although already deployed in real-life applications, the technology is still considered to be in an early phase and still lacks accuracy. Gartner follows AI developments, including emotion detection, closely and claims that “By 2022, 10% of personal devices will have emotion AI capabilities”. Surprisingly, it does not mention security as a typical application area for the technology. It does foresee that in 2020 payment by facial recognition will be a trending technology. It will be interesting to see how that affects the debate about facial recognition as a security technology.
It is clear that this AI field is progressing rapidly. We have seen in other areas that deep learning and machine learning can drive up performance and accuracy. Scientists have, for example, already developed a neural network that accurately decodes images into 11 distinct emotion categories. This network, named EmoNet, “was tested on 24,634 images from 400 videos not included in a previous training set. EmoNet accurately decoded normative human ratings of emotion categories, providing support for prediction”.
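To make the decoding step concrete, here is a minimal sketch of what the final layer of such an emotion-classification network does: it emits one raw score (a logit) per emotion category, and a softmax turns those scores into a probability distribution. The category labels and logit values below are illustrative assumptions, not EmoNet’s actual categories or outputs:

```python
import math

# Sketch of the final decoding step of an emotion classifier: raw per-category
# scores (logits) are converted into probabilities with a softmax, and the
# most probable category is reported. Labels and logits here are hypothetical.

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(logits)                       # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

categories = ["amusement", "anger", "fear", "sadness", "surprise"]
logits = [0.2, 0.1, 2.5, 0.3, 1.1]       # hypothetical network output for one image

probs = softmax(logits)
best = max(range(len(categories)), key=lambda i: probs[i])
print(categories[best], round(probs[best], 2))  # → fear 0.65
```

The hard part, of course, is producing good logits from pixels; this sketch only shows how such scores become an emotion prediction with an associated confidence.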
It does not require a great deal of imagination to see how this technology could benefit security applications. Identifying black-listed individuals in a crowd and, in addition, learning about their current state of mind might be helpful in immediate security threat assessments. Detecting fear in an individual in random video footage may help operators spot security incidents in the streets more quickly.
Have you tested this technology in a security application? Or are you maybe already actively using it? Please share your ideas and experiences below.