How artificial intelligence is being used to capture, interpret and respond to human emotions and moods.
This article has been updated from the original, published on June 17, 2017, to reflect new events and conditions and/or updated research.
“By 2022, your personal device will know more about your emotional state than your own family,” says Annette Zimmermann, research vice president at Gartner. This assertion might seem far-fetched to some. But the products showcased at CES 2018 demonstrate that emotional artificial intelligence (emotion AI) can make this prediction a reality.
Emotion AI, also known as affective computing, enables everyday objects to detect, analyze, process and respond to people’s emotional states and moods — from happiness and love to fear and shame. This technology can be used to create more personalized user experiences, such as a smart fridge that interprets how you feel and then suggests food to match those feelings.
“In the future, more and more smart devices will be able to capture human emotions and moods in relation to certain data and facts, and to analyze situations accordingly,” adds Zimmermann. “Technology strategic planners can take advantage of this technology to build and market the device portfolio of the future.”
Embed in virtual personal assistants
Although emotion AI capabilities exist, they are not yet widespread. A natural place for them to gain traction is in conversation systems — technology used to converse with humans — due to the popularity of virtual personal assistants (VPAs) such as Apple’s Siri, Microsoft’s Cortana and Google Assistant.
Today VPAs use natural-language processing and natural-language understanding to process verbal commands and questions. But they lack the contextual information needed to understand and respond to users’ emotional states. Adding emotion-sensing capabilities will enable VPAs to analyze data points from facial expressions, voice intonation and behavioral patterns, significantly enhancing the user experience and creating more comfortable and natural user interactions.
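As a purely illustrative sketch of the fusion step described above, the snippet below combines hypothetical per-modality emotion scores (face, voice, behavior) into a single detected state via a weighted average. The modality names, weights, and emotion labels are assumptions for illustration, not any vendor's actual pipeline.

```python
# Hypothetical multimodal emotion fusion for a VPA. Assumes upstream
# classifiers already emit a score per emotion for each modality.
MODALITY_WEIGHTS = {"face": 0.5, "voice": 0.3, "behavior": 0.2}

def fuse_emotion_scores(modality_scores):
    """Weighted-average the per-modality scores, then return the
    highest-scoring emotion label."""
    fused = {}
    for modality, scores in modality_scores.items():
        weight = MODALITY_WEIGHTS.get(modality, 0.0)
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + weight * score
    return max(fused, key=fused.get)

# Example: facial expression and voice intonation both point to frustration.
reading = {
    "face": {"happy": 0.1, "frustrated": 0.7, "neutral": 0.2},
    "voice": {"happy": 0.2, "frustrated": 0.6, "neutral": 0.2},
    "behavior": {"happy": 0.3, "frustrated": 0.4, "neutral": 0.3},
}
print(fuse_emotion_scores(reading))  # frustrated
```

A real system would replace the fixed weights with a learned model, but the principle — corroborating one noisy signal with the others — is the same.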
Personal assistant robots (PARs) are also prime candidates for developing emotion AI. Many already contain some human characteristics, which can be expanded upon to create PARs that can adapt to different emotional contexts and people. The more interactions a PAR has with a specific person, the more it will develop a personality.
Some of this work is currently underway. Vendors such as IBM and startups such as Emoshape are developing techniques to add human-like qualities to robotic systems. Qihan Technology’s Sanbot and SoftBank Robotics’ Pepper train their PARs to distinguish between, and react to, humans’ varying emotional states. If, for example, a PAR detects disappointment in an interaction, it will respond apologetically.
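A minimal sketch of that last behavior — mapping a detected emotional state to a response style — might look like the following. The states and styles are assumptions for illustration; they do not represent Sanbot's or Pepper's actual logic.

```python
# Illustrative emotion-to-response-style mapping for a PAR.
RESPONSE_STYLES = {
    "disappointment": "apologetic",
    "joy": "enthusiastic",
    "anger": "calming",
}

def choose_response_style(detected_emotion):
    # Fall back to a neutral style for states with no explicit rule.
    return RESPONSE_STYLES.get(detected_emotion, "neutral")

print(choose_response_style("disappointment"))  # apologetic
print(choose_response_style("surprise"))        # neutral
```

Per-person adaptation, as described above, would amount to updating this mapping as the PAR accumulates interactions with a specific user.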
Bring value to other customer experience scenarios
The promise of emotion AI is not too far into the future for other frequently used consumer devices and technology, including educational and diagnostic software, video games and the autonomous car. Each is currently under development or in a pilot phase.
The video game Nevermind, for example, uses emotion-based biofeedback technology from Affectiva to detect a player’s mood and adjusts game levels and difficulty accordingly. The more frightened the player, the harder the game becomes. Conversely, the more relaxed a player, the more forgiving the game.
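The scaling rule in Nevermind's case could be sketched as follows, assuming the biofeedback layer produces a fear estimate between 0 and 1. The 50%–150% range is an illustrative assumption, not Affectiva's or the game's actual method.

```python
# Hedged sketch of biofeedback-driven difficulty: a frightened player
# gets a harder game, a relaxed player a more forgiving one.
def adjust_difficulty(base_difficulty, fear_level):
    """Scale difficulty between 50% and 150% of its base value."""
    fear_level = min(max(fear_level, 0.0), 1.0)  # clamp sensor noise
    return round(base_difficulty * (0.5 + fear_level), 2)

print(adjust_difficulty(10.0, 0.9))  # frightened: harder game
print(adjust_difficulty(10.0, 0.1))  # relaxed: more forgiving
```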
There are also in-car systems able to adapt the responsiveness of a car’s brakes based on the driver’s perceived level of anxiety. In both cases, visual sensors and AI-based, emotion-tracking software are used to enable real-time emotion analysis.
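The real-time loop implied here can be sketched as below: a noisy per-frame anxiety estimate is smoothed before the car adjusts brake assist, so a single spurious reading does not jerk the controls. The smoothing factor and assist bounds are illustrative assumptions, not any automaker's implementation.

```python
# Hypothetical anxiety-adaptive brake assist with exponential smoothing.
class AnxietyBrakeAdapter:
    def __init__(self, alpha=0.3):
        self.alpha = alpha    # weight given to the newest reading
        self.smoothed = 0.0   # smoothed anxiety estimate in [0, 1]

    def update(self, anxiety_reading):
        """Feed one sensor reading; return a brake-assist multiplier
        between 1.0 (calm) and 1.5 (highly anxious)."""
        self.smoothed = (self.alpha * anxiety_reading
                         + (1 - self.alpha) * self.smoothed)
        return round(1.0 + 0.5 * self.smoothed, 3)

adapter = AnxietyBrakeAdapter()
for reading in (0.0, 0.8, 0.8):  # driver grows anxious
    level = adapter.update(reading)
print(level)  # assist rises gradually, not in one jump
```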
In 2018, we will likely see more of this emotion-sensing technology realized.
Drive emotion AI adoption in healthcare and automotive
Organizations in the automotive and healthcare industries are prominent among those evaluating whether, and how far, to adopt emotion-sensing features.
As the previous example shows, car manufacturers are exploring the implementation of in-car emotion detection systems. “These systems will detect the driver’s moods and be aware of their emotions, which, in turn, could improve road safety by managing the driver’s anger, frustration, drowsiness and anxiety,” explains Zimmermann.
In the healthcare arena, emotion-sensing wearables could potentially monitor the mental health of patients 24/7 and alert doctors and caregivers instantly, if necessary. They could also help isolated elderly people and children keep track of their own mental state. Over time, these devices would let doctors and caregivers observe patterns in mental health and decide when and how to communicate with people in their care.
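One way such a wearable might avoid paging a caregiver over a single noisy reading is to alert only when distress stays elevated across a sustained window. The distress score, threshold, and window size below are assumptions for illustration only.

```python
# Illustrative 24/7 distress monitor for an emotion-sensing wearable.
from collections import deque

class DistressMonitor:
    """Alert only when distress stays above the threshold for a full
    window of consecutive readings."""

    def __init__(self, threshold=0.7, window=3):
        self.threshold = threshold
        self.readings = deque(maxlen=window)

    def record(self, score):
        """Record one distress score in [0, 1]; return True to alert."""
        self.readings.append(score)
        full = len(self.readings) == self.readings.maxlen
        return full and all(r >= self.threshold for r in self.readings)

monitor = DistressMonitor()
alerts = [monitor.record(s) for s in (0.9, 0.5, 0.8, 0.8, 0.75)]
print(alerts)  # [False, False, False, False, True]
```

Only the last reading triggers an alert, because it completes three consecutive readings above the threshold; the earlier spike is discounted as noise.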
Current platforms for detecting and responding to emotions are mainly proprietary and tailored to a few isolated use cases, though many global brands have used them in recent years for product and brand perception studies.
“We can expect technology and media giants to team up and enhance their capabilities in the next two years, and to offer tools that will change lives for the better,” says Zimmermann.
Gartner clients can read more in the full report Market Trends: How AI and Affective Computing Deliver More Personalized Interactions With Devices by Annette Zimmermann. This research is part of the Gartner Trend Insight Report “IoT’s Challenges and Opportunities in 2017,” a collection of research focused on the key technical and business challenges that must be overcome in order for IoT to fulfill its promise.