Emotion artificial intelligence is here!

11th January 2018
Thanks to AI, your smart fridge will tell how you feel and suggest foods that match your emotion.

Gartner Says Artificial Intelligence Is a Game Changer for Personal Devices and Will Drive the Most Compelling User Experiences at Home and Work
Bangalore, January 11, 2018: Stand by for another buzzword: emotion artificial intelligence (emotion AI). Personal devices equipped with it will know more about an individual's emotional state than his or her own family does. Gartner suggests that such AI systems are becoming so sophisticated that, by 2022, they will generate multiple disruptive forces that reshape the way we interact with personal technologies.
"Emotion AI systems and affective computing are allowing everyday objects to detect, analyze, process and respond to people's emotional states and moods to provide better context and a more personalized experience," says Roberta Cozza, research director at Gartner. "To remain relevant, technology vendors must integrate AI into every aspect of their devices, or face marginalization."
The current wave of emotion AI systems is being driven by the proliferation of virtual personal assistants (VPAs) and other AI-based technology for conversational systems. As a second wave emerges, AI technology will add value to more and more customer experience scenarios, including educational software, video games, diagnostic software, athletic and health performance, and the autonomous car.
----------------------------------------------------------------------------------------------------
For a few days, watch Mindtree Chairman Krishnakumar Natarajan explain how emotion and AI meet, in our Tech video section.
----------------------------------------------------------------------------------------------------
 "Prototypes and commercial products already exist and adding emotional context by analyzing data points from facial expressions, voice intonation and behavioral patterns will significantly enhance the user experience," adds Ms. Cozza. "Beyond smartphones and connected home devices, wearables and connected vehicles will collect, analyze and process users' emotional data via computer vision, audio or sensors capturing behavioral data to adapt or respond to a user's wants and needs."
Other personal device predictions from Gartner:
By 2021, 10 percent of wearable users will have changed their lifestyles, thereby extending their life spans by an average of six months.
As AI emotion systems evolve, there is huge potential for specialized devices, such as medical wristbands that can anticipate life-threatening conditions and facilitate an early response. At the same time, special apps are being developed for diagnostic and therapy services that will help recognize conditions such as depression, or help children with autism.
 "Even a basic wearable device could have a positive impact on the wearer's health," suggests  Annette Zimmermann, research vice president at Gartner. "We are seeing growing numbers of users actively changing their behavior for the better with the adoption of a wearable device. Not only can this have beneficial influence on the amount of exercise they do but there is evidence that one or two out of 10 smart watch and fitness band users discover a condition such as sleep apnea or cardiac arrhythmia through wearing the device."
 By 2020, 60 percent of personal technology device vendors will use third-party AI cloud services to enhance functionality and services.
Cloud-based AI technologies are driving compelling user experiences on a variety of connected devices. Cloud offerings from the big tech players, such as Google, Microsoft, Amazon, Tencent, Baidu and IBM, are starting to proliferate due to their attractive cost model, easy-to-use integration and potential to create complex services. A major catalyst for device vendors to use cloud AI services is the increased usage of VPAs and natural-language technologies, while the adoption of VPA-based, screenless devices such as Amazon Echo and Google Home is also on the rise, further increasing usage of cloud AI services.
 "We are starting to see adoption of these services from high-profile vendors that are using them to widen their reach," says Anthony Mullen, research director at Gartner. "Fitbit uses Alexa Skills to make user stats and functionality available through VPA speakers just as Netflix uses Actions for Google Assistant to voice control its service. Ultimately, vendors will compete on the best user experience and the smartness of their products, not the technology behind it.
Through 2022, security technology combining machine learning, biometrics and user behavior will reduce passwords to account for less than 10 percent of all digital authentications.
Password-based simple authentication is becoming less and less effective for personal devices. Even today's popular biometric technology — fingerprint authentication — is only around 75 percent successful due to contaminants such as dirt and sweat.
"Users need more convenient and accurate options for unlocking their devices," says CK Lu, research director at Gartner. "Security technologies that combine machine learning, biometrics and user behavior will become necessary to improve ease of use, self-service and frictionless authentications. Within the next five years new security technology will recognize the user, prevent fraud and detect automation threats such as malware, remote access trojans and malicious bots."
The relevant report, “Predicts 2018: Personal Devices,” is part of the Gartner Special Report “Predicts 2018: Stimulate Creativity to Generate Success,” a collection of research that focuses on predictions to help support the goals of capturing value from new opportunities and overcoming threats.

EMOTION AI WILL PERSONALIZE INTERACTIONS
Have you ever wondered whether one day technology will be able to identify emotions, just as humans can? What if your smart fridge could tell how you feel and suggest foods that match your emotion?
Unrealistic? Inconceivable? No. Artificial intelligence (AI) and affective computing are starting to make this possible. Devices enriched with AI, depth-sensing and neurolinguistic-programming technologies are starting to process, analyze and respond to human emotions.
“In the future, more and more smart devices will be able to capture human emotions and moods in relation to certain data and facts, and to analyze situations accordingly,” said Annette Zimmermann, research vice president at Gartner.
An Emotion-Sensing Future Approaches
Emotion-sensing systems will appear in devices as a result of the rise of intelligent agents, such as virtual assistants. Current examples of intelligent agents include Apple’s Siri, Microsoft’s Cortana and Google Assistant. They use the technological approaches of natural-language processing and natural-language understanding, but they don’t currently perceive human emotions. Artificial emotional intelligence (“emotion AI”) will change that. The next steps for these systems are to understand and respond to users’ emotional states, and to appear more human-like, in order to enable more comfortable and natural interaction with users.
An intelligent agent can be anything that can perceive its environment through sensors and act on that perception through actuators. Personal assistance robots (PARs), such as Qihan Technology’s Sanbot and SoftBank Robotics’ Pepper, are being “humanized” by training them to distinguish between, and react to, humans’ varying emotional states. The aim is for PARs to respond with body language and verbal responses appropriate to the emotions of the humans they interact with. If, for example, Pepper detects that someone is disappointed with an interaction, the intention is that it will respond apologetically.
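As a toy illustration of the Pepper example above, the snippet below shows one way a personal assistance robot's software might map a detected emotional state to a verbal response and an accompanying gesture. The emotion labels, phrases and gesture names are invented for illustration and are not taken from any vendor's SDK.

# Toy illustration: map a detected emotion to a response policy for a
# personal assistance robot (PAR). All labels and responses are invented.
RESPONSE_POLICY = {
    "disappointed": {"speech": "I'm sorry that didn't go as you hoped. Let me try again.",
                     "gesture": "bow_apologetically"},
    "happy":        {"speech": "I'm glad that worked out! Anything else I can do?",
                     "gesture": "nod_enthusiastically"},
    "anxious":      {"speech": "Take your time. I'll walk you through it step by step.",
                     "gesture": "open_palms_calmly"},
}
DEFAULT = {"speech": "How can I help you?", "gesture": "neutral_stance"}

def react(detected_emotion: str) -> dict:
    """Pick the verbal response and body language for the detected emotion."""
    return RESPONSE_POLICY.get(detected_emotion, DEFAULT)

print(react("disappointed"))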

Emotion AI Is Already Here
Future smart devices will be better at analyzing and responding to users’ emotions, thanks to AI systems that use deep-learning technology to measure the facial and verbal expression of emotion. These systems will play an increasingly important role in how humans interact with machines.
The first steps are already being taken. The video game "Nevermind," for example, uses "emotion-based biofeedback" technology from Affectiva to detect the player's mood and adjust its levels and difficulty accordingly. Oliver has been playing "Nevermind" on his console for 20 minutes, and the further he gets into the game, the darker the mood and the more difficult the logic puzzles become. The thriller senses Oliver's anxiety, as well as when he relaxes, and adjusts its levels based on his mood.
In another field, in-car systems are emerging that adapt the responsiveness of braking systems to the driver's perceived level of anxiety. Jeanne is already having a stressful morning: she took the kids to school after they missed the bus, and she is on her way to the doctor because Clara, her newborn daughter, is unwell. She is short-tempered and agitated at the wheel. The car detects her anxious mood and, as she approaches a busy crossroad, makes the brakes more responsive to avoid an abrupt stop.
In both cases, the video game and the car are equipped with visual sensors and AI-based emotion-tracking software to enable real-time emotion tracking.
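Both scenarios boil down to a simple control loop: capture a frame, estimate the user's emotional state, and smoothly adjust a parameter such as puzzle difficulty or brake responsiveness. The sketch below shows that loop using OpenCV for the camera feed; the anxiety estimator is a stub standing in for a trained emotion model (such as Affectiva's), which is assumed rather than shown.

# Sketch of a real-time emotion-adaptation loop: webcam frame -> anxiety
# estimate -> adjusted difficulty/brake responsiveness. The estimator is a
# placeholder for a real deep-learning emotion model (assumed, not shown).
import random
import cv2  # pip install opencv-python

def estimate_anxiety(frame) -> float:
    """Stub: a real system would run a trained facial-emotion model here."""
    return random.uniform(0.0, 1.0)

def adapt(anxiety: float, current: float, low=0.2, high=0.8, step=0.05) -> float:
    """Nudge the adaptation parameter toward the detected mood:
    higher anxiety -> harder levels / more responsive brakes."""
    if anxiety > high:
        current = min(1.0, current + step)
    elif anxiety < low:
        current = max(0.0, current - step)
    return current

def run(max_frames: int = 100) -> None:
    cap = cv2.VideoCapture(0)   # default webcam
    level = 0.5
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        level = adapt(estimate_anxiety(frame), level)
        print(f"adaptation level: {level:.2f}")
    cap.release()

if __name__ == "__main__":
    run()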
The Healthcare and Automotive Industries Are Driving Adoption of Emotion AI
Organizations in the automotive and healthcare industries are prominent among those evaluating whether, and how far, to adopt emotion-sensing features.
As the example above shows, car manufacturers are exploring the implementation of in-car emotion detection systems. "These systems will detect the driver's moods and be aware of their emotions, which, in turn, could improve road safety by managing the driver's anger, frustration, drowsiness and anxiety," said Ms. Zimmermann.
In the healthcare arena, emotion-sensing wearables could potentially monitor the mental health of patients 24/7, and alert doctors and caregivers instantly, if necessary. They could also help isolated elderly people and children to monitor their mental health. These devices will allow doctors and caregivers to monitor patterns of mental health, and to decide when and how to communicate with people in their care.
Current platforms for detecting and responding to emotions are mainly proprietary and tailored for a few isolated use cases, although the technology has also been used by many global brands in recent years for product and brand perception studies. "We can expect technology and media giants to team up and enhance their capabilities in the next two years, and to offer tools that will change lives for the better," said Ms. Zimmermann.
The full report, “Market Trends: How AI and Affective Computing Deliver More Personalized Interactions With Devices” by Annette Zimmermann, is part of the Gartner Trend Insight Report “IoT’s Challenges and Opportunities in 2017,” a collection of research focused on the key technical and business challenges that must be overcome for IoT to fulfill its promise.