The Human Side of AI: Affective Computing
Learn how machine learning can be used to tailor experiences based on emotional and physical needs.
Hi, my name is Daria Loi. I'm a principal engineer at Intel Labs. Today, I'll be talking to you about the human side of artificial intelligence, affective computing. Stay with me to learn more.
Affective computing is a capability by which devices understand human emotions and use that understanding to take action. So what can be done with emotional data? One thing affective computing can do is make us aware of our emotional state, helping us make better decisions.
Affective computing can also help us help others, or help machines make decisions that enrich our lives. There is another exciting use for emotional data: machine learning. This is where data is collected so the machine refines its understanding and, ultimately, better personalizes your experiences.
There are many domains where affective computing can help. To get you inspired, here are a few exciting areas. Imagine devices that offer personalized coaching based on your emotions, preferences, and schedule. Imagine if the environments where you live and interact could personalize your experience based on how you feel in that moment. Imagine being able to provide superior caregiving to the elderly, children, and people with limited abilities.
Now that we have covered some basics, let's talk more about how this technology works. Every day, we are surrounded by many devices with sensors capable of monitoring and understanding many personal features, like your heart rate, your face, posture, voice, or location. These sensors can describe your features through classifiers, analysis, and inference.
By fusing, classifying, and filtering your features, the system develops a high-level classifier, basically a description of what you're feeling in a given moment. For instance, happy, or irritated, or unengaged. Funny, isn't it, how much work is needed for a machine to do what we humans do every day, often instantly.
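That fuse-classify-filter flow can be sketched in a few lines of code. This is only an illustrative toy, not a real emotion-recognition system: every function name, score, weight, and threshold below is invented for the example.

```python
# Hypothetical sketch of an affective-computing pipeline:
# raw sensor readings -> per-modality scores -> fused high-level label.

def classify_heart_rate(bpm):
    # Toy classifier: map heart rate to an arousal score in [0, 1].
    return min(max((bpm - 60) / 60, 0.0), 1.0)

def classify_smile(smile_intensity):
    # Toy classifier: treat smile intensity (0..1) as a valence score.
    return smile_intensity

def fuse(scores, weights):
    # Weighted fusion of the per-modality scores into one value.
    total = sum(weights.values())
    return sum(scores[m] * weights[m] for m in scores) / total

def high_level_label(readings):
    # Classify each modality, fuse the results, then filter the fused
    # value into a coarse description of the user's state.
    scores = {
        "arousal": classify_heart_rate(readings["heart_rate_bpm"]),
        "valence": classify_smile(readings["smile_intensity"]),
    }
    fused = fuse(scores, {"arousal": 0.4, "valence": 0.6})
    if fused > 0.6:
        return "happy"
    if fused > 0.3:
        return "neutral"
    return "unengaged"

print(high_level_label({"heart_rate_bpm": 90, "smile_intensity": 0.8}))
# -> happy
```

Real systems replace each toy function with a trained model and add many more modalities, but the shape is the same: per-sensor classification, fusion, then filtering into a label.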
As it turns out, the emotional interpretations we humans make daily aren't as easy for a machine. Let me tell you why. First, emotions are hard to quantify and label. They vary across people and cultures, and are hard to capture in isolation.
Second, emotions are multimodal and demand sensor fusion. For instance, if a machine used face recognition alone to detect happiness, it wouldn't be able to tell the difference between a sarcastic smile and a happy one.
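The sarcasm example can be made concrete with a minimal sketch of why a second modality changes the verdict. The function, scores, and thresholds here are all hypothetical, chosen only to show the idea:

```python
# Toy sensor fusion: face data alone scores a happy smile and a sarcastic
# smile the same, but adding voice tone separates them (values invented).

def interpret_smile(face_smile_score, voice_positivity):
    # face_smile_score: 0..1 from a face classifier (both smiles score high)
    # voice_positivity: 0..1 from a voice-tone classifier
    if face_smile_score < 0.5:
        return "no smile"
    # A smile paired with a flat or negative tone is likely sarcastic.
    return "happy smile" if voice_positivity >= 0.5 else "sarcastic smile"

# Same facial expression, different vocal tone, different conclusion:
print(interpret_smile(0.9, 0.8))  # happy smile
print(interpret_smile(0.9, 0.2))  # sarcastic smile
```

With face data alone, both calls would look identical; the second modality is what disambiguates them.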
Third, acting appropriately on emotional data varies by context and has implications. Affective computing is a complex field because emotions are hard to label, are multimodal, and vary by context. Yet affective computing is crucial. Let me tell you why.
Technology has evolved immensely, and because of that, we have more access to our data. Not only do we want to learn more about ourselves, but companies also want to learn more about us. It is, indeed, predicted that emotional analytics will disrupt industries and create new business opportunities in the next five to ten years.
Also, to properly interact with humans, artificial intelligence must understand their state and learn from their feedback. Affective computing is the human side of AI. It is crucial for AI to succeed, and will impact many fields. Affective computing requires a multimodal approach, sensor fusion, and appropriate datasets.
Finally, only a clear understanding of people's attitudes and thresholds related to artificial intelligence will help us design a meaningful affective system that we will want in our lives.
Thanks for watching. You can access further resources on affective computing in the links below. Don't forget to like this video, subscribe to the Intel® Software YouTube* channel, and check us out on Facebook*.
Product and Performance Information
Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.