Empathy has always played a pivotal role in human relations; it is the emotion that lets us confide in our closest friends and family. It is a fundamental building block of social interaction, driving cooperation and progress. Empathy has long been considered a uniquely human trait, owing to the complex interplay of reasoning and abstract thought it requires. Recent technology is challenging that assumption: machines are being taught empathy, learning to deduce mood and emotion from facial expressions and vocal tones, much as humans read each other's cues. A pioneer in this industry is the Massachusetts-based Affectiva, whose technology is used by 32 of the Fortune 100 companies and is expanding deep into the financial and telecommunications industries through empathetic robots.
What Is Affectiva?
Affectiva is a tech company founded in Massachusetts, United States, in 2009, and it dominates the current market for empathetic AI. The company started in the medical field, creating a wristband sensor called the Q Sensor, which monitored epileptic patients to help prevent heart failure.
However, the company soon became interested in emotional monitoring, attempting to bridge the gap between human and machine. Affectiva CEO Rana el Kaliouby stated, “All the nuances and all the richness of my feelings disappeared in cyberspace,” when describing the motive for her startup. She pondered, “What would it take to get our technologies and our devices to understand us, just in the same way?”
Affectiva's technology analyzes facial and written cues, vocal undertones, and other signals, with the aim of recognizing and responding to human emotions. The emotional awareness Affectiva builds rests on machine learning algorithms, natural language processing and understanding, and computer vision.
The most integral ingredient, however, is deep learning, which lets the system analyze and update its existing model, becoming more refined with each new facial or vocal encounter. Every additional face and vocal signal helps the technology learn the appropriate responses. Affectiva has analyzed 50 million data points drawn from videos and pictures of people of all ages and ethnicities, making its collection the largest emotion data repository in the world.
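The refine-with-each-new-encounter idea can be illustrated with a toy sketch. This is not Affectiva's actual method (their systems use deep neural networks trained on millions of faces); it is a deliberately simple nearest-centroid classifier on invented two-number "facial features," showing how each new labeled example nudges the model's internal representation:

```python
# Toy illustration only: a nearest-centroid "emotion" classifier that
# refines its per-emotion averages with every labeled example. The
# feature vectors and labels below are invented for demonstration.
from math import dist

class EmotionModel:
    def __init__(self):
        self.centroids = {}  # label -> running-mean feature vector
        self.counts = {}     # label -> number of examples seen so far

    def update(self, features, label):
        """Fold one labeled example into the running mean for its label."""
        if label not in self.centroids:
            self.centroids[label] = list(features)
            self.counts[label] = 1
        else:
            n = self.counts[label] + 1
            self.centroids[label] = [
                c + (f - c) / n
                for c, f in zip(self.centroids[label], features)
            ]
            self.counts[label] = n

    def predict(self, features):
        """Return the label whose centroid is closest to the features."""
        return min(self.centroids,
                   key=lambda lbl: dist(self.centroids[lbl], features))

# Hypothetical 2-D features, e.g. (mouth curvature, brow raise)
model = EmotionModel()
model.update((0.9, 0.1), "smile")
model.update((0.8, 0.2), "smile")     # the "smile" centroid shifts slightly
model.update((0.1, 0.9), "surprise")

print(model.predict((0.85, 0.15)))    # closest to the "smile" centroid
```

A real system replaces the hand-picked features with learned representations and the centroid rule with a deep network, but the principle is the same: more examples, a more refined model.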
“That’s allowed our system to learn the difference between a Japanese smile and a Brazilian smile,” Affectiva CEO Rana el Kaliouby said in an interview. Such geographic and cultural differences matter. For example, Hmong people (an Asian ethnic group in China and Southeast Asia) smile when embarrassed, an expression that could easily be misconstrued.
More Humanlike Technologies
Computing interfaces have steadily become more humanlike and seamless, as recent voice assistants such as Siri and chatbots show. The next step in this progression is understanding nuance in user responses, reacting not only to what was spoken or written but to the emotion underlying it. Affectiva adds this emotional intelligence to any interactive product. Richard Yonck, founder of Intelligent Future Consulting, said “Emotional awareness is the next natural progression” in interface intelligence.
Applications Of Affectiva
The scope of Affectiva's applications is large. Its co-founder and CEO, Rana el Kaliouby, has said, “imagine a future where every device has a little emotion chip and can read your emotions”.
Relating To Devices And Apps
Although critics and skeptics may question how large a role technology and AI should play in our lives, she adds, “The way we relate to our devices and apps, it’s becoming very relational”. And as we interact more and more with technology, the demand for approachable, friendly interfaces grows.
“Humans with higher emotional intelligence are more likable, so why shouldn’t the same be true for devices?” she continues.
Increased User Communication Experience
Affectiva has blossomed in the telecommunications world because it enhances the user's communication experience. Large companies such as Facebook, Apple, and Amazon have adopted the technology to detect emotion in users' voices and interact autonomously in response.
Affectiva Technology Adopted By “Pepper”
A humanoid robot named “Pepper” was created in 2014 by SoftBank Robotics. Equipped with a microphone and camera, it is known for its sophisticated ability to converse with humans, and specifically for detecting emotion and responding accordingly.
In 2018, Pepper adopted Affectiva's technology to enhance its emotional ability and become an empathetic robot. The British multinational bank HSBC incorporated the technology, deploying Pepper in many of its branches for customer service.
Pepper handles basic customer assistance, such as opening bank accounts. Although it might be considered a gimmick, it lays the foundation for what the future holds.
Affectiva’s Future In FinTech
Affectiva also has applications in the world of finance, where it has been used to build upon client relations. Facial analysis is used to build behavioral profiles, which help financial advisors assist clients by detecting and managing client behaviors. The technology builds trust between humans and machines because it lets people converse with machines on a more complex level, much as they would with another person.
Ethical concerns have been raised, arguing that interfaces using Affectiva's technology could constitute a privacy breach. The underlying worry is that the emotion and speech detection depends on cameras and/or microphones.
The AI Now Institute has called for a ban on facial recognition in certain cases, such as decisions that “impact people’s lives and access to opportunities,” as hiring decisions do. Furthermore, biases and social injustices can seep in when training Affectiva and other recognition technologies, and avoiding them is not straightforward.
Reading subtle nuances of emotion is an inherently subjective task, and the ability to do so has long been a separating factor between machine and human. Teaching machines empathy essentially humanizes them. It is a step that closes the gap between AI and humans, with implications both good and bad.