One surprising pattern to emerge as companies experiment with integrating AI everywhere is a reliance on yet more AI to help their many newly acquired bots understand human emotion.
PitchBook’s new Enterprise SaaS Emerging Tech Research report identifies this field as “emotion AI” and predicts that the technology is poised to grow.
The reasoning goes like this: if companies put AI assistants in front of executives and employees, and deploy AI chatbots as front-line salespeople and customer service representatives, how can an AI perform its role effectively if it can’t distinguish between an irritated “What do you mean by that?” and a puzzled “What do you mean by that?”
Emotion AI is described as the more sophisticated sibling of sentiment analysis, the pre-AI technique that attempts to extract human emotion from text-based interactions, especially on social media. Emotion AI is what you might call multimodal: it combines machine learning and psychology with sensors for visual, audio, and other inputs to detect human emotion during an interaction.
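To make the distinction concrete, classic text-based sentiment analysis can be as simple as scoring words against a lexicon. Here is a minimal sketch; the lexicon and scoring scheme are illustrative inventions, not taken from any product:

```python
# Minimal lexicon-based sentiment scorer, the kind of pre-AI technique
# "sentiment analysis" refers to. The word lists and weights are illustrative.
POSITIVE = {"great", "love", "happy", "excellent", "good"}
NEGATIVE = {"bad", "hate", "angry", "terrible", "awful"}

def sentiment_score(text: str) -> int:
    """Return (# positive words) - (# negative words) for a piece of text."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def classify(text: str) -> str:
    """Map the raw score to a coarse sentiment label."""
    score = sentiment_score(text)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Note that a text-only approach like this scores an irritated and a puzzled “What do you mean by that?” identically, which is exactly the gap that multimodal emotion AI claims to close by also looking at face and voice.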
Major AI cloud providers offer services that give developers access to emotion AI capabilities, such as Amazon Web Services’ Rekognition or the Emotion API in Microsoft Azure Cognitive Services. (The latter has drawn some debate over the years.)
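As an example of what these services expose to developers, Rekognition’s face-detection API returns per-face emotion labels with confidence scores. The sketch below parses such a response; the `response` dict is a hand-built example following the documented shape, not real API output, and an actual call (roughly `boto3.client("rekognition").detect_faces(Image=..., Attributes=["ALL"])`) would require AWS credentials:

```python
# Sketch: extracting the dominant emotion from a Rekognition-style
# DetectFaces response. The dict below is hand-written for illustration.
def dominant_emotion(face_detail: dict) -> tuple[str, float]:
    """Return the highest-confidence emotion label for one detected face."""
    emotions = face_detail.get("Emotions", [])
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

response = {  # illustrative stand-in for a real DetectFaces response
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 12.3},
                {"Type": "ANGRY", "Confidence": 71.9},
                {"Type": "CONFUSED", "Confidence": 15.8},
            ]
        }
    ]
}

for face in response["FaceDetails"]:
    label, confidence = dominant_emotion(face)
    print(f"{label} ({confidence:.1f}%)")
```

A chatbot pipeline would feed a label like this back into its dialogue logic, for instance escalating to a human agent when the dominant emotion is anger.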
Even though emotion AI is not new, and is even available as a cloud service, the sudden proliferation of bots in the workforce gives it a brighter future in business than it has ever had, according to PitchBook.
“Emotion AI promises to enable more human-like interpretations and responses with the proliferation of AI assistants and fully automated human-machine interactions,” writes Derek Hernandez, senior analyst of emerging technologies at PitchBook.
“Cameras and microphones are key components of emotion AI’s hardware side. These may be found separately in a physical location, or on a phone or laptop,” Hernandez says. Beyond those devices, wearable hardware will probably offer another avenue for emotion AI, he adds. (So if a customer support chatbot asks for camera access, this could be the reason.)
Consequently, an increasing number of startups are being founded to pursue that goal. According to PitchBook estimates, they include Uniphore (which has raised $610 million in total funding, including a $400 million round led by NEA in 2022), as well as MorphCast, Voicesense, Superceed, Siena AI, audEERING, and Opsis, all of which have raised more modest sums from various VCs.
Emotion AI is, of course, a very Silicon Valley proposition: use technology to solve a problem created by using technology on humans. But even if most AI bots do eventually gain some form of automated empathy, there is no guarantee this approach will actually work.
In fact, researchers threw a wrench into the notion the last time emotion AI was a hot topic in Silicon Valley, around 2019, when much of the AI/ML community was still focused on computer vision rather than on generative language and art. That year, a group of academics published a meta-review of prior studies and concluded that human emotion cannot be accurately inferred from facial movements. In other words, the idea that we can train an AI to recognize human emotions by having it imitate how people attempt to do so (by observing faces, body language, and vocal intonation) is somewhat flawed.
All of this collides with the AI-everywhere future that Silicon Valley is currently building at breakneck speed. These AI bots will either attempt to comprehend emotions so they can perform jobs like customer service, sales, HR, and all the other tasks humans hope to assign them, or they will not be very good at any work that requires that ability. Perhaps the workplace of the future will be dominated by AI chatbots of roughly Siri-circa-2023 caliber. Who’s to say which is worse: a bot guessing at everyone’s feelings in real time during meetings, or a management-mandated bot that can’t?
This idea could also be stymied by AI regulation, such as the European Union’s AI Act, which bans computer-vision emotion detection systems for certain uses, such as education. (State laws, like Illinois’ BIPA, also prohibit the collection of biometric data without permission.)