
Can AI make significant progress in understanding emotion? Hume receives $50M to support this belief

A new startup called Hume AI has recently made waves in the tech world by announcing that it has raised $50 million in a Series B round of funding. The round was led by EQT Ventures, with participation from Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures, and LG Technology Ventures. Hume AI aims to revolutionize the field of artificial intelligence by focusing on understanding and responding to human emotion.

Founded by CEO Alan Cowen, a former researcher at Google DeepMind, Hume AI sets itself apart from other AI model providers and startups by building an AI assistant and an API that understand human emotion. Unlike text-based chatbots, Hume AI’s assistant uses voice conversation as its interface, analyzing the user’s intonation, pitch, pauses, and other vocal features to detect their emotions and respond appropriately.
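
To give a rough sense of what "analyzing intonation, pitch, and pauses" can look like in practice, here is a minimal Python sketch using the open-source librosa audio library. It illustrates the general technique of summarizing prosodic features from a speech clip; it is not Hume AI's actual pipeline, and the thresholds and feature names are illustrative assumptions.

```python
# Illustrative sketch of prosodic feature extraction; not Hume AI's pipeline.
import numpy as np
import librosa

def prosodic_features(path: str) -> dict:
    """Summarize pitch, pitch range, voicing, and pausing from a speech clip."""
    y, sr = librosa.load(path, sr=16000)

    # Fundamental frequency (pitch) track; NaN where a frame is unvoiced.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    voiced_f0 = f0[~np.isnan(f0)]

    # Non-silent intervals give a rough estimate of speech vs. pause time.
    speech_intervals = librosa.effects.split(y, top_db=30)
    speech_time = sum(end - start for start, end in speech_intervals) / sr
    total_time = len(y) / sr

    return {
        "mean_pitch_hz": float(np.mean(voiced_f0)) if voiced_f0.size else 0.0,
        "pitch_range_hz": float(np.ptp(voiced_f0)) if voiced_f0.size else 0.0,
        "voiced_ratio": float(np.mean(voiced_flag)),
        "pause_ratio": float(1.0 - speech_time / total_time),
    }
```

Features like these (or learned equivalents inside a neural network) are the raw material an emotion-aware voice system can reason over.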

Understanding human emotion is a complex task for an AI assistant. Hume AI goes beyond simple emotions like happiness, sadness, anger, and fear. The startup lists 53 different emotions that its AI is capable of detecting, including admiration, adoration, amusement, anxiety, awe, boredom, calmness, confusion, and many more. By developing AI models that can understand and express these nuanced emotions, Hume AI aims to provide better AI experiences for users.

According to Cowen, vocal cues are essential for emotional intelligence: studies show that vocal modulations and the tune, rhythm, and timbre of speech convey more about our preferences and intentions than language alone. By analyzing these vocal cues, Hume AI’s models can predict human preferences and outcomes more accurately, leading to more effective interactions with users.

Hume AI’s empathic voice interface (EVI) detects emotions from vocal changes. Its deep neural networks were trained on controlled experimental data from hundreds of thousands of people worldwide: audio and photo recordings of large-scale emotional expression, including vocal bursts and facial expressions, paired with human intensity ratings of those expressions.
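
In machine-learning terms, "training on human intensity ratings" resembles multi-output regression: the model maps audio-derived features to a continuous score for each emotion. The sketch below shows that setup with scikit-learn on synthetic stand-in data; the feature vectors, emotion list, and model choice are placeholders, not Hume AI's models or data.

```python
# Toy multi-output regression from audio features to per-emotion intensities.
# Data, features, and emotion labels are synthetic placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

EMOTIONS = ["admiration", "amusement", "anxiety", "awe", "boredom", "calmness"]

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))                      # stand-in audio features/embeddings
y = rng.uniform(0, 1, size=(1000, len(EMOTIONS)))    # stand-in intensity ratings in [0, 1]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = Ridge(alpha=1.0).fit(X_train, y_train)       # Ridge handles multi-output targets
pred = model.predict(X_test[:1])[0]

for emotion, score in zip(EMOTIONS, pred):
    print(f"{emotion}: {score:.2f}")
```

A production system would replace the linear model with deep networks and real rated expression data, but the input-to-intensity framing is the same.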

The startup offers several APIs that enterprise customers can use to build their own applications. The Expression Measurement API measures facial expressions, vocal bursts, and emotional language; the Empathic Voice Interface API provides a voice assistant that can detect and respond to user emotions; and the Custom Models API lets customers train Hume AI models tailored to their own datasets.
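
For a sense of how an enterprise might call an expression-measurement style API over HTTP, here is a generic Python sketch using the requests library. The endpoint URL, header name, and payload fields are placeholders and assumptions for illustration only; the real API surface and official SDKs are described in Hume AI's documentation.

```python
# Generic sketch of calling an expression-measurement style HTTP API.
# Endpoint, header, and payload fields below are placeholders, not Hume AI's
# documented API; consult the provider's docs for the real interface.
import requests

API_KEY = "YOUR_API_KEY"                                   # issued by the provider
ENDPOINT = "https://api.example.com/v0/expression/batch"   # placeholder URL

payload = {
    "urls": ["https://example.com/sample_audio.wav"],  # media to analyze
    "models": {"prosody": {}},                         # request vocal-expression scores
}

response = requests.post(
    ENDPOINT,
    headers={"X-Api-Key": API_KEY},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())  # per-emotion scores; exact shape depends on the provider
```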

While the technology developed by Hume AI is impressive, it raises ethical questions about how users’ emotions are read and acted upon. The company has established The Hume Initiative, a non-profit organization that brings together social scientists, ethicists, cyberlaw experts, and AI researchers to maintain concrete guidelines for the ethical use of empathic AI. These guidelines aim to prevent the exploitation and manipulation of users’ emotions and to ensure that AI serves objectives aligned with well-being.

Despite these concerns, Hume AI’s EVI demo has received rave reviews from tech workers, entrepreneurs, and early adopters. Many have expressed admiration for the naturalistic and advanced capabilities of the technology. Hume AI may have set a new standard in human-like interactivity, intonation, and speaking qualities for AI assistants.

While it remains to be seen whether Hume AI will attract partnerships or acquisition interest from larger entities such as Amazon or Microsoft, its technology has already caught the attention of industry leaders. With its focus on understanding and responding to human emotion, Hume AI is at the forefront of the AI revolution, aiming to provide more realistic and satisfying customer experiences across industries.