Hume AI
In 1739, David Hume put forth the view that emotions drive choice and well-being. Recognizing the need to map out the emotions that animate thought and action, Hume also proposed a taxonomy of over 16 emotional states, but he lacked the scientific evidence to support it.
In 1872, Charles Darwin surveyed human emotion and presented his findings in his third major work, The Expression of the Emotions in Man and Animals. Darwin described similarities and differences in over 20 facial, bodily, and vocal expressions across species, cultures, and stages of life. He lacked the statistical methods to test his hypotheses about human emotion, but 150 years later, studies are confirming many of Darwin's observations.
In 1969, the American psychologist Paul Ekman documented six facial expressions that are universally recognized: anger, happiness, sadness, disgust, fear, and surprise. By focusing on a narrow set of behaviors, Ekman was able to use the statistical methods available to him to confirm some of Darwin's ideas.
Researchers at Hume AI explore emotions through data-driven approaches, using computational tools and large datasets to map the range of human feeling. They have collected millions of reactions to videos, music, and art; examined the brain mechanisms behind emotions; analyzed expressions in ancient sculptures; and applied deep learning to video expressions from around the world. Their work has identified over 30 dimensions of emotion.
Aiming for a fully personalizable voice AI experience that makes voice the primary way people want to interact with AI, Hume introduced its third-generation speech-language model, EVI 3. As a speech-language model, in which the same intelligence handles transcription, language, and speech, EVI 3 brings more expressiveness, realism, and emotional understanding to voice AI. And instead of being limited to a handful of speakers, EVI 3 can speak with any voice and personality you create with a prompt.