Beyond Verbal is a platform that analyzes vocal intonations in order to determine how speakers feel about what they’re saying.
While startups such as Soma Analytics have given smartphone users the power to monitor their mood through data, our latest spotting aims to make emotion detection completely automatic.
According to the company, its software is the result of 18 years of research into what it calls ‘Emotions Analytics’, and is inspired by figures suggesting that 90 percent of the meaning we communicate comes not from the words we choose. Developed with its team of “physics, neuropsychology and decision-making experts”, Beyond Verbal’s system records the voice, breaks speech down into 10- to 15-second clips and offers analysis in real time. Developers can then access this technology through an API and build it into larger applications. Beyond Verbal cites online dating, political truth-seeking and marketing as examples, and it’s easy to see how such technology could also help those with social behavioural conditions such as autism.
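The clip-based approach described above can be sketched in a few lines. This is purely an illustrative outline, assuming nothing about Beyond Verbal's actual API: the function name and the 15-second default are invented for the example.

```python
# Illustrative sketch only: divide a recording's duration into the
# 10-15 second analysis windows described in the article. This is
# NOT Beyond Verbal's API; names and defaults are assumptions.

def split_into_clips(total_seconds, clip_seconds=15):
    """Return a list of (start, end) second offsets covering the
    recording in consecutive windows of at most clip_seconds.
    The final window may be shorter if the recording does not
    divide evenly."""
    clips = []
    start = 0
    while start < total_seconds:
        end = min(start + clip_seconds, total_seconds)
        clips.append((start, end))
        start = end
    return clips

# Example: a 40-second recording yields three windows.
print(split_into_clips(40))  # [(0, 15), (15, 30), (30, 40)]
```

In a real pipeline, each window would then be sent to the emotion-analysis engine, so results can stream back while the speaker is still talking.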
The company recently raised USD 2.8 million in seed funding from angel investors and has already launched its own Moodies app to demonstrate the capabilities of its emotion recognition engine. With voice and face recognition becoming increasingly powerful fields in mobile tech, could we see more efforts to get computers to detect our more human characteristics?