Beyond Verbal’s Voice-Driven Emotions Analytics Technology Positively Impacts Customer Satisfaction in Real-Time Communication

A current trend in the technology space and business media is the conversation surrounding Artificial Intelligence (AI), and the discussion about whether it will soon replace real human interaction. One aspect lacking from this discussion, however, is a dialogue on how AI can dramatically improve business interactions, specifically in client and customer relations. With this in mind, TM Forum developed a catalyst focused on Sentimental Apps within its Catalyst Program, a rapid-fire, member-driven proof-of-concept initiative in which global enterprises create innovative solutions to industry challenges. The catalyst comprised five companies: Bell Canada, Amdocs, Microsoft, Beyond Verbal and CallVU, all focused on increasing customer satisfaction, increasing revenue, and improving call resolution while reducing customer churn and cost to serve.


To provide more personalized and contextual customer engagement, the catalyst demonstrated the value of leveraging cognitive computing technologies and Emotions Analytics in voice- and text-based customer support. To achieve these objectives, the five companies collaborated to build a customer service representative dashboard that displays detected customer emotion (powered by Beyond Verbal) and customer intent, in order to drive the next best action and offer the customer the best possible service across assisted and unassisted channels.

Beyond Verbal contributed to the catalyst its voice-driven Emotions Analytics, a technology that analyzes vocal intonations to detect a speaker’s mood in real time. Its main task was to increase customer satisfaction and improve call resolution by enabling representatives to understand the customer’s emotional state in real time, while simultaneously empowering emotion-aware next best actions to achieve the best customer interaction outcome.
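The release does not describe how the dashboard translates a detected emotion and intent into a recommendation, but the idea can be sketched as a simple lookup. Everything below is an illustrative assumption: the function names, the emotion and intent labels, and the rule table are hypothetical, not Beyond Verbal’s or the catalyst’s actual API.

```python
# Hypothetical sketch: mapping a detected customer emotion and intent to a
# "next best action" suggestion shown on the agent dashboard. The labels and
# the rule table are invented for illustration only.

NEXT_BEST_ACTION = {
    ("frustrated", "billing_dispute"): "Apologize, then offer a billing review",
    ("frustrated", "cancel_service"): "Escalate to a retention specialist",
    ("calm", "upgrade_inquiry"): "Present premium plan options",
}

def suggest_action(emotion: str, intent: str) -> str:
    """Return a suggested next best action, with a safe fallback."""
    return NEXT_BEST_ACTION.get(
        (emotion, intent),
        "Continue standard script and re-check emotion on the next utterance",
    )

print(suggest_action("frustrated", "cancel_service"))
```

In practice such a mapping would be learned or far more nuanced; the sketch only shows the shape of the emotion-plus-intent-to-action step the dashboard is described as performing.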

Beyond Verbal’s ability to extract, decode, and measure human moods in real time, as the call is taking place, introduces a new dimension of emotional understanding called Emotions Analytics. The technology builds on more than 20 years of work by an interdisciplinary team of physicists, operations researchers, decision-making specialists, and neuropsychology researchers, drawing on over 3 million voice samples in more than 40 different languages.


The key takeaway from the catalyst was that customer experience was enhanced when Artificial Intelligence and vocal analytics were used to inform and positively shape the conversation flow. Bringing sentiment into the context of voice interactions will be pivotal to the future of customer service, especially for agent guidance and human-assisted interactions.

About Beyond Verbal

Since its launch in 2012, Beyond Verbal has been using voice-driven emotions AI to dramatically change the way we can detect emotions and reveal health conditions. The only input needed is the human voice, making the technology non-intrusive, passive and cost-effective. Beyond Verbal’s technology has been developed through ongoing research into the science of emotions that began in 1995. By combining its patented technology with proprietary machine learning-based algorithms and AI, Beyond Verbal is focused on enabling devices to understand our emotions and health.

Download proof of concept here