Israel’s Artificial Intelligence Startups
The artificial intelligence industry is expected to be worth $59.8 billion by 2025, and the term AI has become ubiquitous worldwide, the obsession of many tech enthusiasts and a topic of dinner-table conversation. But the field largely lives up to the hype: AI startups are flush with VC cash, and even key corporate leaders are actively using the technology to add value and gain a competitive edge.
Feared by some and praised by others (tech mogul Elon Musk believes AI could be a catalyst for the next world war, while AI guru Andrew Ng has compared it to electricity's transformative role in economic development), AI is expected to fuel a technology revolution. And Israel, not surprisingly, is well placed among the world leaders driving it.
In an effort to track the state of AI in Israel, I researched and analyzed Israeli startups that employ artificial intelligence technology. The accompanying infographic maps over 430 Israeli startups that use AI as a core part of their offering, ranging from early- to late-stage funded companies. The startups are separated into eight categories: Technologies, Sectors, Industrial, Automotive, Enterprise, Healthcare, Fintech, and Marketing, and the technology is used for a myriad of purposes.
Insights
In the following post, I’ll dive deeper into analytics on Israel’s AI startup ecosystem. For now, here’s a top 10 summary of my findings:

  1. Funding for Israeli AI startups is heating up. For 2017 year-to-date, Israeli AI startups have raised $837 million, which already exceeds the full-year total for 2016 and represents a fifteen-fold increase over the last five years.
  2. The number of startups using AI technology has nearly tripled since 2014, and based on a three-year average growth rate, 95 new AI startups are founded each year in Israel. Almost 300 startups were founded between 2014 and today, representing over 60% of all Israeli AI startups.
  3. Most of the new AI startups founded since 2014 are in the Marketing and Enterprise sectors. The Enterprise AI market is valued at $31 billion and growing by 64% annually. By 2020, more than 40% of new IT expenditures are expected to go toward AI solutions. Last year, a survey found that 38% of enterprises are already using AI, growing to 62% by 2018.
  4. The majority of startups are business-facing — 85% are classified as B2B, while 15% are B2C.
  5. Among the AI technologies utilized, 57% of startups use machine learning techniques, 15% use deep learning, 7% use natural language processing, and 4% use computer vision.
  6. The top 10 most active investors to date in the Israeli AI space are: Microsoft Accelerator, Office of the Chief Scientist of Israel, JVP, Nielsen Innovate, OurCrowd, Magma Venture Partners, UpWest Labs, Aleph, Glilot Capital Partners and Horizons Ventures.
  7. More than half of the active startups have raised $500,000 or less in total funding, and the average company is 2.8 years old.
  8. The Marketing, Enterprise, Healthcare and Fintech sectors have the largest concentration of growth/later-stage startups. Fraud- and insurance-focused startups dominate the Fintech sector, together representing 63% of it. Among growth-stage startups, the Enterprise sector has received the majority of funding ($760 million), mostly represented by cyber security startups.
  9. There has already been $1.0 billion in total exits, and the average exit deal size is $72 million. Based on the available data, Israeli AI startups are exiting at an average multiple of 8.2 times their total funding, ranging from a low of 2.0 to a high of 20.0 times. The time lag from founding to exit takes 5.4 years on average.
  10. Half of the exits occurred in the last two years, and half of the exits were Marketing related startups. Corporate giants like Apple, Microsoft and Google are actively competing to gobble up AI startups and attract the best talent.

The AI space in Israel is certainly growing and even leading the way in some fields of learning technologies. Stay tuned for a deeper dive into each sector as well as profiles on notable startups in the following posts.
Feel free to download and share the infographic. To download the excel file with the list of startups and their corresponding details, send an email to danielsinger27@gmail.com.

https://medium.com/@danielsinger27/the-artificial-intelligence-industry-is-expected-to-be-worth-59-8-965535c9958e

Improve your user engagement metrics with emotional analytics

Find out how new emotion analytics algorithms are bringing feeling, emotional connection, and engagement to modern apps.

Most app development focuses on what an application is supposed to do. However, there is growing interest in crafting applications that can respond to how we feel by weaving in emotion-aware signals gleaned from faces, voices, and text. At the O’Reilly Design Conference in San Francisco, Pamela Pavliscak, CEO of SoundingBox, explored how applications that sense our emotions are already changing the way we relate to technology and to other people.

Cameras are already being used to capture emotional expressions on faces, microphones can analyze the emotional tone of conversations, and sentiment analysis techniques are used to make sense of how people are feeling on social media. In addition, new edge devices with sensors gather data about our heartbeat, brain waves, and the electrical conductivity of our skin to make even more nuanced assessments of emotion. This was once the realm of self-hackers and high-budget marketing teams, but now it is starting to go mainstream. Pavliscak said, “With Feelings 2.0, we are told we will get a new wave of technology that can read emotions and this will lead to a more emotional connection.”

Apple, for instance, recently acquired Emotient, which analyzes emotional expression in faces. This could help bring emotion recognition capabilities to iOS. In addition, both Microsoft and IBM now offer a suite of emotion analytics capabilities as part of their cloud offerings.


Who is using emotional analytics?

A number of companies have developed APIs for recognizing emotions expressed in faces. These techniques are based on the research of Paul Ekman, who identified five universal emotion patterns expressed across all cultures, popularized in the movie Inside Out. The first generation of these tools used cameras in stores to anonymously analyze the emotional impact of new products. Now developers are starting to use facial expression analytics to improve game play.

Developers are also starting to incorporate emotional sensing into a new generation of physical devices. Microsoft ran a research project called MoodWings, which alerts users to stress through the flapping of butterfly wings. A group of MIT researchers developed the pplkpr app to analyze and suggest responses to people who stress you out. The app correlates changes in heart rate variability with data about email and social media interactions. These early implementations can be enlightening but not always useful. Pavliscak found, for example, that interactions with her husband made her both the happiest and the angriest.
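
As a rough illustration only (not pplkpr's actual implementation), the short Python sketch below shows the kind of pairing the article describes: correlating per-contact heart-rate-variability readings with interaction counts. All contact names and numbers are made up.

from statistics import correlation  # Python 3.10+

# Illustrative sketch only: correlate heart-rate-variability (HRV) samples
# with interaction volume per contact. Lower HRV generally indicates more
# stress, so a strong negative correlation flags a potentially stressful contact.
hrv_by_contact = {
    "alex":   [62, 55, 48, 51],   # HRV (ms) measured around interactions with this contact
    "sam":    [70, 72, 69, 74],
    "jordan": [58, 60, 44, 47],
}
interactions_per_day = {
    "alex":   [9, 7, 12, 10],     # email/social interactions on the same days
    "sam":    [2, 3, 2, 4],
    "jordan": [6, 5, 11, 9],
}

for contact in hrv_by_contact:
    r = correlation(hrv_by_contact[contact], interactions_per_day[contact])
    print(f"{contact}: correlation between HRV and interactions = {r:+.2f}")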

These tools are still in their early stages. Developers and designers have a lot of problems to solve before emotional analytics techniques are integrated into compelling and engaging apps. Part of the problem lies in developing a richer vocabulary for describing emotions. Companies and app developers all want to create products that bring happiness, but what does that really mean? Pavliscak observed, “In English we don’t have that many words for happiness. In my own research of 1,000 people, what turned out to be happy was really complicated. It includes a lot of emotions.”

Some kinds of “happiness” and delight can drive people away in the wrong context. In one study of a retirement-planning site, Pavliscak found that whimsical mascots annoyed seniors who were more concerned with making better financial decisions than with being amused. In the long run, Pavliscak believes we may start weaving emotional intelligence into applications, but doing so will require richer models of how we feel in order to have a meaningful impact.

http://www.theserverside.com/feature/Emotional-analytics-a-new-approach-to-user-engagement

The next big thing in AI, emotional intelligence, could give hospitals a competitive edge

But some big questions need to be answered as tools like Siri and Alexa start playing a role in the patient journey, says one expert.

As Amazon’s Alexa makes “herself” comfortable in more and more homes, she and similar artificial intelligence technologies could soon be having an impact on hospitals.

AI-based virtual assistants are evolving quickly, and more and more effort is being put into making them emotionally intelligent – able to pick up on subtle cues in speech, inflection or gesture to assess a person’s mood and feelings.

The ways that could impact wellness and healthcare are intriguing. By reading into vocal tone, AI platforms could perhaps detect depression, or potentially even underlying chronic conditions such as heart disease.


For example, a Tel Aviv-based startup called Beyond Verbal is working on analytics tools that could work with Alexa et al. to gain insight into behavioral and vocal patterns.

“In the not so far future, our aim is to add vocal biomarker analysis to our feature set enabling virtual private assistants to analyze your voice for specific health conditions,” said the company’s CEO Yuval Mor in June.

In the nearer term, hospitals looking to realize the benefits of AI and EI need to think hard about where and how they’ll deploy the technology as it continues to mature, said Anthony Chambers, director in the life sciences practice at Chicago-based consultancy West Monroe Partners.


The use cases for AI in healthcare are many and varied. Voice-enabled virtual assistants can help clinicians access notes or let surgeons see safety checklists. They can help staff handle coding and transcription chores. Smart deployments of the technology hold the potential for big gains in hospital efficiency.

“Hospitals have realized they’re sitting on mounds of data,” said Chambers. “The past few years, they’ve been starting to take the next steps with narrative science, natural language generation and other machine learning technologies to give them a competitive advantage. We’re seeing our clients make a lot of progress on identifying and predicting where efficiencies could be found in the patient care journey.”

Lots of hospitals are now using AI and machine learning to “predict where issues are: where they can get higher throughput, where they can see more efficiency in their care management,” he said. “They can measure in real-time how they’re doing, how can they gauge capacity, where is the slack in the system.”

But in the years ahead it may be patients themselves who could be spending the most time with the AI platforms – and that’s where emotional intelligence begins to take on more importance.

“What gets really fascinating – we have yet to see it, but we’re seeing discussions of it – is potential uses around the quality of care,” said Chambers. “That remains an untapped potential where the promise of emotional intelligence, in combination with AI, could play out.”

Hospitals and pharmaceutical companies are starting to explore how the platforms could help clinical trial management, for example: “We already know of one client that is doing a proof of concept to support clinical trials, at the intersection point between provider and pharma,” he said.

Natural language processing tools could help with gathering data and predicting outcomes, “giving almost real-time feedback to the physician and the drug company at the same time,” said Chambers.

That’s especially useful given how stressful clinical trials can be for the patient. Offering a less intrusive way to communicate results to both the provider and the drug company could be a boon.

“If we could use an interactive bot, where the patient then has a point of conversation via smartphone or something, that could be a game changer because of the challenge of clinical trials being so stressful on the population, and the expense of running the trials,” he said.

Chambers said he’s also seeing more and more providers starting to “dip their toes into automating the bookends of the patient journey” – intake and discharge.

“Being able to potentially monitor the intake with a human in the room, but also an Alexa-type unit listening to the conversation and also hearing the stress or anger or fear in a patient’s voice, that may throw up real-time prompts that the human can then put forward,” he said. “That use case has been kicked around, a way to support the intake process.

“Think about facial expressions, gestures, pace and tenor of speech,” he added. “If you factor those pieces into what a chatbot or robot or other interaction point with a human, that becomes an indicator or piece of data that artificial intelligence and big data algorithms could use to assess outcomes.”

As hospitals increasingly look to forward-leaning implementations such as these, there are some important questions CIOs and other IT professionals should be keeping in mind, said Chambers.

For those organizations looking to use AI and EI to help with customer experience or quality of care efforts, “I think the first question hospitals or clinicians are going to have to decide is how will it support that care journey,” he said.

Will it displace human interaction, or just augment it? And if it does displace it, where is it going to support the patient, and how are you going to use that communication?

“Because you really are changing the paradigm, potentially, of how you’re going to interact with the patient,” said Chambers. “Where on the patient journey do you see a need?”

Assuming those questions are ironed out and the implementation is complete, there’s another important thing to consider, he said: “What are you going to do with that data? It’s patient-level data, it’s real-time. Does that get then folded into an electronic health record? Do you tag it with social media? Does claims data use it? How does it get integrated?”

The issue of privacy alone “is a little daunting,” said Chambers. “How do you manage a patient’s emotional quotient? I don’t know. These are problems we’re grappling with. But hospitals and healthcare companies have an opportunity to lead in this space.”

http://www.healthcareitnews.com/news/next-big-thing-ai-emotional-intelligence-could-give-hospitals-competitive-edge

Daily API RoundUp: Gavagai, JW Player, cdnjs, Beyond Verbal

Every day, the ProgrammableWeb team is busy, updating its three primary directories for APIs, clients (language-specific libraries or SDKs for consuming or providing APIs), and source code samples. If you have new APIs, clients, or source code examples to add to ProgrammableWeb’s directories, we offer forms (APIs, Clients, Source Code) for submitting them to our API research team. If there’s a listing in one of our directories that you’d like to claim as the owner, please contact us at editor@programmableweb.com.

Thirty more APIs have been added to the ProgrammableWeb directory in categories including Real Estate, Messaging, and Payments. Highlights today include the Beyond Verbal API for analyzing emotions in speech and the Plume API for retrieving air quality data. Here’s a rundown of the latest additions.

APIs

Beyond Verbal provides analysis of human emotions based on vocal intonations. The Beyond Verbal Emotions Analytics API programmatically understands and analyzes emotions for the developer’s voice-powered device, system, or application. This service can determine a recorded voice’s valence, which indicates positivity or negativity; arousal, which measures the speaker’s level of alertness or stimulation; temper; and mood group (emotional state). The Beyond Verbal API is listed under the Sentiment category. See ProgrammableWeb’s complete list of Sentiment APIs.

Demo of Beyond Verbal decoding emotions of Steve Jobs as he is interviewed. Video: YouTube Beyond Verbal

CapchainX provides a cryptocurrency and blockchain platform for managing equity and trades, plus an API for third-party integration. The Capchain API and documentation are available to account holders. It is listed in the Blockchain category. See ProgrammableWeb’s complete list of Blockchain APIs.

TargetEveryOne is a platform for creating, designing, distributing, and analyzing mobile campaigns and landing pages. It is a content management system (CMS) for designing digital ads that provides a drag-and-drop online solution for mobile campaigns. We have added four APIs to our Campaigns category for integrating with the TargetEveryOne platform.

TargetEveryOne Template System API lets users build predefined templates for getting customers to engage with offers, such as an opt-in form on a webpage.

TargetEveryOne CRM System API manages contacts in the TargetEveryone system.

TargetEveryOne Distribution System API allows users to distribute a campaign to one or many contacts via SMS or Email.

TargetEveryOne Analytics and Statistics System API gives access to the statistical data that a campaign generates.

Plume Labs is an environmental technology company with a platform for helping users understand air quality data. The platform is powered by Artificial Intelligence and offers the Plume API, which uses machine learning and atmospheric science to provide air quality data and hourly forecasts. Developers can use the API to get live and forecasted UV and pollution data including particulate matter (PM10 and PM2.5), nitrogen dioxide (NO2) and ozone (O3). The API is listed under the Environment category. See ProgrammableWeb’s complete list of Environment APIs.

 

Get air quality data with the Plume API. Image: Plume Labs
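
As a hedged sketch of how a developer might consume this kind of data, the Python snippet below requests hourly pollutant readings. The host, endpoint path, parameter names, and token are placeholders invented for illustration; the real routes and authentication scheme are defined in Plume Labs' API documentation.

import requests

# Hedged sketch: fetching pollutant levels from the Plume API. The endpoint
# path and parameter names below are hypothetical placeholders; consult the
# Plume Labs API documentation for the real routes and authentication scheme.
API_TOKEN = "YOUR_PLUME_TOKEN"                    # hypothetical credential
BASE_URL = "https://api.example-plume.test/v1"    # placeholder host, not the real one

resp = requests.get(
    f"{BASE_URL}/air-quality/forecast",
    params={"lat": 48.8566, "lon": 2.3522, "token": API_TOKEN},
    timeout=10,
)
resp.raise_for_status()
forecast = resp.json()

# The article lists PM10, PM2.5, NO2 and O3 among the available pollutants.
for hour in forecast.get("hourly", []):
    print(hour.get("time"), {k: hour.get(k) for k in ("PM10", "PM25", "NO2", "O3")})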

Elanders Print and Packaging provides printing, packaging and web-to-business services. The Elanders Business Connect API with Drop-Shipping Service allows users to print personalized content and send it directly to customers as a white-label solution. The API is listed under the Printing category. See ProgrammableWeb’s complete list of Printing APIs.

The JW Player is a platform aimed at media professionals for the transmission of video content. It provides tools for implementing audience growth strategies and monetizing content with ads. The JW Player Platform Management API supports the programmatic modification of libraries and simplifies user connectivity. The JW Player Delivery API allows users to accelerate content transmission to destination sites or apps by bundling responses. Apart from generating JSON and RSS feeds for single content items and playlists, the API also facilitates the transmission of images, single-line player embeds, and tracks. Both JW Player APIs are listed under the Video category. See ProgrammableWeb’s complete list of Video APIs.
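
As a brief, hedged example of pulling a feed from the Delivery API, the sketch below uses the commonly documented cdn.jwplayer.com/v2/playlists/<id> URL pattern; the playlist ID is a placeholder, and the exact response fields should be confirmed against JW Player's documentation.

import requests

# Hedged sketch of retrieving a JSON feed from the JW Player Delivery API.
# PLAYLIST_ID is a placeholder value.
PLAYLIST_ID = "YOUR_PLAYLIST_ID"

resp = requests.get(f"https://cdn.jwplayer.com/v2/playlists/{PLAYLIST_ID}", timeout=10)
resp.raise_for_status()
feed = resp.json()

# Each playlist entry describes one media item with its title and media ID.
for item in feed.get("playlist", []):
    print(item.get("title"), "-", item.get("mediaid"))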

cdnjs is an open source, community-driven JavaScript and CSS directory and CDN (content delivery network). The cdnjs API allows developers to query cdnjs’s front-end CDN resources. This community project provides a way to organize front-end web development resources and deliver them to developers over a fast CDN infrastructure, without usage limits or fees. This API is listed under the Directories category. See ProgrammableWeb’s complete list of Directories APIs.
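
As a small example of querying the cdnjs API, the sketch below searches the libraries endpoint; the search and fields parameters follow the commonly documented form, but check the current cdnjs API docs for specifics.

import requests

# Sketch of querying the cdnjs API for front-end libraries.
resp = requests.get(
    "https://api.cdnjs.com/libraries",
    params={"search": "jquery", "fields": "version,description"},
    timeout=10,
)
resp.raise_for_status()

# Each result includes the library name, the requested fields, and the latest CDN URL.
for lib in resp.json().get("results", []):
    print(lib.get("name"), lib.get("version"), "-", lib.get("latest"))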

The Directory of Open Access Journals (DOAJ) is a community-curated list of high-quality, peer-reviewed, open access, online journals. The DOAJ API provides access to the datastore that backs DOAJ. It is built on Elasticsearch, whose query syntax provides a way to send more advanced queries to DOAJ. The DOAJ API is listed under the Open Data category. See ProgrammableWeb’s complete list of Open Data APIs.

PayUmoney is a payment gateway and billing services platform. The PayUmoney API returns data on payment inquiries. Developers can make calls over HTTP in JSON format, authenticating via a merchant key. Version 1 Swagger is available for download. The API is listed under the Payments category. See ProgrammableWeb’s complete list of Payments APIs.

The Weatherbit Historical Weather API provides hourly historical weather information based on cities, postal codes, IP addresses, and more. The API supports up to 5 years of historical data from over 380,000 cities. This data is quality controlled and corrected. The API is listed under the Weather category. See ProgrammableWeb’s complete list of Weather APIs.

Get temperature, wind, precipitation, and sea-level information via the Weatherbit Historical Weather API. Image: Weatherbit
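
As a hedged sketch of requesting hourly historical data, the snippet below follows Weatherbit's commonly documented /v2.0/history/hourly route; the API key is a placeholder, and parameter and field names should be verified against the current Weatherbit documentation.

import requests

# Hedged sketch: hourly historical weather from the Weatherbit Historical Weather API.
API_KEY = "YOUR_WEATHERBIT_KEY"  # placeholder credential

resp = requests.get(
    "https://api.weatherbit.io/v2.0/history/hourly",
    params={
        "city": "Raleigh,NC",
        "start_date": "2017-09-01",
        "end_date": "2017-09-02",
        "key": API_KEY,
    },
    timeout=10,
)
resp.raise_for_status()

# Each record is expected to include temperature, wind and precipitation readings.
for record in resp.json().get("data", []):
    print(record.get("timestamp_local"), record.get("temp"),
          record.get("wind_spd"), record.get("precip"))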

SmallStep provides learning technology based on Machine Learning and Artificial Intelligence. Its EnglishGrammar API provides grammar analysis by generating a multiple-choice exercise for each sentence to study verbs. The API can detect grammar patterns in the input text and recommend training on those specific patterns, such as the past or future tense. This API is listed under the Education category. See ProgrammableWeb’s complete list of Education APIs.

Australia’s Domain Group has launched a public API for access to Australian real estate data. The Domain Group Agencies and Listings API allows access to an extensive database of residential, commercial and business real-estate agencies and agents, along with on-market and off-market property listings. The Domain Group Content API allows access to editorial content covering news and trends across the property ecosystem. Use the API to retrieve interesting happenings in particular locations, profiles of unique buildings, property market news, the latest designer trends and more. The Domain Group Properties and Locations API allows access to Domain Group’s extensive database of property data, including functions for address lookup, demographics, sales and more.

Routee provides a platform and APIs for adding SMS, voice, email, and two-factor authentication functions to applications. The Routee API integrates those services and uses OAuth 2 as the authentication method. The API is listed under the Messaging category. See ProgrammableWeb’s complete list of Messaging APIs.

Also new to the Messaging category are the Mobivate, Semaphore, and Dragonfly SMS APIs. With the Mobivate API, developers can send messages, send a batch of messages, get routes and prices, and verify numbers. The Semaphore API integrates SMS with one line of code. And the Dragonfly SMS API is available as .NET software supporting HTTP and SOAP protocols for messaging.

Vyking provides mobile immersive advertising platforms and solutions. The Vyking API allows developers to access and integrate the functionality of Vyking with other applications. API access comes with an account. The API is listed under the Advertising category. See ProgrammableWeb’s complete list of Advertising APIs.

 

Vyking API lets developers add augmented reality ads to applications. Image: Vyking

Squarespace provides a platform for eCommerce websites, mobile apps, and other software solutions. The Squarespace Commerce API helps simplify the development of Squarespace store data management applications. API endpoints facilitate the retrieval of orders, among other ecommerce functionalities. The Squarespace Commerce API is listed under the eCommerce category. See ProgrammableWeb’s complete list of eCommerce APIs.

Broadsoft offers a business collaboration application named Team-One. The platform allows for business chat (group and private messaging), persistent workspaces, file sharing, contextual intelligence, task management, and live meetings. The Broadsoft Team-One API connects an application with Broadsoft’s workspaces in order to post chat messages, tasks, and notes. This API is listed under the Chat category. See ProgrammableWeb’s complete list of Chat APIs.

Forte provides a platform for payment solutions and a way to build scalable, secure payment applications. We’ve added four Forte APIs to our directory in the Payments category, as listed here:

Forte Checkouts API enables applications to accept payments with a few lines of code. It creates a JavaScript “Checkouts Pay Now” button whose checkout features developers can customize. Forte’s Checkouts supports the Internet Explorer, Firefox, Chrome and Safari browsers.

Forte Payments Advanced Gateway API allows applications to capture purchase information via swipe or key entry, process credit card, EFT, and recurring transactions, respond to the point-of-sale machine approving or denying the transaction, and upload completed transaction information to Forte’s Virtual Terminal application.

Forte Payments Webhooks API allows developers to add notifications for merchants regarding events during transactions such as sales transaction information, customer information, and payment method.

Forte’s SOAP API provides merchant services for managing clients, transactions, payment method tokens, searching hierarchy data and more.

Gavagai provides Natural Language Processing and Artificial Intelligence technology backed by years of research. The Gavagai API gives developers tools to make text data more intelligible. The API supports a number of different languages, each of which is actively tracked online to enable API features including sentiment extraction, currently used multi-word expressions and matching topics, information about words and their relationships through keywords, and more. The API is listed under the Natural Language Processing category. See ProgrammableWeb’s complete list of Natural Language Processing APIs.

Gavagai Living Lexicon shows similar words and other associations. Image: Gavagai

Event 21

Join in to listen to Dr. Yoram Levanon speaking at

WORLD’S BIGGEST AI ONLINE CONFERENCE FOR DEVELOPERS

AI WITH THE BEST – 100 SPEAKERS – 2 DAYS – 4 TRACKS

14-15 October 2017



Event 20

Come meet our team

and build deeper connections

at Emotion AI Summit by Affectiva

September 13, 2017
MIT Media Lab, Cambridge, MA



API Analysis Result Interpretation

Analysis Result Interpretation Guide

The UPSTREAM and ANALYSIS requests return a JSON object containing the analysis results.
The annotated example below summarizes the fields and values of the returned JSON object and their descriptions; version support notes appear in parentheses (for example, "V4 and above").

{
  "status": "success",                 The status of the request. Can be "success" or "error".
  "result": {                          The object containing the analysis results.
    "duration": "21513.25",            Duration of the voice data processed, in milliseconds.
    "sessionStatus": "Done",           Session status. Can be:
                                       "Started" – no analysis data produced yet;
                                       "Processing" – intermediate results, more analysis can be expected;
                                       "Done" – the analysis session has ended and the result contains analysis results for the whole session.
    "analysisSegments": [              The array containing the analysis segments.
      {                                First analysis segment object. The following fields are properties of the segment.
        "offset": 0,                   Offset of the segment, in milliseconds, from the beginning of the session.
        "duration": 10000,             Segment duration in milliseconds.
        "end": 10000,                  The end of the segment in milliseconds. (V4 and above)
        "analysis": {                  Analysis object. Contains the analysis values for the segment. The content shown is an example; the actual fields can vary depending on license type.
          "Temper": {                  Temper object.
            "Value": "21.00",          Value of Temper.
            "Group": "low",            Group of Temper; "ambiguous" means the value cannot be calculated for the slice. (ambiguous value: V4 and above)
            "Score": "92.00"           Confidence score of Temper (92% positive). (V4 and above)
          },
          "Valence": {                 Valence object (similar to the Temper object).
            "Value": "23.00",          Value of Valence.
            "Group": "negative",       Group of Valence; "ambiguous" means the value cannot be calculated for the slice. (ambiguous value: V4 and above)
            "Score": "94.00"           Confidence score of Valence (94% positive). (V4 and above)
          },
          "Arousal": {                 Arousal object (similar to the Temper object).
            "Value": "24.00",          Value of Arousal.
            "Group": "low",            Group of Arousal; "ambiguous" means the value cannot be calculated for the slice. (ambiguous value: V4 and above)
            "Score": "80.00"           Confidence score of Arousal (80% positive). (V4 and above)
          },
          "Mood": {                    Mood object. Contains the Mood Group objects.
            "Group7": {                Mood Group 7 object.
              "Primary": {             Primary mood of Mood Group 7.
                "Id": 7,               Id of the phrase.
                "Phrase": "Worried"    Phrase (primary Mood Group 7 phrase).
              },
              "Secondary": {           Secondary mood of Mood Group 7.
                "Id": 4,               Id of the phrase.
                "Phrase": "Frustrated" Phrase (secondary Mood Group 7 phrase).
              }
            },
            "Group11": {
              "Primary": {
                "Id": 3,
                "Phrase": "Defensivness,Anxiety"
              },
              "Secondary": {
                "Id": 7,
                "Phrase": "Loneliness,Unfulfillment"
              }
            },
            "Group21": {
              "Primary": {
                "Id": 21,
                "Phrase": "unhappiness"
              },
              "Secondary": {
                "Id": 16,
                "Phrase": "loneliness"
              }
            },
            "Composite": {             Composite Mood object.
              "Primary": {
                "Id": 274,
                "Phrase": "Painful communication. High sensitivity."
              },
              "Secondary": {
                "Id": 241,
                "Phrase": "Longing for change. Seeking new fulfillment. Search for warmth."
              }
            }
          }
        }
      },
      {                                Following analysis segment objects.
        ...
      }
    ],
    "analysisSummary": {               The analysis summary object.
      "AnalysisResult": {              The analysis summary results object.
        "Temper": {                    Temper summary object.
          "Mode": "low",               Most frequent Temper group.
          "ModePct": "100.00"          Percentage of the most frequent Temper group.
        },
        "Valence": {                   Valence summary object.
          "Mode": "negative",          Most frequent Valence group.
          "ModePct": "100.00"          Percentage of the most frequent Valence group.
        },
        "Arousal": {                   Arousal summary object.
          "Mode": "low",               Most frequent Arousal group.
          "ModePct": "100.00"          Percentage of the most frequent Arousal group.
        }
      }
    }
  }
}
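
For developers who want to work with this response programmatically, here is a minimal Python sketch that interprets a response of the shape documented above. The trimmed payload is a hypothetical example; in practice the JSON would come from the body of the ANALYSIS request's HTTP response.

import json

# Minimal sketch: interpreting an analysis response of the documented shape.
# `response_json` is a trimmed, hypothetical example standing in for the HTTP body.
response_json = """
{
  "status": "success",
  "result": {
    "duration": "21513.25",
    "sessionStatus": "Done",
    "analysisSegments": [
      {
        "offset": 0,
        "duration": 10000,
        "analysis": {
          "Temper":  {"Value": "21.00", "Group": "low", "Score": "92.00"},
          "Valence": {"Value": "23.00", "Group": "negative", "Score": "94.00"},
          "Arousal": {"Value": "24.00", "Group": "low", "Score": "80.00"},
          "Mood": {"Group7": {"Primary": {"Id": 7, "Phrase": "Worried"}}}
        }
      }
    ],
    "analysisSummary": {
      "AnalysisResult": {
        "Temper":  {"Mode": "low", "ModePct": "100.00"},
        "Valence": {"Mode": "negative", "ModePct": "100.00"},
        "Arousal": {"Mode": "low", "ModePct": "100.00"}
      }
    }
  }
}
"""

data = json.loads(response_json)
if data["status"] != "success":
    raise RuntimeError("analysis request failed")

result = data["result"]
print("session status:", result["sessionStatus"])

# Per-segment values: Temper, Valence and Arousal each carry a Value,
# a Group (which may be "ambiguous") and a confidence Score.
for segment in result["analysisSegments"]:
    analysis = segment["analysis"]
    for dimension in ("Temper", "Valence", "Arousal"):
        d = analysis[dimension]
        print(f'{segment["offset"]:>6} ms  {dimension}: '
              f'value={d["Value"]} group={d["Group"]} score={d["Score"]}')

# Session-level summary: most frequent group per dimension.
summary = result["analysisSummary"]["AnalysisResult"]
for dimension, s in summary.items():
    print(f'{dimension}: mode={s["Mode"]} ({s["ModePct"]}% of segments)')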

CFD 373 – Emotions drive everything we do; Go beyond verbal with Yuval Mor

Emotions Analytics changes the way we interact with our machines and with ourselves – forever. By decoding human vocal intonations into their underlying emotions in real time, Emotions Analytics enables voice-powered devices, apps and solutions to interact with us on an emotional level, just as humans do.

Beyond Verbal focuses on raw vocal modulations – probably the most expressive output our body produces. This makes our offering unique by being non-intrusive, continuous in nature and easy to implement. Enabling wearables and digital health applications and empowering international brands to better understand consumer brand interaction, Beyond Verbal is pioneering a brand new breed of emotionally powered devices set to change the way we do business, make decisions and manage our lives, forever.

http://cashflowdiary.com/cfd-373-emotions-drive-everything-we-do-go-beyond-verbal-with-yuval-mor/

Focused on emotion analytics, this company wants Siri and Alexa to understand your moods

Beyond Verbal was founded in Tel Aviv in 2012. Its team has decades of experience in emotion analytics research and has collaborated with well-known organizations such as the University of Chicago, the Mayo Clinic, Scripps and the Hadassah Medical Center.

In 2014, the company launched the Beyond Wellness API, which lets a user's smartphone, or any other wearable device with a microphone, act as an emotion sensor. The company's analytics technology does not analyze the actual text or context of speech; instead, algorithms created by the system detect changes in vocal range and intonation and identify emotions such as anger, anxiety, happiness or contentment, so that a user's voice samples can reveal their emotional state and overall physical and mental well-being.

By passively tracking, quantifying and reporting emotional states over time, Beyond Verbal can give users a comprehensive picture of their emotional health over a given period and help them improve it. To date, Beyond Verbal has collected more than 2.5 million "emotion-tagged voice" samples in 40 languages.

The technology is still in its infancy, however. Potential use cases include call centers analyzing the emotions in calls to improve customer relationships, or, in healthcare, helping to assess a person's mental health. Research is also under way to determine whether it can be used effectively to detect physical conditions such as heart problems.

How do you get Siri to do emotion analytics?

This kind of emotion detection could be applied in all sorts of settings: customer service for games, or dating services (helping people tell whether the other person is genuinely interested). Beyond Verbal is now working to bring its emotion analytics technology to the virtual personal assistant (VPA) space through a new developer API. While Apple's Siri and Amazon's Alexa keep improving at understanding speech, and can respond to "Alexa, play a Beatles song," they are not good at recognizing people's emotions. That is the ultimate goal of Beyond Verbal's new API: to bring "emotional intelligence" to digital assistants.

"Today's digital world is rapidly transforming the way we interact with our technology and each other," said Beyond Verbal CEO Yuval Mor. "Virtual private assistants have begun to take on a personalized experience. We are very excited to fuse together the breakthrough technology of AI and Beyond Verbal's Emotions Analytics, providing unique insight into personalized tech and remote monitoring."

How would this work in practice, and why would an Amazon Echo need to understand your mood? Given that Alexa and co. can now power third-party voice services, Beyond Verbal points to a few possible use cases: if your voice sounds down, Alexa might play upbeat music, tell you that a friend doesn't sound very happy, or recommend a feel-good movie.

It's worth noting that most of Beyond Verbal's ideas are aimed at the future, and more potentially important uses, such as healthcare, have yet to be explored. One day Siri might notice that your mood has been low for weeks; or, if current research lives up to its promise, the technology could even help detect serious physical illness. "In the not so far future, our aim is to add vocal biomarker analysis to our feature set, enabling virtual private assistants to analyze your voice for specific health conditions," Mor added.

Getting there is complicated

At a basic level, the technology still needs work. When Beyond Verbal is activated in a conversation between a virtual personal assistant and a user, it first needs 13 seconds of speech for an initial analysis and then performs an emotion analysis every 4 seconds, and this applies to every conversation. It is hard to imagine people chatting with an Amazon Echo or Apple HomePod long enough to enable emotion detection, which is why passively collecting a person's voice throughout the day is critical to the technology's success.

"At the moment, command-style conversations involve too little voice for emotion analysis," a company spokesperson told VentureBeat, adding that an additional feature set is in development that will cut the response time, though it is not yet commercially available. Another way to improve emotion detection is a broader Internet of Things approach that analyzes speech across multiple devices (wearables, phones, smart cars and so on), but that is still some way off.

Beyond Verbal is reportedly seeking a new round of funding, having raised roughly $10.8 million to date (including a $3 million round in September 2016). Bringing true emotional intelligence to AI will require a substantial investment if the company is to realize that vision.


http://www.56products.com/News/2017-6-26/IFFB6ACFC3AH622314.html

Beyond Verbal Adds Emotions to Virtual Assistants

Beyond Verbal has launched an API for virtual private assistants to help identify user emotions in real time.

Beyond Verbal, a provider of voice-driven emotions analytics, is launching a cloud-based API engine that will enable virtual private assistants (VPAs) to reveal customized recommendations based on individual moods and emotional states.

Beyond Verbal enables these assistants to understand the emotional message, context, and intent carried by users’ vocal intonations, which represent 35 percent to 40 percent of the emotions people convey in their communication. By integrating with Beyond Verbal’s API technology, virtual private assistants will now be able to understand and react to their users’ emotions simply by the tone of their voices. With Beyond Verbal, AI assistants can also change their behavior, personality, and even their tone of voice to suit the context of the conversation and the person with whom they are communicating.

Beyond Verbal’s Emotional Analytics technology takes raw voice input and analyzes it for mood and attitude. The technology requires only 10 seconds of continuous voice input to render an emotional analysis. The system then measures the speaker’s tone of voice, and the results are grouped and analyzed in real time.

“Today’s digital world is rapidly transforming the way we interact with our technology and each other. Virtual private assistants have begun to take on a personalized experience,” said Yuval Mor, CEO of Beyond Verbal, in a statement. “We are very excited for this next step in fusing together the breakthrough technology of AI and Beyond Verbal’s Emotions Analytics, providing unique insight into personalized tech and remote monitoring. In the not so far future our aim is to add vocal biomarker analysis to our feature set, enabling virtual private assistants to analyze your voice for specific health conditions.”


http://www.speechtechmag.com/Articles/ReadArticle.aspx?ArticleID=118992