
Watson Tone Analyzer API Goes GA with Improved Models

Since Watson Tone Analyzer went beta earlier this year, our team has been busy working with sponsor users to validate and improve the service. We’ve had a lot of great feedback, and we’re excited to launch a Generally Available (GA) version of Watson Tone Analyzer that includes service improvements to enable tone detection in a broader variety of communications.

Improved tone detection models

The GA service has improved textual tone detection models across the Emotion, Language, and Social categories. Specifically, the GA version of Watson Tone Analyzer has expanded context sensitivity: the service goes beyond lexical tokens to interpret tone, incorporating additional features such as punctuation, emoticons, and language parameters. As a result, the service provides more robust tone insight.
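
To make the three categories concrete, here is a minimal Python sketch that posts text to the Tone Analyzer REST endpoint and prints the score for each tone. The URL, version date, and response shape follow the v3 GA service as we understand it; the credentials are placeholders you would replace with those from your own Bluemix service instance.

```python
import requests

# Placeholder Bluemix service credentials -- substitute the ones from your
# own Tone Analyzer service instance.
TONE_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"
USERNAME = "your-service-username"
PASSWORD = "your-service-password"

def analyze_tone(text):
    """POST text to the Tone Analyzer endpoint and return the parsed JSON."""
    response = requests.post(
        TONE_URL,
        params={"version": "2016-05-19"},  # GA version date
        auth=(USERNAME, PASSWORD),
        json={"text": text},
    )
    response.raise_for_status()
    return response.json()

result = analyze_tone("I'm thrilled with how well the launch went!")

# Each category (Emotion, Language, Social) carries a list of scored tones.
for category in result["document_tone"]["tone_categories"]:
    print(category["category_name"])
    for tone in category["tones"]:
        print("  {}: {:.2f}".format(tone["tone_name"], tone["score"]))
```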

Our emotion detection models have been trained on an even larger dataset and now include more features, such as emoticons and slang words. Many of our sponsor users apply Tone Analyzer to social media posts, so it was key that our emotion models recognize tones in the style of communication commonly used on social media.

In addition, our models have been improved to handle negation. If there is negation in a sentence, Tone Analyzer picks up on this signal when interpreting the tones present in the communication:

Example Document Tone Summary

This improvement has helped improve the accuracy of our tone outputs at both the sentence and document levels.
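
One quick way to see the negation handling at work is to score a sentence with and without a negating word and compare the emotion scores. The snippet below builds on the analyze_tone helper from the sketch above; exact scores will vary with the model.

```python
def emotion_scores(text):
    """Return a {tone_name: score} map for the Emotion category only."""
    result = analyze_tone(text)  # helper from the sketch above
    for category in result["document_tone"]["tone_categories"]:
        if category["category_id"] == "emotion_tone":
            return {t["tone_name"]: t["score"] for t in category["tones"]}
    return {}

# With negation handling, these two sentences should score very differently,
# e.g. higher Joy for the first and higher Sadness for the second.
print(emotion_scores("I am happy with this service."))
print(emotion_scores("I am not happy with this service."))
```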

Watson Tone Analyzer: transforming the customer care space

We’ve seen many creative applications of Watson Tone Analyzer, ranging from Connectidy using it in relationship science to fashion house Marchesa integrating the technology into their ‘cognitive dress.’ One area where we see Tone Analyzer unlocking significant value is customer care. Tone detection can considerably enhance our understanding of users’ emotional and social states, and understanding a user’s tone can be pivotal in improving interactions. Below we’ve highlighted two innovative examples:

Removing the ‘Tone Deafness’ in Digital Agents

As it becomes more common for humans to interact with machine conversational agents, it becomes even more important to ensure that these interactions don’t lose that ‘human touch.’ Next IT is addressing this challenge with its intelligent conversational interfaces and Watson Tone Analyzer. Next IT is focused on developing tools and methods to improve machine conversational agents, and is using Tone Analyzer to enhance these interactions by better understanding users’ feelings and needs:

Next IT Conversational Interface

The insight from Tone Analyzer is used to provide a personalized, tone-appropriate Virtual Assistant interaction that resonates with each user, at any time.
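
Next IT’s implementation is proprietary, but the general pattern is straightforward to sketch: detect the dominant emotion in a user’s message and route the reply to a matching response style. The threshold and style mapping below are illustrative assumptions, and emotion_scores is the helper defined earlier.

```python
def dominant_emotion(text, threshold=0.5):
    """Return the highest-scoring emotion tone if it clears the threshold."""
    scores = emotion_scores(text)  # helper defined earlier
    if not scores:
        return None
    tone, score = max(scores.items(), key=lambda kv: kv[1])
    return tone if score >= threshold else None

# Illustrative mapping from detected tone to a virtual assistant's reply style.
RESPONSE_STYLE = {
    "Anger": "apologetic",
    "Sadness": "empathetic",
    "Joy": "upbeat",
}

def pick_style(user_message):
    return RESPONSE_STYLE.get(dominant_emotion(user_message), "neutral")

print(pick_style("This is the third time my order has arrived broken!"))
```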

Understanding tone to improve call center experience

iQVentures is a contact center intelligence company that has developed a platform to extract more valuable insights from recorded contact center calls, as seen in the dashboard below:

Snapshot of iQVentures’ Contact Center Dashboard

Using Watson Tone Analyzer, iQVentures was able to add a tone filter to its platform, enabling customers to search for calls with certain tone levels. By keying in on calls that indicate high levels of anger, sadness, or disgust, customer service teams can identify specific calls that require additional attention. This capability has enabled iQVentures’ customers to drive faster conflict resolution, reduce repeat calls, and ultimately improve customer satisfaction.
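
iQVentures’ platform itself isn’t public, but a tone filter of this kind reduces to scoring each transcript and keeping the calls that exceed a threshold on the target tones. The records and threshold below are illustrative, and emotion_scores is the helper defined earlier.

```python
# Illustrative call records; in practice these would be transcripts of
# recorded contact center calls.
calls = [
    {"id": "call-001", "transcript": "I am furious. This is unacceptable."},
    {"id": "call-002", "transcript": "Thanks so much, that fixed everything!"},
]

def flag_calls(calls, tones=("Anger", "Sadness", "Disgust"), threshold=0.6):
    """Yield (call id, scores) for calls scoring high on any target tone."""
    for call in calls:
        scores = emotion_scores(call["transcript"])  # helper defined earlier
        hits = {t: s for t, s in scores.items() if t in tones and s >= threshold}
        if hits:
            yield call["id"], hits

for call_id, hits in flag_calls(calls):
    print(call_id, hits)
```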

Try out Watson Tone Analyzer

Try out Watson Tone Analyzer today and build your own creative Tone Analyzer apps. Check out the Watson Tone Analyzer demo to get a quick overview of the service. We would love to hear from you! Leave your comments below.


The team responsible for Tone Analyzer includes Hernan Badenes, Richard Gabriel, Pritam Gundecha, Shubhanjan Shekhar, Zhe Liu, Rama Akkiraju, Jalal Mahmud, Vibha Sinha, Steffi Diamond, Desiree Garcia, and Alexis Plair. Alisha Lehr is the lead product manager.
