This 3-hour course on IBM’s developerWorks provides a great overview of three of the most commonly used Watson services: AlchemyAPI, Visual Recognition, and Text to Speech. It then provides a step-by-step tutorial on how to use each service and extend the functionality of Swift-based mobile apps. In this blog series, we will explore each service and tutorial so you can understand how and where it might be applicable to you.
So let’s start by looking at the Alchemy API. In the first part of the course, you will build a cognitive mobile application using Sentiment Analysis via the Alchemy API.
As you can see in the video above, this is not as difficult as it may seem. While the video shows the app built at an accelerated pace, most developers find it to be almost as easy as it looks.
As described, the Alchemy API offers a set of services that enable you to build apps that understand the content and context of text in web pages, news articles, and blogs. One of the most common use cases for cognitive applications is to build in functionality that allows for personalization based on this understanding. To achieve this, the Sentiment Analysis capabilities of the Alchemy API are used.
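To make this concrete, here is a minimal sketch of the underlying REST call, written in Python for illustration (the course itself builds this into a Swift app). The endpoint path and parameter names are assumptions based on AlchemyAPI's REST interface, so verify them against the current service documentation; the response shown is a canned sample rather than a live call.

```python
import json
from urllib.parse import urlencode

# Assumed AlchemyAPI sentiment endpoint and parameters -- verify against
# the service documentation before using. API_KEY is a placeholder.
API_KEY = "YOUR_ALCHEMY_API_KEY"
BASE = "https://gateway-a.watsonplatform.net/calls/text/TextGetTextSentiment"

params = urlencode({
    "apikey": API_KEY,
    "text": "I love how easy it is to build cognitive apps with Watson!",
    "outputMode": "json",
})
url = f"{BASE}?{params}"  # a GET on this URL would return the analysis

# A canned example shaped like the JSON the service returns, parsed
# locally here instead of making a live network call.
sample_response = '{"status": "OK", "docSentiment": {"type": "positive", "score": "0.82"}}'
doc = json.loads(sample_response)
sentiment = doc["docSentiment"]
print(sentiment["type"], sentiment["score"])  # positive 0.82
```

The app extracts the `docSentiment` type and score from the response and uses them to tailor what it shows the user.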
We start with the Alchemy API and Sentiment Analysis for two reasons:
it is an easy service to get started with quickly
it provides a foundation for using the other Watson services and building more advanced functionality into your own apps
Demonstrating Sentiment Analysis
So what kind of things can you do with Sentiment Analysis? Let's explore two demos and see for ourselves.
The first is a simple Text Analysis demo that lets you analyze a block of text or a URL and determine whether the text, and the subjects represented in it, are positive or negative. It is simple, but it provides a quick illustration of how the service works.
The second, Your Celebrity Match, takes this a bit further. It extracts content from social media feeds, then compares it against the feeds of other individuals to determine which are most similar and most different. It is fun, but more importantly it shows how you can take known data, analyze it, and use the results to shape how you interact with the people behind it.
You could use this analysis to predict behaviors and preferences: a retailer, for example, could make more personalized recommendations to its customers. And by combining it with existing data sources in your own databases, you can increase the accuracy of those predictions.
In part 2 of this blog series, we will discuss the Visual Recognition service and tutorial.
Over the past few years, we’ve seen a significant rise in popularity for intelligent personal assistants, such as Apple’s Siri, Amazon Alexa, and Google Assistant. Though they initially appeared to be little more than a novelty, they’ve evolved to become rather useful as a convenient interface to interact with service APIs and IoT connected devices.
In this post, I'll show you how to build a basic Spring app with Twitter login using Spring Social. Then we'll use Watson Tone Analyzer to determine the dominant emotion in each of the tweets on the timeline of the logged-in user. The project we will create will be similar to the Accessing Twitter Data Spring guide, but with a few modifications.
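Before wiring this into Spring, it helps to see the core step in isolation: picking the dominant emotion out of a Tone Analyzer response. The sketch below is in Python rather than the Java/Spring code of the finished app, and it parses a canned sample instead of calling the service; the JSON shape (a `document_tone` object holding a list of scored tones) is an assumption based on the Tone Analyzer v3 response format, so check the API reference for the exact fields.

```python
import json

# Canned sample shaped like a Tone Analyzer v3 response (assumed field
# names -- check the service docs). A real call would POST the tweet text
# to the service's /v3/tone endpoint with your credentials.
sample_response = '''
{
  "document_tone": {
    "tones": [
      {"tone_id": "joy", "tone_name": "Joy", "score": 0.81},
      {"tone_id": "sadness", "tone_name": "Sadness", "score": 0.12},
      {"tone_id": "anger", "tone_name": "Anger", "score": 0.05}
    ]
  }
}
'''

def dominant_emotion(response_json: str) -> str:
    """Return the tone_id with the highest score in a document-level result."""
    tones = json.loads(response_json)["document_tone"]["tones"]
    return max(tones, key=lambda t: t["score"])["tone_id"]

print(dominant_emotion(sample_response))  # joy
```

In the Spring app, the same logic runs once per tweet pulled from the user's timeline, and the winning `tone_id` is what gets displayed next to each tweet.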
The Arria Natural Language Generation APIs service is an addition to the Finance category on the IBM Cloud platform. This blog post shows you how to get started with Arria’s Natural Language Generation APIs service on the IBM Cloud platform.