February 29, 2016 | Written by: Frederic Lavigne
Categorized: Compute Services | How-tos | Watson
Imagine you are attending the Cannes film festival, or visiting a capital and taking pictures. Wouldn’t it be great if, when you are about to share these pictures with your friends and followers, the app automatically proposed hashtags by interpreting the picture and identifying buildings, landmarks, and famous people?
While we wait for this capability to arrive in popular image-sharing apps, let’s build something like it with IBM Bluemix:
- To analyze the images, the IBM Bluemix catalog provides the Watson Visual Recognition and AlchemyAPI (more specifically, AlchemyVision) services from IBM Watson. You give the API an image (a URL or raw data) and in return you get a list of tags or keywords, each with a confidence score. Watson Visual Recognition can even be trained for fine-grained classification,
- For the app, we will pick iOS as our first target; this will be an opportunity to develop with Swift,
- Given that IBM Bluemix OpenWhisk was just announced at IBM InterConnect, all of the image processing and analysis will run as an IBM Bluemix OpenWhisk action, outside of the app logic, with no server to set up, and reusable by others.
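To make the third step concrete, here is a minimal sketch of the kind of post-processing such an action performs: turning a classify-style response from an image-analysis service into hashtag suggestions. The response shape (`images`/`classifiers`/`classes` with `class` and `score` fields), the threshold, and the `toHashtags` helper are illustrative assumptions, not the sample project's actual code:

```javascript
// Hedged sketch: convert a classify-style response into hashtag
// suggestions. Field names here are assumptions for illustration,
// not the exact Watson API contract.
function toHashtags(classifyResponse, minScore) {
  const tags = [];
  for (const image of classifyResponse.images || []) {
    for (const classifier of image.classifiers || []) {
      for (const c of classifier.classes || []) {
        if (c.score >= minScore) {
          // "Eiffel Tower" -> "#eiffeltower"
          tags.push('#' + c.class.toLowerCase().replace(/\s+/g, ''));
        }
      }
    }
  }
  return [...new Set(tags)]; // dedupe, keep first-seen order
}

// Example with a made-up response
const sample = {
  images: [{
    classifiers: [{
      classes: [
        { class: 'Eiffel Tower', score: 0.94 },
        { class: 'landmark', score: 0.88 },
        { class: 'night sky', score: 0.42 }
      ]
    }]
  }]
};
console.log(toHashtags(sample, 0.5)); // → [ '#eiffeltower', '#landmark' ]
```

Keeping this conversion in the action means the app only ever sees ready-to-share hashtags, regardless of how the underlying service formats its results.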
Sign up for Bluemix. It’s free!
Voilà, a sample iOS application to automatically tag images and detect faces by using IBM visual recognition technologies:
- Take a photo or select an existing picture in the camera roll,
- Let the application generate a list of tags and detect people, buildings, objects in the picture,
- Share the results with your network.
See how it was done!
The source code, documentation and instructions to run the application are available in the IBM-Bluemix/openwhisk-visionapp project on GitHub.
While working on this, a colleague asked about the choice of IBM Bluemix OpenWhisk to implement the processing: “Why not call the Watson services directly from the application?” Indeed, that would also work, at least at first, since today we are only considering an iOS app. But suppose this is a real business: you will want to target other platforms such as Android, Windows Phone, or even a more traditional web app. Do you want to rewrite the logic in several different languages? What if you want to tune the results a bit before displaying them? And if this image-tagging microservice becomes successful, you could even provide it as an API for others to consume and integrate (many photo library applications would benefit from a well-trained automatic tagging capability). In that case, you would not want to manage the scalability of the service yourself; instead, you would leave that to IBM Bluemix OpenWhisk to handle transparently.
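For illustration, here is one way that shared, tunable logic could look as a JavaScript OpenWhisk action. OpenWhisk invokes `main(params)` and treats the returned object as the action's JSON result; the idea of merging and ranking tags from the two services below is a hedged sketch under assumed parameter names, not the project's actual implementation:

```javascript
// Hedged sketch of an OpenWhisk action entry point. OpenWhisk calls
// main(params) and serializes the returned object as the action result.
// The merging logic and parameter names are illustrative assumptions.
function main(params) {
  // Imagine tags coming back from two analyses of the same image.
  const visualTags = params.visualTags || [];   // e.g. from Visual Recognition
  const alchemyTags = params.alchemyTags || []; // e.g. from AlchemyVision

  // Merge the two lists, keeping the best score per tag name, so every
  // client (iOS, Android, web) gets the same tuned result without
  // reimplementing this logic.
  const merged = {};
  for (const t of visualTags.concat(alchemyTags)) {
    if (!(t.name in merged) || t.score > merged[t.name]) {
      merged[t.name] = t.score;
    }
  }

  // Rank by confidence, highest first.
  const tags = Object.keys(merged)
    .sort((a, b) => merged[b] - merged[a])
    .map((name) => ({ name, score: merged[name] }));
  return { tags };
}
```

Because the tuning lives in one place, changing the ranking or thresholds later is a redeploy of the action, not an update to every client app.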
If you have feedback, suggestions, or questions about the app, please reach out to me on Twitter @L2FProd.
If you want to learn more about OpenWhisk, visit our OpenWhisk development center. If you want to see OpenWhisk running in IBM Bluemix, sign up for the experimental Bluemix OpenWhisk, whose motto is “Post your code. We host it. We scale it up. Pay only for what you use.” 🙂