Build a face recognition app for iPhone


Imagine you are attending the Cannes film festival or visiting a capital city and taking pictures. Wouldn’t it be great if, when you were about to share these pictures with your friends and followers, the app automatically proposed hashtags by interpreting the picture and identifying buildings, landmarks and famous people?

While we wait for this capability to come in popular image sharing apps, let’s build something like this with IBM Bluemix:

  • To analyze the images, the IBM Bluemix catalog provides us with the Watson Visual Recognition and AlchemyAPI (more specifically AlchemyVision) services from IBM Watson. You provide the API with an image (URL or raw data) and in return you get a list of tags or keywords, each with a confidence score (see the call sketch just after this list). Watson Visual Recognition can even be trained for fine-grained classification,
  • For the app, we will pick iOS as our first target; this will be an opportunity to develop with Swift,
  • Given IBM Bluemix OpenWhisk was just announced at IBM InterConnect, all of the image processing and analysis will run as an IBM Bluemix OpenWhisk action, outside of the app logic, with no server to set up, and reusable by others.
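To make the call pattern in the first bullet concrete, here is a minimal Swift sketch, assuming Basic-auth service credentials and a classify endpoint that accepts an image URL; the exact host, path, HTTP method and version date are assumptions, so check your Bluemix service credentials and the Visual Recognition documentation for the real values.

    import Foundation

    // Minimal sketch: send an image URL to a Watson image classification
    // endpoint and print the returned tags with confidence scores.
    // Endpoint path, version date, and credentials are assumptions.
    let username = "service-username"   // from your Bluemix service credentials
    let password = "service-password"
    let imageURL = "https://example.com/photo.jpg"

    var request = URLRequest(url: URL(string:
        "https://gateway.watsonplatform.net/visual-recognition/api/v2/classify"
        + "?url=\(imageURL)&version=2015-12-02")!)
    let token = Data("\(username):\(password)".utf8).base64EncodedString()
    request.setValue("Basic \(token)", forHTTPHeaderField: "Authorization")

    URLSession.shared.dataTask(with: request) { data, _, error in
        guard let data = data, error == nil else { return }
        // The service replies with JSON: labels and confidence scores.
        if let json = try? JSONSerialization.jsonObject(with: data, options: []) {
            print(json)
        }
    }.resume()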


Voilà, a sample iOS application to automatically tag images and detect faces by using IBM visual recognition technologies:

  • Take a photo or select an existing picture in the camera roll,
  • Let the application generate a list of tags and detect people, buildings, objects in the picture,
  • Share the results with your network (sketched in code just after this list).
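For the last step, sharing on iOS comes almost for free with UIActivityViewController. A minimal sketch, assuming the generated tags were concatenated into a string; the function and parameter names here are illustrative, not the app's actual code:

    import UIKit

    // Sketch of the "share" step: hand the generated tags and the
    // analyzed image to the standard iOS share sheet.
    func share(tags: String, image: UIImage, from viewController: UIViewController) {
        let activityController = UIActivityViewController(
            activityItems: [tags, image],  // e.g. "#paris #landmark #crowd"
            applicationActivities: nil)
        viewController.present(activityController, animated: true)
    }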

See how it was done!

The application is built with Cloudant, Watson Visual Recognition and AlchemyAPI from the Bluemix catalog to store and process images. The entire backend is implemented as an IBM Bluemix OpenWhisk JavaScript action.
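The project's actual action is JavaScript, but OpenWhisk also runs Swift actions, so here is a hedged, Swift-shaped sketch of the same processing flow. The helper functions are stubs standing in for the real Cloudant and Watson calls; the parameter names mirror those the app passes to the action.

    // Hedged sketch of the image-processing flow as an OpenWhisk action.
    // The real action is JavaScript; OpenWhisk also accepts Swift actions
    // with this `main` signature. The helpers below are stubs, not the
    // project's actual code.
    func fetchImage(cloudantUrl: String, db: String, id: String) -> Data? {
        // Would GET the image attachment from the Cloudant document.
        return nil
    }

    func classify(_ image: Data?) -> [String] {
        // Would call Watson Visual Recognition / AlchemyVision and
        // return the tags whose confidence passes a threshold.
        return []
    }

    func main(args: [String: Any]) -> [String: Any] {
        guard let documentId = args["imageDocumentId"] as? String,
              let cloudantUrl = args["cloudantUrl"] as? String,
              let dbName = args["cloudantDbName"] as? String else {
            return ["error": "missing parameters"]
        }
        let image = fetchImage(cloudantUrl: cloudantUrl, db: dbName, id: documentId)
        let tags = classify(image)
        // The app reads the analysis from the activation's response.result.
        return ["result": ["tags": tags]]
    }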

The source code, documentation and instructions to run the application are available in the IBM-Bluemix/openwhisk-visionapp project on GitHub.

While working on this, a colleague questioned the choice of IBM Bluemix OpenWhisk for the processing: “Why not call the Watson services directly from the application?” Indeed, that would also work, at least at first, while we only target an iOS app. But consider this as a real business: you will want to target other systems like Android, Windows Phone or even a more traditional web app. Do you want to rewrite the logic in several different languages? What if you want to tune the results a bit before displaying them? And if this image-tagging microservice becomes successful, you could even provide it as an API for others to consume and integrate (many photo library products would benefit from a well-trained automatic tagging capability). In that case, you would not want to manage the scalability of the service yourself; instead, you would leave that to IBM Bluemix OpenWhisk to handle transparently. And since the action is exposed over plain HTTPS, any client can reuse it, as sketched below.
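To illustrate the reuse argument: any client that can issue an HTTPS request can invoke the same action through the OpenWhisk REST API. A minimal sketch, with placeholder host, namespace, action name and credentials:

    import Foundation

    // Sketch: invoking the same OpenWhisk action from any HTTPS-capable client.
    // Host, namespace, action name, and the key:secret pair are placeholders.
    // blocking=true asks OpenWhisk to wait for and return the activation result.
    let auth = Data("whisk-key:whisk-secret".utf8).base64EncodedString()
    var request = URLRequest(url: URL(string:
        "https://openwhisk.ng.bluemix.net/api/v1/namespaces/_/actions/analysis?blocking=true")!)
    request.httpMethod = "POST"
    request.setValue("Basic \(auth)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(
        withJSONObject: ["imageDocumentId": "some-document-id"])

    URLSession.shared.dataTask(with: request) { data, _, error in
        guard let data = data, error == nil else { return }
        print(String(data: data, encoding: .utf8) ?? "")
    }.resume()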

If you have feedback, suggestions, or questions about the app, please reach out to me on Twitter @L2FProd.

If you want to learn more about OpenWhisk, visit our OpenWhisk development center. If you want to see OpenWhisk running in IBM Bluemix, sign up for the experimental Bluemix OpenWhisk, whose motto is “Post your code. We host it. We scale it up. Pay only for what you use.” 🙂

Comments


This is exciting and I would like to know if the system can be trained to recognize not-so-famous personalities. Essentially, can I load my client’s images for recognition (private data, of course)?


Yves Le Cléach

Following InterConnect 2016, I was trying OpenWhisk and Swift on Bluemix. I was looking for a good demo that implements both, and you did it in a very nice way. I finished your tutorial in less than 30 minutes, and the result is AWESOME!
Now I have to dig into the code… Thank you very much Frédéric! I can’t wait for your next one!
Note: check out the SOmusic article.


David Hicks

Very nice tutorial. I tried the app but I get “TypeError: Cannot read property ‘use’ of undefined” for the OpenWhisk activation. stderr: at mainImpl (eval at NodeActionRunner (/nodejsAction/runner.js:32:21)


    David Hicks

    I had to make two changes (ServerlessAPI.js) to get this to work. First, pass your Cloudant URL and database name to whisk. Second, fix the object reference to the returned JSON result:

    try whisk.invokeAction(name: ActionName, package: nil, namespace: ActionNamespace,
        parameters: (["imageDocumentId": documentId,
                      "cloudantUrl": CloudantUrl,
                      "cloudantDbName": CloudantDbName] as AnyObject),
        hasResult: true) { (reply, error) -> Void in
            // use the closure's reply; the original snippet referenced an
            // undefined `result` variable
            onSuccess(Result(impl: reply["response"]["result"]))
    }



This is a nice tutorial.

I did the same kind of project for an Android app. Thanks to the IBM tutorials it took 3 days, even though I still have a lot of debugging to do.

Let me know if you want the code, I can share.
