
Discover dark data behind videos with OpenWhisk and Watson


Republished from the OpenWhisk blog


Video is expected to soon represent up to 90% of all consumer internet traffic. That is a huge amount of information, often referred to as “dark data” because it cannot simply be searched like a row in a database. In a previous post, I looked at image tagging and face detection with IBM Watson Visual Recognition and Alchemy API. What if we could apply the same technologies to videos to make sense of this dark data?

That’s what I did, helped again by IBM Bluemix OpenWhisk and Watson services. The sample application, called Dark Vision, processes videos by extracting frames and tagging each frame independently.

OpenWhisk Dark Vision

(Source code is available in project IBM-Bluemix/openwhisk-darkvisionapp on GitHub).
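To give a concrete idea of the frame extraction step, here is a minimal sketch that pulls one frame per second out of a video with ffmpeg. The file names, frame rate, and output layout are illustrative assumptions, not the exact settings Dark Vision uses; see the GitHub repository for the real implementation.

```python
# Minimal sketch: extract one frame per second from a video with ffmpeg.
# The paths and the fps value are illustrative assumptions.
import subprocess
from pathlib import Path

def extract_frames(video_path, output_dir, fps=1):
    """Write one JPEG per sampled frame into output_dir and return the file list."""
    Path(output_dir).mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [
            "ffmpeg", "-y",                    # overwrite existing files without prompting
            "-i", video_path,
            "-vf", f"fps={fps}",               # sample one frame per second
            f"{output_dir}/frame-%04d.jpg",    # frame-0001.jpg, frame-0002.jpg, ...
        ],
        check=True,
    )
    return sorted(Path(output_dir).glob("frame-*.jpg"))

if __name__ == "__main__":
    frames = extract_frames("video.mp4", "frames")
    print(f"Extracted {len(frames)} frames")
```

Each extracted frame can then be analyzed on its own, which is what makes the workload a natural fit for independent, event-driven invocations.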

Once all frames have been analyzed, a summary of the most frequent tags, buildings, and faces is built for the video. The resulting tags and keywords could, for instance, be used to build a recommendation engine that suggests related videos, to display advertisements directly linked to the content of the video, or to improve search results. And since we keep track of which tag appeared in which frame, we could also improve the viewer experience by jumping directly to the frame where a tag or a face was first seen.
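As a rough sketch of that aggregation step, the snippet below counts how often each tag appears across frames and records the first frame each tag was seen in. The input structure (one tag-to-score dictionary per frame) and the score threshold are assumptions for illustration, not Dark Vision's actual data model.

```python
# Sketch: aggregate per-frame tags into a video-level summary.
# `frame_results` is a hypothetical structure: one dict of tag -> score per frame.
from collections import Counter

def summarize(frame_results, min_score=0.5):
    """Count how often each tag appears and record the first frame it was seen in."""
    counts = Counter()
    first_seen = {}
    for index, tags in enumerate(frame_results):
        for tag, score in tags.items():
            if score < min_score:        # ignore low-confidence tags
                continue
            counts[tag] += 1
            first_seen.setdefault(tag, index)
    return {"most_frequent": counts.most_common(10), "first_seen": first_seen}

# Example: three analyzed frames
frames = [
    {"beach": 0.9, "person": 0.7},
    {"beach": 0.8},
    {"person": 0.6, "dog": 0.55},
]
print(summarize(frames))
```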

Check out this video to see the application in action:

Helped by cognitive technologies like Alchemy API and Watson Visual Recognition, we can automatically extract useful information from these videos without having to actually watch them. Combined with OpenWhisk, a cloud-first, distributed, event-based programming service, we built a system that processes these videos at scale without worrying about the infrastructure or the sizing of the system.
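Dark Vision's actions themselves live in the GitHub repository linked above; as a simplified illustration of the idea, here is a sketch of an OpenWhisk Python action that tags a single frame with Watson Visual Recognition. The endpoint URL, version date, parameter names, and credential handling are assumptions to keep the example short; refer to the service documentation for the exact API.

```python
# Sketch of an OpenWhisk Python action that tags one frame with
# Watson Visual Recognition. The endpoint, version date and parameter
# names below are assumptions for illustration only.
import requests

VR_URL = "https://gateway.watsonplatform.net/visual-recognition/api/v3/classify"

def main(params):
    """OpenWhisk entry point: expects an 'api_key' and a publicly reachable 'frame_url'."""
    response = requests.get(VR_URL, params={
        "api_key": params["api_key"],
        "url": params["frame_url"],
        "version": "2016-05-20",
    })
    response.raise_for_status()
    classes = response.json()["images"][0]["classifiers"][0]["classes"]
    # Return tag -> score pairs so a later action can aggregate them.
    return {"tags": {c["class"]: c["score"] for c in classes}}
```

Because each frame is handled by its own action invocation, OpenWhisk can fan the work out and scale it up or down automatically, with no dedicated servers to provision.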

If you have feedback, suggestions, or questions about the app, please reach out to me on Twitter @L2FProd. If you want to see OpenWhisk running in IBM Bluemix, sign up for the experimental Bluemix OpenWhisk.

