Interest in containers has surged in the DevOps community since IBM co-founded the Cloud Native Computing Foundation in 2015. Innovation in open cloud technology creates challenges with interoperability and integration. The Foundation is run by developers, for developers, to minimize those challenges by promoting cloud native applications and services.
On Sunday, May 28th, Roland Garros (the French Open) opened its gates for the latest edition of the tournament. As one of the four Grand Slam events in tennis, there is a keen focus on ensuring the event runs smoothly and provides fans with the experience and insights they expect from such a prestigious event. This year IBM has provided a solution using Watson Analytics to help the content team understand and review key web metrics, guiding decision making throughout the event.
In the previous BlueChatter post, we looked at deploying and scaling a Node.js chat app using the Cloud Foundry and Docker container approaches. In this post, we will look at how to deploy, scale and manage an application on Kubernetes. We will reuse the same BlueChatter Node.js application for this example.
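As a preview of the deploy-and-scale flow, here is a minimal, hedged sketch of a Kubernetes manifest. The image name (`ibmcom/bluechatter`), container port, and replica count are illustrative assumptions, not the post's actual configuration.

```yaml
# Illustrative Deployment + Service for a Node.js chat app.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: bluechatter
spec:
  replicas: 2            # scale by editing this or with `kubectl scale`
  selector:
    matchLabels:
      app: bluechatter
  template:
    metadata:
      labels:
        app: bluechatter
    spec:
      containers:
      - name: bluechatter
        image: ibmcom/bluechatter:latest   # hypothetical image name
        ports:
        - containerPort: 3000              # assumed app port
---
apiVersion: v1
kind: Service
metadata:
  name: bluechatter
spec:
  type: NodePort
  selector:
    app: bluechatter
  ports:
  - port: 3000
    targetPort: 3000
```

Applying a manifest like this with `kubectl apply -f bluechatter.yaml`, then scaling with `kubectl scale deployment bluechatter --replicas=4`, mirrors the deploy, scale, and manage steps the post walks through.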
Nexmo, The Vonage API Platform, provides tools for voice, messaging and phone verification, enabling you to embed programmable communications into mobile apps, websites and business systems. Nexmo APIs include SMS text messaging, two-factor authentication, voice, chat, and social media connect.
Serverless computing and Watson service chaining via OpenWhisk, Part 3 of 3: Expose an action or sequence
By now, you should be aware of what OpenWhisk is and how to leverage an OpenWhisk sequence to chain Watson services. You should also have created Swift and NodeJS actions for transforming the JSON into the required formats. In this post, you will learn how to expose an action or a sequence (a chain of actions) as a RESTful endpoint via the OpenWhisk API Gateway and the OpenWhisk CLI.
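For context, a NodeJS transformation action of the kind the series describes can be as small as the sketch below. This is an illustrative assumption, not the post's actual action: the field names (`document_tone`, `tones`, `tone_id`) mirror a Watson Tone Analyzer-style response, and the output shape is invented for the example.

```javascript
// Hypothetical OpenWhisk action: reshape a Watson-style JSON payload into
// the flat format a downstream action in the sequence might expect.
function main(params) {
  // Guard against a missing or empty payload.
  const tones = (params.document_tone && params.document_tone.tones) || [];
  // Keep only the tone id and score, sorted by descending score.
  const summary = tones
    .map(t => ({ tone: t.tone_id, score: t.score }))
    .sort((a, b) => b.score - a.score);
  return { tones: summary };
}

// OpenWhisk invokes `main`; exporting it also lets us test the logic locally.
module.exports = { main };
```

Once such an action (or a sequence containing it) is deployed, a CLI command along the lines of `wsk api create /v1 /transform get transformJSON` maps it to a RESTful route through the API Gateway; the base path, route, and action name here are placeholders.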
This year, with the opportunity of a new working location within the new press building, our team undertook an exercise to re-imagine the client, member, and guest experience for the IBM at The Masters program. The focus for us was to bring IBM's cognitive ambition and capabilities to the forefront, to demonstrate that Watson can be applied to every activity and touchpoint of an experience, and improve it. The idea formed from this approach became known as our "Cognitive Room".
The IBM Watson Visual Recognition service can tag images for content, recognize faces, and find similar images, but that's not all it can do. If the condition you want to identify occupies only a small region of a larger image, the image as a whole might not be classified with high enough confidence, and a positive result could be missed. This post shows you how to improve Watson Visual Recognition's ability to detect finer details.
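One common way to surface small details is to classify overlapping crops (tiles) of the image rather than the full frame, so a small feature can dominate at least one crop. The helper below is a minimal sketch of the cropping arithmetic only; the function name, tile size, and stride are illustrative, and the actual cropping and Visual Recognition calls are out of scope.

```javascript
// Compute overlapping tile rectangles for an image of the given size.
// Each rectangle can then be cropped out and classified separately.
// A stride smaller than tileSize produces overlapping tiles; edge
// remainders that don't fit a full tile are skipped for brevity.
function tileRects(width, height, tileSize, stride) {
  const rects = [];
  for (let y = 0; y + tileSize <= height; y += stride) {
    for (let x = 0; x + tileSize <= width; x += stride) {
      rects.push({ x, y, w: tileSize, h: tileSize });
    }
  }
  return rects;
}

module.exports = { tileRects };
```

For example, a 4×4 image with 2×2 tiles and a stride of 1 yields nine overlapping crops; taking the maximum classifier confidence across crops is one simple way to decide whether the condition is present anywhere in the image.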
We just introduced the Bluemix Developer Console. Extending the current Bluemix Mobile Dashboard, this new experience goes beyond mobile and introduces new tools for quickly creating cloud native applications across web, mobile and backend. These tools aim to greatly cut down on development time by generating application starters with all the necessary boilerplate, build and configuration code, so that developers can start coding business logic faster.
For many developers, the Hello World starter applications on Bluemix are too basic, while the sample applications on the IBM-Bluemix.github.io page are a bit too advanced. If you agree, you'll find our recently released Runtime Getting Started guides extremely helpful.