Behind the curtain: How cognitive computing works

4 minute read | November 17, 2016

Few technologies offer the potential to impact our lives like cognitive technologies. Advances are continuously being made in artificial intelligence (AI) and machine learning, from enabling airlines to predict micro-climate changes to improving doctors' ability to recommend the most appropriate treatment options. According to an MIT Sloan study, 85% of enterprises view AI as a strategic opportunity for their business.

These advances in AI are not about a single technology; they reflect a holistic view of the tools, the data and the advanced infrastructure required to unlock insights from data and put them to use at the right moment in time. This "Holistic AI" is how market leaders will differentiate themselves in the cognitive era.

The magic in cognitive analytics

The goal of cognitive analytics is to leverage AI and machine learning to generate new insights and improve processes that allow you to thrive in today's dynamic markets.

Machine learning makes it possible for an application to perform analysis based on patterns instead of traditional hand-coded rules. Suppose you want your application to look at pictures of people and then determine the ages of those subjects. So many variations exist that writing rules capable of producing results with better than 50 percent accuracy would be quite challenging.

However, if you feed thousands of images into a machine learning-based application, it can find enough patterns to make accurate predictions, even though it can't tell you what those patterns are. As the application gains experience with more input, its accuracy improves. Machine learning has many practical uses. For example, while cognitive computing still can't predict tomorrow's stock prices (there is simply too much randomness in the market), banks are discovering that machine learning-based analytics can detect types of fraud that rules-based solutions miss.
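The contrast between learned patterns and hand-coded rules can be made concrete with a toy sketch. The example below trains nothing sophisticated, just a one-nearest-neighbor classifier over a handful of made-up transaction records, but it shows the key shift: the fraud/legitimate boundary comes from labeled examples rather than from a rule a developer wrote by hand. All data and thresholds here are invented for illustration.

```python
import math

# Toy training data: (amount, hour_of_day) -> fraud label (1 = fraud).
# Values are entirely made up, just to illustrate learning from examples.
TRAINING = [
    ((12.0, 14), 0),
    ((25.0, 10), 0),
    ((900.0, 3), 1),
    ((850.0, 2), 1),
    ((30.0, 20), 0),
    ((700.0, 4), 1),
]

def predict(amount, hour):
    """1-nearest-neighbor: label a transaction like its closest training example."""
    def dist(example):
        (a, h), _ = example
        # Scale the hour axis so time-of-day differences carry weight
        # comparable to dollar-amount differences in this toy data.
        return math.hypot(a - amount, (h - hour) * 10)
    _, label = min(TRAINING, key=dist)
    return label

print(predict(820.0, 3))   # resembles the fraudulent examples -> 1
print(predict(18.0, 12))   # resembles the legitimate examples -> 0
```

Adding more labeled examples refines the decision boundary automatically; with a hand-coded rule, every new fraud pattern would require another explicit condition.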

The data to drive insights

Finding these new insights comes down to data: massive amounts of it. Why? The greater the volume, types and sources of data, the more accurate and meaningful the results.

For example, oncologists require not only electronic medical record (EMR) data but also data from medical journals, image scans, partner hospitals and more. All this data combined with cognitive analytics enables the business (in this case the hospital) to sense, learn and adapt, resulting in better-served patients and more cost-effective data storage.

Several key requirements contribute to gaining a high rate of return from your data:

  • A data-centric architecture with storage capable of ingesting diverse data types from different sources: Cognitive analytics techniques such as machine learning depend on a wide variety of data ranging from images to text, voice and sensor readings.
  • Smart IT architectures designed to place analytics and data in close proximity: As the size of data grows, the cost of moving data around can become prohibitive. Instead, bring computation next to the data for new real-time analytics capabilities and improved cost efficiency and security.
  • An environment for connecting data from cloud services to systems of record running on premises: Organizations can bring data and cognitive analytics together in a scalable way using APIs in a hybrid cloud environment that mixes on and off-premises resources.
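The third requirement, joining cloud data with on-premises systems of record, often amounts to merging records from both sides into one analytics-ready view. The sketch below is a minimal illustration of that join using the oncology example above; the record layouts, field names and patient ID are all hypothetical, standing in for whatever an EMR database and a cloud service API would actually return.

```python
# Hypothetical records from an on-premises system of record (e.g., an EMR database).
on_prem_records = {
    "patient-42": {"name": "A. Example", "diagnosis": "melanoma"},
}

# Hypothetical payload fetched from a cloud service via an API.
# Field names are illustrative, not any real product's schema.
cloud_payloads = {
    "patient-42": {"journal_matches": 3, "similar_cases": 17},
}

def merge_views(patient_id):
    """Join on-prem and cloud data for one patient into a single record."""
    record = dict(on_prem_records.get(patient_id, {}))
    record.update(cloud_payloads.get(patient_id, {}))
    return record

print(merge_views("patient-42"))
```

In a real hybrid cloud deployment the dictionaries would be API calls and database queries, but the pattern, enriching a system-of-record entry with cloud-sourced context before analysis, is the same.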

Modern infrastructure for the cognitive era

Today, remaining competitive requires acting at the speed of thought. That speed hinges on acquiring insights quickly, within intelligent infrastructures optimized and designed for cognitive analytics workloads.

At its core, cognitive computing is based on sophisticated mathematical models. For example, Kinetica has practically reinvented the database with its graphics processing unit (GPU)-accelerated database for quick analysis of large data sets. Because customer record sets are massive, Kinetica relies on massive GPU parallelism ("giga-threading") per node rather than conventional multi-threading, so calculations that once took hours or days to compute can finish in a few seconds. This GPU-based acceleration has enabled Kinetica to build a fundamentally faster and more capable system, helping retail clients create better customer experiences.
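The core idea behind that speedup is data parallelism: partition a large column of data and aggregate the pieces concurrently instead of walking the rows one by one. The toy sketch below mimics that partition-and-aggregate pattern with a thread pool; it is only an analogy for how a GPU database spreads work across thousands of cores, not Kinetica's actual engine, and Python threads will not themselves make this sum faster.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    """Aggregate one partition of the data."""
    return sum(chunk)

def parallel_sum(data, workers=4):
    """Partition `data` into chunks and aggregate the chunks concurrently.

    Illustrates the partition-and-aggregate pattern behind data-parallel
    analytics; on a GPU the 'workers' would number in the thousands.
    """
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

values = list(range(1_000_000))
print(parallel_sum(values) == sum(values))  # partitioned result matches serial
```

The partial aggregates combine into the same answer as a serial scan, which is what lets the work be split across as many execution units as the hardware offers.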

Nima Neghban, Kinetica CTO, states: "Your data is your enterprise and understanding your data means understanding your business." Watch Kinetica and DJ Tim Exile create the sound of real-time retail.

To satisfy these demands, it comes down to a holistic view of AI, with the models, tools and infrastructure specifically designed for data- and compute-intensive workloads. A good example is the Watson Machine Learning Accelerator toolkit, which makes the benefits of AI more attainable for your business, paired with the IBM Power Systems AC922 server, designed to crush the most data-intensive workloads on earth. With GPU accelerators and blazing-fast I/O, these systems deliver the performance that underpins the world's fastest supercomputers. And on the storage side, an all-flash array such as IBM FlashSystem can dramatically reduce latency.

The term cognitive computing may seem a little vague at first but learning more about the technologies and IT infrastructure behind it can give you a more realistic view. And it is well worth checking into because cognitive computing is likely to be an important part of your business future. Start your cognitive journey now.
