IBM InterConnect - IBM Research Day presentations on Day 0
This week, I am attending the [InterConnect Conference] in Las Vegas, Feb 21-25, 2016. This is IBM's premier Cloud & Mobile conference for the year.
Sunday, I attended a series of sessions from IBM Research covering their latest research areas.
7110A Future Directions in Enterprise Mobile Computing
Gabi Zodik (IBM) presented. Mobile and wearables are transforming all industries. Enabling technologies are required to support the new computing models that are cognitive in nature. Real-time proactive decisions can be made based on the mobile context of a user. Driven by the huge amounts of data produced by mobile devices, the next wave in computing will need to exploit data and computing at the edge of the network.
Future mobile apps will have to be cognitive to "understand" user intentions based on all the available interactions and unstructured data. A new distributed programming paradigm is emerging to meet these needs, which has to deal with massive amounts of data and devices. While the compute and storage capacity on individual devices is small, collectively they exceed all of the servers and storage in Cloud datacenters.
7107A Wearables in the Enterprise
Asaf Adi (IBM) presented. Wearable technology is booming. Only our imagination limits the number of industrial, military, consumer and healthcare applications for this emerging technology. Wearables are transforming industries and professions, enabling new business opportunities. From a show of hands, half the audience was wearing smart technology already.
In one example, he focused on the construction industry. In the USA alone, there are thousands of workplace injuries, costing $190 billion. Wearable technologies can be incorporated into a hardhat or bright orange vest. In a steel mill, heat stress can be determined from ambient temperature and an employee's heart rate. Over time, we will have multiple wearables, communicating with each other.
In another example, he made a hand gesture (waving his hand in front of his smartphone) and used it to generate a code fragment that software developers can use to detect that particular hand gesture in any application.
Wearables cannot assume they are always connected to the Cloud. Take for example mining, where miners are deep below the ground. Technology to ensure safety needs to work regardless of connectivity.
Privacy is also a big concern. Wearables should not be used by employers to monitor every movement and activity of the employees.
7152A Cognitive IoT -- Today, Tomorrow and Beyond
Alessandro Curioni (IBM) presented. Today's sensors aren't up to the task of unlocking the complex links between people, places and things. To reach the next level, we need technologies that enable them to gather and integrate data from many sources, to reason over that data, and to learn from it. IBM calls this the Cognitive Internet of Things (IoT).
We already know IoT data can be used to predict maintenance needs, but what if it could also help designers engineer more reliable products from scratch? In addition, with advancements in nanotechnology and machine learning, we can bring the power of cognitive to the edge—where the data is collected. Imagine tiny edge computers providing Watson services on every sensor.
It is estimated that we have 13 billion IoT sensors today, and that this will more than double to 29 billion by the year 2020. This introduces new security threats, new levels of employee engagement, and fundamental shifts in business models.
Sadly, 88 percent of all IoT data is dark, meaning it is never collected or processed for analysis. While the IT industry has done amazing things with the other 12 percent, programming techniques remain too limited.
That is why cognitive is needed to unleash the value of the data. IBM Watson offers excellent capabilities, including Natural Language Processing (NLP), Machine Learning (ML), Image/Video analytics, and Text Analytics.
Manufacturers like Whirlpool are investigating use of IoT for home appliances, like refrigerators, washers and dryers. This is just the beginning, other industries including Healthcare, Retail, Oil, Mining and Farming will also benefit.
7108A Blockchain and the Future of Finance
Ramesh Gopinath (IBM) presented. Transferring products and funds today is inefficient, expensive, and vulnerable. Blockchain is an emerging fabric for transaction services. It has the potential to radically transform multi-party business networks, enabling significant cost and risk reduction and innovative new business models.
About 18 months ago, the "Blockchain" concept was not ready for business. Since then, the Linux Foundation has accepted the "Hyperledger" project, with 17 founding companies.
Imagine a company in China or India exporting a product to a company in the USA. There may be 10 or more companies or agencies involved, including multiple banks, port authorities, trucking companies, etc. To hand off the goods and ensure all parties are paid, some 30 different paper documents may be needed. Each company maintains its own set of records, and all the middlemen take their cut.
Blockchain represents a digitally-signed, encrypted, immutable "ledger" that records all of the steps related to a particular transaction. Since each new block carries a cryptographic hash of the block before it, altering any earlier record invalidates everything after it, which prevents tampering and fraud. All parties have access to the full ledger, eliminating discrepancies between different repositories of records.
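The hash-chaining idea can be sketched in a few lines of Python. This is a toy illustration of the concept only, not IBM's Hyperledger implementation; the step names are made up for the example.

```python
import hashlib
import json

def make_block(step, prev_hash):
    """Create a block recording one step, chained to the previous block's hash."""
    body = {"step": step, "prev_hash": prev_hash}
    block = dict(body)
    block["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return block

def build_ledger(steps):
    ledger, prev = [], "0" * 64          # conventional all-zero genesis hash
    for step in steps:
        block = make_block(step, prev)
        ledger.append(block)
        prev = block["hash"]
    return ledger

def verify(ledger):
    """Recompute every hash; any tampered block breaks the chain."""
    prev = "0" * 64
    for block in ledger:
        body = {"step": block["step"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

ledger = build_ledger(["goods shipped", "customs cleared", "payment released"])
assert verify(ledger)

ledger[0]["step"] = "goods shipped (altered)"   # tamper with an early block
assert not verify(ledger)                       # every later block is now invalid
```

Because each block's hash covers the previous block's hash, an attacker would have to recompute every subsequent block on every party's node to hide a change, which is what makes the shared ledger tamper-evident.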
This can be used to sell stocks, buy real estate, or transfer financial funds to your family overseas. Each party involved in a Blockchain has a node in a peer-to-peer network of nodes that can access the shared Blockchain. A user initiates a transaction, and the nodes in the network reach consensus using a Practical Byzantine Fault Tolerance [PBFT] protocol.
By providing [disintermediation], removing middlemen from the process, Blockchain reduces costs, processing time, and risk. The method preserves the user's transactional privacy while ensuring accountability and auditability.
7234A Building Cloud Infrastructure for Next-Generation Workloads
Krishna Nathan (IBM) presented. Today's cloud providers are efficient at providing today's cloud services at low costs. However, this efficiency comes with the penalty of inflexible instance types and no real guarantees on performance or quality of service.
Today's systems are organized and optimized for transactional processing, a result of evolution of the past 60 years. Relational Databases offer specific features like Atomicity, Consistency, Isolation, and Durability, known collectively as [ACID].
However, we are expanding from "automating our world" to "understanding our world". This means tapping into the 90 percent of data that is unstructured, with multi-modal scanning and noise-tolerant processing that yields variable precision and probabilistic outcomes.
Cloud Providers have used the "best practices" of transactional datacenters. Consequently, next-generation workloads that often do not share the characteristics of traditional workloads are limited in expressing their full potential because of these infrastructure limitations. Now they need to focus on four characteristics: Locality, Composability, Heterogeneity, and Dynamic resource allocation.
New workloads need a combination of CPU, GPU, NVMe, and other resources. How do you schedule incoming workloads onto this equipment in a way that optimizes performance? By taking these factors into account, clever Cloud providers can optimize performance and provide the best fit for each workload request.
7135A Storing and Using Data in the Cloud -- Putting Together the Puzzle Pieces
Michael Factor (IBM) presented. What do OpenStack Swift, Spark, CouchDB, Kafka and ElasticSearch have in common? They are all open source, they all are available on IBM's cloud today, and they all focus on storage and using data. The trick, though, is putting these puzzle pieces together to solve real problems. You need smart integration between data services motivated by real examples from domains such as IoT, transport and retail.
There are a plethora of open services to manage data. A recent IDC Analyst study indicates that the world's data will grow from 8.6 zettabytes today to 40 zettabytes in 2020. Michael gave some eye-opening comparisons. If the data were stored on 10-TB hard disk drives, we could make some physical comparisons:
Imagine stacking all of those disk drives one on top of the other like a stack of books. The stack today would be 22,000 kilometers tall, more than halfway to geosynchronous orbiting satellites, but would be over 100,000 kilometers, well past those satellites, in 2020.
The weight of those drives today would be comparable to the weight of 1,450 Airbus A380 airplanes. In 2020, they would weigh as much as 6,755 Airbus A380 airplanes.
If the drives were spread across the entire Mandalay Bay convention center floor, they would be 1.7 meters deep today (about 5 feet), but would be 8 meters deep in 2020.
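The stack-height and floor-depth figures can be sanity-checked with back-of-the-envelope arithmetic. The drive dimensions (26.1 mm tall, 101.6 x 147 mm footprint, standard for a 3.5-inch drive) and a roughly 2.1 million square foot convention floor are my assumptions, not figures from the talk:

```python
ZB = 1e21          # zettabyte in bytes (decimal units)
TB = 1e12          # terabyte in bytes

DRIVE_HEIGHT_M = 0.0261                 # assumed 3.5" drive height
DRIVE_FOOTPRINT_M2 = 0.1016 * 0.147    # assumed 3.5" drive footprint
FLOOR_AREA_M2 = 2.1e6 * 0.0929         # assumed ~2.1 million sq ft floor

def comparisons(data_bytes, drive_bytes=10 * TB):
    """Return (drive count, stack height in km, floor depth in m)."""
    drives = data_bytes / drive_bytes
    stack_km = drives * DRIVE_HEIGHT_M / 1000
    depth_m = drives * DRIVE_FOOTPRINT_M2 * DRIVE_HEIGHT_M / FLOOR_AREA_M2
    return drives, stack_km, depth_m

for label, zb in [("today", 8.6), ("2020", 40.0)]:
    drives, stack_km, depth_m = comparisons(zb * ZB)
    print(f"{label}: {drives:.2e} drives, "
          f"stack {stack_km:,.0f} km, floor depth {depth_m:.1f} m")
```

With these assumptions, 8.6 ZB works out to roughly 860 million drives, a stack of about 22,400 km, and a floor depth of about 1.7 m, closely matching the figures Michael quoted.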
He gave an example of the EMT Madrid bus company using real-time sensors to react to traffic conditions.
Here are the various pieces:
OpenStack Swift -- provides object storage
ElasticSearch, based on Apache Lucene -- search engine, such as for metadata or queries
Apache Spark -- combines SQL, streams and complex analytics, with filter pushdown support
Apache Parquet -- a column-based data format to replace the row-based Comma-Separated Values (CSV) format
Apache Kafka -- a message bus, works with dashDB and Secor
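The benefit of a columnar format with filter pushdown can be shown with a toy sketch in plain Python (this illustrates the idea only; it is not Parquet or Spark, and the bus data is invented for the example):

```python
# Row-based (CSV-style) records: a query must scan every whole row.
rows = [
    {"city": "Madrid",  "bus": 12, "delay_min": 3},
    {"city": "Madrid",  "bus": 7,  "delay_min": 11},
    {"city": "Seville", "bus": 3,  "delay_min": 0},
]
late_rows = [r for r in rows if r["delay_min"] > 5]

# Column-based (Parquet-style) layout: one list per column.
columns = {key: [r[key] for r in rows] for key in rows[0]}

def scan_with_pushdown(columns, column, predicate):
    """Apply the predicate at the storage layer, reading only one column,
    and return the indices of matching rows."""
    return [i for i, v in enumerate(columns[column]) if predicate(v)]

late_idx = scan_with_pushdown(columns, "delay_min", lambda v: v > 5)
late_buses = [columns["bus"][i] for i in late_idx]
print(late_buses)   # buses running more than 5 minutes late
```

Both layouts return the same answer, but the columnar scan touches only the `delay_min` values, which is why engines like Spark push filters down into Parquet instead of loading every row first.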
Beyond programming "glue", we need smart integration to get an order of magnitude boost in performance.