The Internet of Things - Rambling Thoughts by Jim Fletcher
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately, this is exactly what it sounds like: they walk the aisle with the back of their hand extended, and when they feel a warmer-than-normal area, that is an area to be looked at further. Well, yes, that's not exactly scientific, but it has worked for years. Likewise, power consumption was pre-determined from manufacturers' specs, which generally means it was grossly overestimated.
But as we look to better optimize our overall energy consumption, even a degree or two can make a big difference in overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within its server family for both power consumption and temperature. With direct reporting, immediate and accurate information is available to be leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Typically, customers have over-cooled and over-provisioned power. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information through its embedded Active Energy Manager component. The data can be thresholded, with events generated automatically when measured values exceed expected values. Reports can be generated, or the information can be visualized in an operations console.
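To make the thresholding idea concrete, here's a minimal sketch of the pattern in Python. To be clear, this is not the actual ITM or Active Energy Manager API -- the reading fields, threshold values, and event format below are all illustrative assumptions:

```python
# Minimal sketch of threshold-based event generation on server telemetry.
# The data fields, thresholds, and event format are illustrative assumptions,
# not the actual ITM / Active Energy Manager interfaces.

from dataclasses import dataclass

@dataclass
class Reading:
    rack_id: str
    inlet_temp_c: float   # measured inlet temperature, degrees C
    power_watts: float    # measured power draw, watts

# Hypothetical expected values; a real deployment would tune these per rack.
THRESHOLDS = {"inlet_temp_c": 27.0, "power_watts": 4500.0}

def check_reading(reading: Reading) -> list[str]:
    """Return an event string for each measured value above its threshold."""
    events = []
    if reading.inlet_temp_c > THRESHOLDS["inlet_temp_c"]:
        events.append(f"TEMP_HIGH rack={reading.rack_id} value={reading.inlet_temp_c:.1f}C")
    if reading.power_watts > THRESHOLDS["power_watts"]:
        events.append(f"POWER_HIGH rack={reading.rack_id} value={reading.power_watts:.0f}W")
    return events

# Example: a rack running warm generates a temperature event.
for event in check_reading(Reading("rack-07", 28.4, 4100.0)):
    print(event)
```

The same loop, fed by real measurements instead of a hard-coded reading, is what lets you shave the energy buffer safely: small deltas surface as events long before the back of a hand would notice them.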
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not serve the needs of improved energy management well. Unfortunately, in most datacenters, the team responsible for cooling and power and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of it.
As a result, there is no natural incentive to reduce overall power consumption unless some external factor, like availability of power, comes into play. This "green organizational dysfunctionality" results in wasted spending on energy and in operational inefficiencies, given that there is also limited integration between the multiple organizations responsible for the datacenter. Even when knowledge of power consumption does exist within the IT organization, I have yet to see a datacenter where the IT team is measured in any way on power consumption - instead, availability and performance are the two measurements that matter.
So how can we expect energy-efficient datacenters if organizationally there is little focus, and no incentives are provided to reduce spending on energy? That's the challenge that organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and I am seeing an occasional "incentive" from a C-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions -- sometimes approaching 40% -- have been seen.
I suspect many of you are scratching your heads today, saying "IBM bought the Weather Channel???". Well, first off, we bought "The Weather Company", which includes an amazing cloud-based data ingestion and analytics platform; a very successful B2B business, including a solution that makes all of my airline flights smoother by providing plane-to-plane awareness of bumpy air; a proven weather prediction capability with some of the highest accuracy in the world; and a B2C business that includes a smartphone app that is among the most used apps in existence today.
And when we think of weather, it's much more than knowing it will be sunny tomorrow. It's detailed micro-forecasts that allow us to know the specific weather for a specific locale at any time of the day. It's the ability to be pre-warned of impending weather events like extreme winds, lightning, or hail storms. It's heat indexes, wind chill, dew points, pollen indexes, UV indexes, and so on. All of these elements of weather, when coupled with the awareness that the Internet of Things will bring as everything becomes connected, create so many new opportunities.
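As a concrete example of one of these derived elements, wind chill is computed directly from two raw measurements, temperature and wind speed, using the standard US National Weather Service formula (valid for temperatures at or below 50F and winds above about 3 mph):

```python
# Wind chill per the US National Weather Service formula:
# WC = 35.74 + 0.6215*T - 35.75*V^0.16 + 0.4275*T*V^0.16
# where T is air temperature in degrees F and V is wind speed in mph.

def wind_chill_f(temp_f: float, wind_mph: float) -> float:
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

# A 20F day with a 15 mph wind feels like roughly 6F.
print(f"{wind_chill_f(20.0, 15.0):.1f} F")
```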
Now let's think about all of this data in a world that has the Cognitive Insight capability that IBM Watson provides and will grow to provide ... the possibilities are truly endless.
As is always the case with CES, exciting announcements are made. Today a new standard for Wi-Fi, called HaLow, was announced; it should be available in 2018. Here's an article on it: Wi-Fi HaLow
The new standard focuses on improved range (up to 2x) and lower battery consumption. I have seen some articles focused on larger bandwidth hogs like file transfers, but as we look at IoT solutions, we're going to see a need for efficient connectivity and the exchange of small bursts of information. We could see apps that might have looked at BLE but can now leverage Wi-Fi HaLow instead.
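To illustrate what "small bursts" looks like in practice, here's a minimal sketch: a handful of sensor readings packed into one compact datagram so the radio can sleep between transmissions. The payload layout and gateway address are assumptions for illustration -- nothing here is specific to the HaLow (802.11ah) standard itself:

```python
# Minimal sketch of the "small bursts" pattern: pack a few sensor readings
# into a compact binary payload and send it as one datagram, letting the
# radio sleep between bursts. Payload layout and endpoint are assumptions.

import socket
import struct
import time

GATEWAY = ("192.0.2.10", 9000)  # hypothetical gateway address

def make_burst(device_id: int, temp_c: float, battery_pct: int) -> bytes:
    # 11 bytes total: device id (4), timestamp (4),
    # temperature in centi-degrees (2), battery percent (1)
    return struct.pack("!IIhB", device_id, int(time.time()),
                       int(temp_c * 100), battery_pct)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(make_burst(42, 21.7, 93), GATEWAY)  # one tiny burst, then sleep
```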
Many have also positioned the technology more around the Smarter Home, but it would apply equally to a range of offerings, as IoT will drive widespread connectivity needs.
Lots of possibilities -- another great space to watch.
IBM last week reaffirmed its commitment to Watson and the value that Cognitive Computing will bring to our future. With IoT we're moving to an era of massive data, driven by orders-of-magnitude reductions in the cost of collecting and maintaining that data. But what do we have when we have lots of data? Well, the answer is simple -- "lots of data". With billions of devices potentially connected and interconnected, traditional approaches to leveraging operational data will become unmanageable. Why collect all of this data if we're not going to do something of value with it?
So what's the answer? Well, Cognitive Computing will become a key aspect of the value of IoT. Cognitive Computing will allow us to draw insights that would not have been possible in the past. We'll find the root cause of failures faster, allowing us to further prevent failures through operational insights provided by Cognitive techniques. We'll be able to turn "lots of data" into truly actionable insights and provide them to technicians in a prescriptive form.
Think about how Watson has transformed the medical field. Medical personnel are now presented with a list of diagnoses, each with a probability, based on analysis of the symptoms. What prevents us from applying these same techniques to inanimate "things"? Why not leverage the wealth of data being collected from "things" to feed a Cognitive system that produces a list of potential failures, each with a confidence factor? And more importantly, what if a Cognitive system provided insights into maintenance - where maintenance operations become individualized based on operational environments, potentially saving millions in unnecessary maintenance while also eliminating failures?
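As a toy sketch of that "list of potential failures with a confidence factor" idea -- and to be clear, this is not Watson, just the general pattern -- a classifier trained on operational features can rank failure modes by probability. The features, labels, and data below are made up for illustration:

```python
# Toy sketch of ranked failure diagnoses with confidence factors.
# Features, labels, and data are fabricated for illustration only;
# this shows the general pattern, not any specific Watson capability.

from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: [vibration, temperature_c, hours_since_service]
X = [[0.2, 60, 100], [0.8, 95, 900], [0.9, 70, 1200],
     [0.1, 55, 50],  [0.7, 92, 850], [0.85, 68, 1100]]
y = ["healthy", "bearing_wear", "lubrication",
     "healthy", "bearing_wear", "lubrication"]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Score a new reading and present failure modes ranked by confidence.
probs = model.predict_proba([[0.75, 90, 870]])[0]
for label, p in sorted(zip(model.classes_, probs), key=lambda t: -t[1]):
    print(f"{label}: {p:.0%}")
```

The prescriptive piece is simply the next step: map the top-ranked failure mode to a recommended maintenance action, individualized to each asset's operating history.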
We're truly at an inflection point in the possibilities and opportunities to revolutionize our future.