The Internet of Things - Rambling Thoughts by Jim Fletcher
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately, this is exactly what it says -- they walk the aisle with the back of their hand extended, and when they feel a warmer-than-normal area, that is an area to be looked at further. Well, yes, that's not exactly scientific, but it has worked for years. Likewise, power consumption was predetermined from manufacturers' specs, which generally meant it was grossly overestimated.
But as we look to better optimize our overall energy consumption, even a degree or two of difference can make a big difference in our overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within its server family for both power consumption and temperature reporting. With the direct reporting of this information, immediate and accurate data can be available and leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Typically, customers have over-cooled and over-provisioned power. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information from its embedded Active Energy Manager component. The data can be thresholded, with events generated automatically when measured values exceed expected values. Reports can be generated, or the information can be visualized in an operations console.
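To make the thresholding idea concrete, here is a minimal sketch of event generation when measured values exceed expected values. This is purely illustrative -- the `Reading` class, the threshold values, and `check_thresholds` are my own placeholders, not the ITM or Active Energy Manager APIs.

```python
# Illustrative sketch of threshold-based event generation on measured
# server data. None of these names come from ITM; they only show the idea.
from dataclasses import dataclass


@dataclass
class Reading:
    rack_id: str
    temperature_c: float   # measured inlet temperature
    power_watts: float     # measured power draw


# Expected operating limits (example values, not IBM recommendations);
# an event fires whenever a measurement exceeds its limit.
THRESHOLDS = {"temperature_c": 27.0, "power_watts": 450.0}


def check_thresholds(reading: Reading) -> list[str]:
    """Return one event string per measured value above its threshold."""
    events = []
    for metric, limit in THRESHOLDS.items():
        value = getattr(reading, metric)
        if value > limit:
            events.append(
                f"{reading.rack_id}: {metric}={value} exceeds {limit}"
            )
    return events
```

In a real deployment these events would feed the operations console or reporting layer rather than a list of strings, but the shape of the logic is the same: compare each measured value against its expected value and raise an event on the delta.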
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not serve the needs of improved energy management well. Unfortunately, in most datacenters, the team responsible for cooling and power and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of it.
As a result, there is no natural incentive to reduce overall power consumption, unless some external factor like availability of power comes into play. This "green organizational dysfunctionality" results in wasted spending on energy and operational inefficiencies, given that there is also limited integration between the multiple organizations responsible for the datacenter. Even when knowledge of power consumption does exist within the IT organization, I have yet to see a datacenter where the IT team is measured in any way on power consumption -- instead, availability and performance are the two measurements that matter.
So how can we expect energy-efficient datacenters if organizationally there is little focus, and no incentives are provided to reduce spending on energy? That's the challenge that organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and I am seeing an occasional "incentive" from a C-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions -- sometimes approaching 40% -- have been seen.
Though we really don't think consciously about it, weather impacts every aspect of our lives. It impacts our mood, what we wear, where and when we shop, what we eat, when we sleep -- the list is endless ...
So why don't we use weather more in our analytics and insight services to determine the real relationships and how to impact sales, ads, and more? Through the just-announced availability of a range of weather APIs in IBM's Bluemix, anyone can now develop apps and services that integrate weather. This is just the start of a continued stream of services and solutions that integrate weather data -- predictive, current, and historical.
Here's more info on the new announcement: https://console.ng.bluemix.net/catalog/services/weather-company-data-for-ibm-bluemix/
Yesterday, IBM and Cisco delivered a significant announcement around the Internet of Things. The announcement talked about Analytics at the Edge, but perhaps more important, it set the direction for "Analytics Everywhere". When I look at the Internet of Things and the connectivity and information processing models it will entail, it is a combination of nearly every model we have seen throughout the history of IT. There will be use cases that require "a mothership" as the control point and master information source. There are scenarios where much of the processing can be handled "at the edge", with decisions and responses driven in isolation at the edge. There are scenarios that will require peer communications between "things" that are co-located or perhaps located across the globe from each other. There will be "cloud to cloud" information interchange requirements ... and security challenges at levels we have never imagined.
The "Internet of Things" is about business transformation, and the flexibility of being able to effectively and efficiently process information at whatever place is appropriate will create new opportunities. Yes, we have seen some of these scenarios "in prior lives", but we have never seen the ability to develop and deliver the capabilities at the level of cost effectiveness we can today, nor have we ever considered the level of instrumentation that we can now do. Sensors are inexpensive, and they enable a breadth of information that would have been unimaginable only a few years ago. Connectivity is nearly ubiquitous and inexpensive. Data storage costs are at levels where you can afford to keep data you aren't sure you will ever need, but might. And analytics capabilities are becoming simple to deliver and widely available -- with Watson Analytics being a great example of that evolution. #watsonIOT #IOT
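The "process at the edge, report to the mothership" pattern above can be sketched in a few lines: an edge node keeps raw sensor readings local and ships only compact summaries upstream. Everything here is illustrative -- the class name, the window size, and the `send_to_cloud` callback are placeholders for whatever transport (MQTT, HTTPS, etc.) a real deployment would use.

```python
# Illustrative edge-side aggregation: raw readings stay local, and only
# a small summary per window is forwarded to the central platform.
from statistics import mean


class EdgeAggregator:
    def __init__(self, window_size, send_to_cloud):
        self.window_size = window_size      # raw readings per summary
        self.send_to_cloud = send_to_cloud  # placeholder upstream transport
        self.buffer = []

    def ingest(self, reading: float) -> None:
        """Buffer one raw reading; flush a summary when the window fills."""
        self.buffer.append(reading)
        if len(self.buffer) >= self.window_size:
            summary = {
                "count": len(self.buffer),
                "min": min(self.buffer),
                "max": max(self.buffer),
                "mean": mean(self.buffer),
            }
            self.send_to_cloud(summary)
            self.buffer.clear()
```

The economics follow directly: with sensors and storage this cheap, the edge can afford to see every reading, while connectivity costs stay bounded because only the aggregate crosses the network.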
I suspect many of you are scratching your heads today, saying "IBM bought the Weather Channel???". Well, first off, we bought "The Weather Company", which includes an amazing cloud-based data ingestion and analytics platform; a very successful B2B business, including a solution that makes all of my airline flights smoother by providing plane-to-plane awareness of bumpy air; a proven weather prediction capability with some of the highest accuracy in the world; and a B2C business that includes one of the most used smartphone apps in existence today.
And when we think of weather, it's much more than knowing it will be sunny tomorrow. It's detailed micro-forecasts that let us know the specific weather for a specific locale at any time of the day. It's the ability to be pre-warned of impending weather events like extreme winds, lightning, or hail storms. It's heat indexes, wind chill, dew points, pollen indexes, UV indexes, and so on. All of these elements of weather, when coupled with the awareness that the Internet of Things will bring as everything becomes connected, create so many new opportunities.
Now let's think about all of this data in a world that has the Cognitive Insight capability that IBM Watson provides and will grow to provide ... the possibilities are truly endless.