The Internet of Things - Smarter Infrastructure - Rambling Thoughts by Jim Fletcher
Last week I was on vacation in Orlando. The place where I stayed was highlighting all of the "greening" it had done: CFL bulbs everywhere, and now sponsoring recycling (even though we had to drive half a mile to recycle 3 aluminum cans). Wow -- I should have felt so good about this place ...
Well -- SHOULD is the right word. As nighttime came, I looked at my entrance door, and on all sides I could see light from the outside -- not just a little light, either. So as my air conditioner ran and ran, cooling the Florida landscape, I had to ask, "is this place really serious about green?" While it's great to talk about green, it's time to get serious about BEING green instead ...
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not serve the needs of improved energy management well. Unfortunately, in most datacenters, the team responsible for cooling and power and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of the power bill.
As a result, there is no natural incentive to reduce overall power consumption unless some external factor, like availability of power, comes into play. This "green organizational dysfunction" results in wasted spending on energy, and in operational inefficiencies, given that there is also limited integration between the multiple organizations responsible for the datacenter. Even when knowledge of power consumption does exist within the IT organization, I have yet to see a datacenter where the IT team is measured in any way on power consumption -- instead, availability and performance are the two measurements that matter.
So how can we expect energy efficient datacenters if organizationally there is little focus, and no incentives are provided to reduce spending on energy? That's the challenge that organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and I am seeing an occasional "incentive" from a c-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions have been seen -- sometimes approaching 40%.
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately this is exactly what it sounds like -- they walk the aisle with the back of a hand extended, and when they feel a warmer-than-normal area, that is an area to look at further. Well, no, that's not exactly scientific, but it has worked for years. Likewise, power consumption was pre-determined from manufacturers' specs, which generally means it was grossly overestimated.
But as we look at better optimizing our overall energy consumption, even a degree or two difference can make a big difference in our overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within their server family for both power consumption, as well as temperature reporting. With the direct reporting of this information, immediate and accurate information can be available and leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Typically, customers have over-cooled and over-powered. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
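To make the "energy buffer" point concrete, here is a minimal back-of-the-envelope sketch. It assumes a commonly cited rule of thumb (roughly 4% cooling-energy savings per degree Fahrenheit of setpoint increase) -- that figure, and the setpoints used, are illustrative assumptions, not values from IBM or from any specific datacenter.

```python
# Illustrative sketch: estimate fractional cooling-energy savings from
# raising the cooling setpoint once accurate inlet-temperature data shows
# the existing buffer is larger than needed.
# The ~4%-per-degree-F figure is an assumed rule of thumb, not a
# measured value.

def estimated_cooling_savings(current_setpoint_f, new_setpoint_f,
                              savings_per_degree=0.04):
    """Return the estimated fractional cooling-energy savings."""
    delta = new_setpoint_f - current_setpoint_f
    if delta <= 0:
        return 0.0  # no savings from lowering (or keeping) the setpoint
    return delta * savings_per_degree

# Example: accurate measurement shows we can safely raise the setpoint
# from 68F to 72F.
print(estimated_cooling_savings(68, 72))
```

The point is not the specific percentage but the mechanism: without accurate, timely measurement, nobody can justify shrinking the buffer at all.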
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information from its embedded Active Energy Manager component. The data can be thresholded with events generated automatically when measured values exceed expected values. Reports can be generated or the information can be visualized in an operations console.
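As a rough illustration of the thresholding idea described above -- this is a generic sketch, not the actual ITM for Energy Management or Active Energy Manager API -- readings that exceed an expected ceiling generate an "event" that a console or report could then consume. The metric names and limits are invented for the example.

```python
# Generic sketch of threshold-based event generation -- not the ITM or
# Active Energy Manager API. Metric names and limits are assumptions.

THRESHOLDS = {"power_watts": 450.0, "inlet_temp_c": 27.0}

def check_reading(server, metric, value):
    """Return an event dict if the value exceeds its threshold, else None."""
    limit = THRESHOLDS.get(metric)
    if limit is not None and value > limit:
        return {"server": server, "metric": metric,
                "value": value, "threshold": limit}
    return None

# Simulated measurements from two servers; two of the three exceed limits.
readings = [
    ("srv01", "power_watts", 480.0),
    ("srv01", "inlet_temp_c", 25.5),
    ("srv02", "inlet_temp_c", 28.1),
]
events = [e for e in (check_reading(*r) for r in readings) if e is not None]
print(len(events))
```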
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
Over the past couple of months, I have had the opportunity to talk with customers throughout the world. The common theme I see is how little we really have done to make our buildings truly smarter. Whether it's lack of awareness or simply lack of caring, we're wasting vast amounts of energy and manpower as we continue to manage buildings the same way we have for decades. We have so much more capability that we can now take advantage of, yet in many cases, our capabilities go untapped.
Here's a short video from the VERGE conference I spoke at in London earlier this year - enjoy -- Smarter Buildings VERGE Event
IBM last week reaffirmed its commitment to Watson and the value that Cognitive Computing will bring to our future. With IOT we're moving to an era of massive data, driven by orders-of-magnitude reductions in the cost of collecting and maintaining that data. But what do we have when we have lots of data? Well, the answer is simple -- "lots of data". With billions of devices potentially connected and interconnected, traditional approaches to leveraging operational data will become unmanageable. Why collect all of this data if we're not going to do something of value with it?
So what's the answer? Well, Cognitive Computing will become a key aspect of the value of IOT. Cognitive Computing will allow us to draw insights that would not have been possible in the past. We'll find the root cause of failures faster, allowing us to further prevent failures through operational insights provided by Cognitive techniques. We'll be able to turn "lots of data" into truly actionable insights, and provide them to technicians in a prescriptive form.
Think about how Watson has transformed the medical field. Medical personnel are now presented with a list of diagnoses, each with a probability, based on analysis of the symptoms. What prevents us from applying these same techniques to inanimate "things"? Why not leverage the wealth of data being collected from "things" to feed a Cognitive system that produces a list of potential failures, each with a confidence factor? More importantly, what if a Cognitive system provided insights into maintenance -- where maintenance operations become individualized based on operational environments -- potentially saving millions in unnecessary maintenance while also eliminating failures?
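The "ranked failures with confidence factors" idea can be sketched in miniature. This is a toy illustration, not Watson: the failure modes, symptom signatures, and scoring are invented for the example, whereas a real cognitive system would learn such associations from operational data rather than use a hand-written lookup.

```python
# Toy illustration of ranked diagnoses with confidence factors.
# Failure modes and symptom signatures are invented for this example.

FAILURE_SIGNATURES = {
    "fan_failure":     {"temp_rising", "fan_rpm_zero"},
    "psu_degradation": {"power_fluctuating", "voltage_low"},
    "airflow_blocked": {"temp_rising", "fan_rpm_high"},
}

def rank_diagnoses(observed):
    """Return (failure_mode, confidence) pairs, highest confidence first.

    Confidence here is just the share of matched symptoms -- a stand-in
    for the probabilities a trained model would produce.
    """
    scores = {mode: len(signature & observed)
              for mode, signature in FAILURE_SIGNATURES.items()}
    total = sum(scores.values()) or 1  # avoid division by zero
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(mode, score / total) for mode, score in ranked]

# A technician sees rising temperature and a stopped fan:
print(rank_diagnoses({"temp_rising", "fan_rpm_zero"}))
```

Even at this toy scale, the output mirrors the Watson pattern described above: a prescriptive, ordered list a technician can act on, rather than raw sensor data.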
We're truly at an inflection point in the possibilities and opportunities to revolutionize our future.