The Internet of Things - Rambling Thoughts by Jim Fletcher
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not serve the needs of improved energy management well. Unfortunately, in most datacenters, the team responsible for cooling and power and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of it.
As a result, there is no natural incentive to reduce overall power consumption unless some external factor, like the availability of power, comes into play. This "green organizational dysfunction" results in wasted spending on energy and in operational inefficiencies, given the limited integration between the multiple organizations responsible for the datacenter. Even when knowledge of power consumption does exist within the IT organization, I have yet to see a datacenter where the IT team is measured in any way on power consumption -- instead, availability and performance are the two measurements that matter.
So how can we expect energy-efficient datacenters if organizationally there is little focus and no incentives are provided to reduce spending on energy? That's the challenge organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and an occasional "incentive" from a C-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions -- sometimes approaching 40% -- have been achieved.
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately, this is exactly what it sounds like -- they walk the aisle with the back of their hand extended, and when they feel a warmer-than-normal area, that is an area to be looked at further. No, that's not exactly scientific, but it has worked for years. Likewise, power consumption was predetermined from manufacturers' specs, which generally means it was grossly overstated.
But as we look at better optimizing our overall energy consumption, even a degree or two of difference can have a big impact on our overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within its server family for both power consumption and temperature reporting. With the direct reporting of this information, immediate and accurate data can be made available and leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Customers have typically over-cooled and over-powered. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information from its embedded Active Energy Manager component. The data can be thresholded with events generated automatically when measured values exceed expected values. Reports can be generated or the information can be visualized in an operations console.
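The thresholding idea described above can be sketched in a few lines. This is a hypothetical illustration only, not the actual ITM or Active Energy Manager API; the metric names and limits are invented for the example.

```python
def check_thresholds(readings, thresholds):
    """Compare measured values against expected maximums and return
    an event record for each metric that exceeds its threshold."""
    events = []
    for metric, value in readings.items():
        limit = thresholds.get(metric)
        if limit is not None and value > limit:
            events.append({
                "metric": metric,
                "measured": value,
                "threshold": limit,
            })
    return events

# Example: one rack reports power draw and inlet temperature.
reading = {"power_watts": 412.0, "inlet_temp_c": 31.5}
limits = {"power_watts": 450.0, "inlet_temp_c": 27.0}
print(check_thresholds(reading, limits))  # flags only the temperature
```

In a real deployment the event records would be routed to the operations console or reporting layer rather than printed, but the comparison logic is the same.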
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
Last week I was on vacation in Orlando. The place where I stayed was highlighting all of the "greening" they had done: CFL bulbs everywhere, and now sponsoring recycling (even though we had to drive half a mile to recycle three aluminum cans). Wow -- I should have felt so good about this place ...
Well -- SHOULD is the right word. As nighttime came, I looked at my entrance door, and on all sides I could see light from the outside -- not just a little light, either. So as my air conditioner ran and ran, cooling the Florida landscape, I had to ask, "Is this place really serious about green?" While it's great to talk about green, it's time to be serious about BEING green instead ...
As I watch more attention be focused on the "green datacenter", I was amazed that I had not yet seen someone talk about a "Smarter Datacenter". A smarter datacenter would reflect the wide range of improvements that one can make within the datacenter, whether it be improved processes, more efficient equipment, facilities improvements, or virtualization. While many of these areas are not even mentioned under energy management solutions, they are all part of making a datacenter smarter, and a "Smarter Datacenter is a Greener Datacenter".
So as we look at quantifying the energy impact of datacenter improvements, it isn't just about more efficient servers or improved cooling -- it's an aggregation of all we do as we work to improve datacenter efficiency and thus reduce our overall energy impact.
As I look at many of today's buildings, I am somewhat amazed at how "unsmart" many (most??) of them are. Lights glow bright next to south-facing windows on a sunny afternoon -- heat and air conditioning run simultaneously in the same room -- fans blow and blow and blow for no apparent reason -- and that's just to name a few of the many energy-wasting items I have seen in just the last week.
Why would any building owner allow that to happen? There are many reasons, but two key ones are a lack of awareness of how much energy is being consumed that need not be, and, in too many cases, a lack of responsibility for the energy bill -- and therefore no direct interest in reducing that cost.
But what is changing? We're seeing an interest in instrumenting buildings -- collecting information from seemingly disparate systems, bringing that data together into a consolidated form, leveraging it to take action, and using analytics to predict future trends. It's not rocket science that is needed -- it's basic blocking and tackling -- and the financial return, and perhaps more importantly the ecological return, is something we all should start to care about. Money spent on energy is an expense to a company or individual; that same money could be used to expand the company and create new jobs by improving the bottom line.
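The "consolidate, then predict" pattern is simple enough to sketch. This is an illustration under invented names (the system feeds and units are made up), not any particular building management product: merge readings from separate systems into one record per timestamp, then fit a least-squares line to project the next reading.

```python
from collections import defaultdict

def consolidate(feeds):
    """feeds: {system_name: [(timestamp, value), ...]}
    Returns {timestamp: {system_name: value}} -- one merged
    record per timestamp across all building systems."""
    merged = defaultdict(dict)
    for system, readings in feeds.items():
        for ts, value in readings:
            merged[ts][system] = value
    return dict(merged)

def linear_forecast(values):
    """Fit a least-squares line over index positions and
    return the projected next value in the series."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return mean_y + slope * (n - mean_x)

feeds = {
    "hvac_kwh": [(1, 10.0), (2, 12.0), (3, 14.0)],
    "lighting_kwh": [(1, 4.0), (2, 4.1), (3, 4.2)],
}
merged = consolidate(feeds)
print(linear_forecast([10.0, 12.0, 14.0]))  # projects 16.0 for the next period
```

Real deployments would use far richer models, but even this level of trending is enough to flag a building whose consumption is drifting upward.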
How quickly things can change -- a year ago it looked like the US would follow Europe and we'd be legislated into a cap and trade economy. There was considerable talk about the implications this would have on prices, etc. What a difference a year makes: the change in Washington was followed almost immediately by the collapse of the Chicago Climate Exchange, an entity in which Al Gore had invested. So what is the future of cap and trade? Short-term death is a near certainty, but longer term, only time will tell.
The concept of cap and trade is interesting, to say the least. The idea is that those that are inefficient in their processes can buy credits from those that have found ways to be more efficient -- carbon credits become a currency of sorts -- and the hope is that the net effect is an improvement in overall emissions and environmental impact worldwide.
So let's see what comes next, and see how the US reacts as other countries continue on their legislative approach to cap and tax -- I mean, cap and trade.
As I watch the Jeopardy match between IBM's Watson and some of Jeopardy's smartest, I am quite impressed with the breadth of contextual data munging (that's a technical term) that Watson is able to accomplish, and with the confidence factor attached to the results. The ability to draw together seemingly unrelated pieces of information and determine the context and nuances around those contexts is quite impressive, and it sets the stage for what we can expect as we look into our crystal ball for the future.
If one draws parallels between what is being accomplished on Jeopardy and the future of our Smarter Planet initiatives, the role of analytics in pulling together seemingly disparate data and deriving correlations and recommendations from it will provide a significant level of intelligence around which to make ongoing improvements, or even to drive the development of new technologies.
Now if we could only tell Watson that Toronto is not a US City :-)
As I speak on Smart Grid panels and work with members of our utility companies, I have been somewhat amazed at where the utilities are drawing their line of demarcation. The meter is the end of the world from the utilities' perspective, and they appear to have no desire to look beyond the meter -- into the home or commercial building -- to better manage power and to understand more about the consumer's specific power usage. There are emerging companies providing intelligence on the consumer side of the meter, but the utilities continue to stand clear of that space. But why? The reasons are many, from security concerns around collecting and managing that level of data to the fact that many utilities are still regulated and therefore have little real incentive to pursue more intelligence at the endpoint.
So what does this mean for the utility? I love parallels, and I have to draw one between the smart grid boundaries and what I have seen with my Internet Service Provider. Many years ago, the ISP was one's first point of access whenever one connected to the internet. The ISP provided email and was the link to all the services one wanted on the net. But that is no longer true. I personally never access my ISP's web portal, instead preferring other portals, such as Google, as my link to the web. My ISP is still there, but it now has no ability to obtain additional value from my connection because it has become simply a pipeline to the internet.
It appears the utilities are setting themselves up to become simply "power providers" with minimal additional value to the consumer. For commercial buildings, solutions such as IBM's Intelligent Building Manager will be well positioned to talk to the "Smart Grid" while providing the intelligence needed to truly make a difference for the consumer. The utilities will continue to drive smart grid, but primarily for the producer side and the benefits they will see as they modernize the power distribution network. There is immeasurable value in that for the producers, and for consumers in the form of better availability, but much of the hype around the smart grid's two-way interaction with consumers is just that -- hype ...
Today was the beginning of a new era for IBM. We've been working with our industry partners to improve the energy and operational efficiency of buildings, and today we announced the availability of a bundled software solution that allows us to "listen to the building, and hear what it is telling us". From there, we use our analytics to predict problems before they occur, or to recognize them when they do, while providing a mashup-based dashboard to visualize the state of the monitored buildings.
Here's some press from the announcement:
IBM Unleashes Advanced Software Solution for Smarter Buildings
IBM formally introduces its Intelligent Building Management software today -- an advanced solution that's being put to work at Tulane University's School of Architecture, The Cloisters of the Metropolitan Museum of Art in New York, and the company's 35-building facility in Minnesota.
The software is designed to be an analytics and automation powerhouse that can help ramp up the environmental performance of any building, even ones that are 100 years old or more.
The product is the latest in a steady stream of solutions that IBM has unleashed in recent months to make the management of buildings, the energy and resources they use, and the transportation and virtual networks that connect them more efficient, more effective and more intelligent.
The software and its applications, which are being detailed today in an IBM Smarter Buildings Forum in New York, also are the results of the company's steadily increasing collaborative projects, partnerships and acquisitions -- all of which are aimed at positioning IBM as a dominant player in a nascent field that brings together IT, the built environment, vehicles and energy.
Here is an early look at the projects that will be featured during the forum:
While technology advancements in building management systems have made it possible to cull an immense amount of data on structures, the challenge has been to organize, analyze and present it swiftly to building owners and operators so they can proactively manage their properties -- as IBM Smarter Buildings Vice President David Bartlett said at GreenBiz Group's State of Green Business Forum this year.
The new software, which is supposed to be the most comprehensive product thus far in IBM's smarter buildings arsenal, is intended to address that need.
Earlier this week, IBM introduced its Intelligent Operations Center for Smarter Cities. The plug-and-play, smarter-cities-in-a-box solution is expected to deliver high-powered systems and network management capabilities to communities without the high price tag usually affixed to such technology.
Over the past couple of months, I have had the opportunity to talk with customers throughout the world. The common theme I see is how little we have really done to make our buildings truly smarter. Whether it's lack of awareness or simply lack of caring, we're wasting abundant amounts of energy and manpower as we continue to manage buildings the same way we have for decades. We have so much more capability that we can now take advantage of, yet in many cases our capabilities go untapped.
Here's a short video from the VERGE conference I spoke at in London earlier this year - enjoy -- Smarter Buildings VERGE Event
I had a great visit to Mesa Verde, Colorado last week. I saw firsthand an evolution of Smarter Buildings: over 800 years ago, people were constructing buildings that were energy efficient (OK, efficient for the materials they had). The buildings were initially constructed above ground, but over time they evolved into structures that leveraged the Earth's consistent ground temperature, dug into the ground at first only a foot or so but later up to 4 feet. The advantage was that the consistent ground temperature provided some level of consistency for the overall building temperature. The buildings also leveraged "free air cooling" through a hole dug adjacent to the building that tunneled to the lowest level of the pit, providing a fresh air source.
While many believe the Anasazi lived for centuries in cave dwellings, they only lived there for a few decades -- after the "pit houses". But once again, they leveraged their surroundings, now combining the Earth's temperature with water sources and rock overhangs as the basis of their buildings -- as well as the protection that a cave dwelling offers from an access perspective.
Oh well -- Enough of my rambling for now ...
As I sit in Sao Paulo, Brazil and look across the city, all I see is a maze of buildings for as far as I can see. Are these buildings "smarter"? My wager is that most (all?) of them are not. There are many ways to make a building smarter. Most articles I have read over the last few years have been about energy consumption. While energy consumption is an important lever for reducing the operational cost of a building, there are many additional ways to reduce costs while improving the satisfaction of a building's tenants.
Through the acquisition of TRIRIGA, IBM entered the Integrated Workplace Management System (IWMS) marketspace. IWMS is an industry term for the broader management of all aspects of a building, from financials and portfolio management to basic operations such as office key management. Now we can manage our portfolio and understand how to improve it. Is a lease about to expire, forcing a decision on whether to renew? Are we managing office space as efficiently as possible? And the building energy management solution that has shown significant success at Tulane University is now part of that same solution family.
So while energy management is definitely important, it's not the only way to make your buildings, and your building management, smarter -- and smarter means better.
For some time I have been talking about the need for, and the value of, introducing IT technologies into the operational space. The reality is that the operational space is becoming more IT-like in its technologies and implementations. Mechanical devices are being replaced by, or augmented with, intelligent processors and sensors. They are being connected (either physically or logically) to networks. These formerly isolated mechanical systems are becoming connected "IT" endpoints. They are now more susceptible to network attacks. They are now software devices and will need software updates, whether for maintenance or for new capabilities. With the replacement of the man with the clipboard, why limit readings to once a day -- why not once a second? Why not monitor the systems in near real time (both the endpoint and the collector)?
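Once an endpoint reports every second instead of once a day, even a trivial poller can catch changes a clipboard round never would. A minimal sketch, assuming a hypothetical sensor callable and an invented delta threshold:

```python
def poll(sensor, samples, delta_threshold):
    """Read sensor() `samples` times and report each consecutive
    pair of readings whose change exceeds `delta_threshold`."""
    alerts = []
    previous = None
    for _ in range(samples):
        value = sensor()
        if previous is not None and abs(value - previous) > delta_threshold:
            alerts.append((previous, value))
        previous = value
    return alerts

# Simulated temperature sensor: steady readings, then a sudden jump
# of the kind a once-a-day clipboard reading would likely miss.
stream = iter([21.0, 21.1, 21.0, 26.5, 26.4])
alerts = poll(lambda: next(stream), samples=5, delta_threshold=2.0)
print(alerts)  # [(21.0, 26.5)] -- only the jump is flagged
```

A real collector would poll on a timer and feed alerts into an event pipeline, but the point stands: the value of high-frequency readings is in the deltas, not the individual samples.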
The possibilities are endless and the challenges are many but the reality is that the convergence is happening and will drive a fundamental change in how we architect, operate, and maintain these systems.
Today is a great day for the datacenter. IBM and Emerson have announced a partnership which combines IBM's IT Service Management (ITSM) with Emerson's Trellis offering which was recently recognized as the industry leader in Data Center Infrastructure Management (DCIM). Gartner has said that the DCIM market is an estimated $450 million market today, and expected to grow to $1.7 billion by 2016.
But why all the excitement about this announcement? Anyone who has been in this industry recognizes that the datacenter has often been operated as a series of seemingly disconnected silos. One team manages power distribution, another manages the cooling infrastructure, another manages the physical placement of machines into racks, and so on. When an operational problem occurs, we fall back into that siloed mentality, with "twelve people on a bridge call" trying to determine where the problem actually originated. There is little automated root-cause analysis, and even less automated action across the silos. I recently heard of a major customer who had a chiller issue on a Sunday afternoon -- the IT team discovered the issue when applications began to fail because the servers were overheating.
Why weren't the systems connected? Why didn't a chiller failure signal the IT team, indicate which racks would be impacted, and perhaps automate an action to move the workloads or throttle down the servers until the chiller issue was resolved? The answer, unfortunately, is simply that the operational systems were not connected.
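The linkage in question can be sketched in a few lines. This is a hypothetical illustration of the cross-silo idea, not IBM's or Emerson's actual implementation; the chiller-to-rack topology and action names are invented for the example.

```python
# Invented facilities topology: which racks each chiller cools,
# and which servers sit in each rack.
CHILLER_TO_RACKS = {"chiller-2": ["rack-07", "rack-08"]}
RACK_TO_SERVERS = {"rack-07": ["srv-701", "srv-702"], "rack-08": ["srv-801"]}

def handle_facility_event(event):
    """Translate a chiller failure into per-server throttle actions
    by walking the chiller -> rack -> server topology."""
    if event["type"] != "chiller_failure":
        return []  # other facility events are out of scope here
    actions = []
    for rack in CHILLER_TO_RACKS.get(event["source"], []):
        for server in RACK_TO_SERVERS.get(rack, []):
            actions.append({"action": "throttle", "server": server, "rack": rack})
    return actions

print(handle_facility_event({"type": "chiller_failure", "source": "chiller-2"}))
```

The hard part in practice is not this lookup but keeping the topology data accurate, which is exactly the per-slot awareness a DCIM system is meant to provide.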
With the IBM-Emerson partnership, we've established a base system for interlocking power management, cooling management, and traditional IT management. We're now providing a system that enables automated awareness of each slot in a rack: What is in the slot? What is its power draw? What applications are running on the server in that slot? We're now getting information that can be integrated and leveraged to turn the datacenter into a "smarter datacenter".
Lots of great possibilities ...
On Wednesday of this week, I visited my local Sam's Club and purchased a tub of spinach, as I do every week it seems. But this week was unique. When reading my email on Thursday evening, I received an email from Sam's Club saying:
Dear Sam's Club Member:
Today, we were notified that Taylor Farms has initiated a Recall of its 1 lb Spinach product due to the potential presence of E. coli.
Taylor Farms has asked us to recall any of this product with a “Best if Use By” of 02-24-13.
Our records reflect that you may have purchased the 1 lb Spinach product with a UPC number of 0003022304780 and a “Best if Use By” of 02-24-13.
What could have been a widespread incident instead became a well-organized, targeted recall. Sam's had not only tracked its supply chain but had also associated the supply with the consumption. For all of you privacy folks: yes, Sam's did know that I purchased spinach and wine on Wednesday, but they were also able to notify me of a potential health threat immediately. Yet another example of what happens as we enable our infrastructure components to communicate.
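The targeting logic behind such a recall amounts to a join between purchase records and the recall notice. A minimal sketch, with record fields invented for the example (Sam's actual systems are obviously far more involved):

```python
def members_to_notify(purchases, recall):
    """Return the sorted member IDs whose purchases match both the
    recalled UPC and the recalled best-if-used-by date code."""
    return sorted({
        p["member_id"]
        for p in purchases
        if p["upc"] == recall["upc"] and p["best_by"] == recall["best_by"]
    })

purchases = [
    {"member_id": "M100", "upc": "0003022304780", "best_by": "02-24-13"},
    {"member_id": "M101", "upc": "0009999999999", "best_by": "02-24-13"},
    {"member_id": "M102", "upc": "0003022304780", "best_by": "02-24-13"},
]
recall = {"upc": "0003022304780", "best_by": "02-24-13"}
print(members_to_notify(purchases, recall))  # ['M100', 'M102']
```

Matching on both the UPC and the date code is what makes the recall targeted rather than a blanket notification to every spinach buyer.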