The Internet of Things - Rambling Thoughts by Jim Fletcher
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not serve the needs of improved energy management well. Unfortunately, in most datacenters, the team responsible for cooling and power and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of it.
As a result, there is no natural incentive to reduce overall power consumption, unless some external factor like availability of power comes into play. This "green organizational dysfunction" results in wasted spending on energy and in operational inefficiencies, given that there is also limited integration between the multiple organizations responsible for the datacenter. Even when knowledge about power consumption does exist within the IT organization, I have yet to see a datacenter where the IT team is measured in any way on power consumption -- instead, availability and performance are the two measurements that matter.
So how can we expect energy-efficient datacenters if, organizationally, there is little focus and no incentives are provided to reduce spending on energy? That's the challenge that organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and I am seeing an occasional "incentive" from a C-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions -- sometimes approaching 40% -- have been seen.
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately this is exactly what it says -- they walk the aisle with the back of their hand extended, and when they feel a warmer than normal area, that is an area to be looked at further. Well, yes, that's not exactly scientific, but it has worked for years. Likewise, power consumption was predetermined from manufacturers' specs, which generally means it was grossly overstated.
But as we look at better optimizing our overall energy consumption, even a degree or two of difference can make a big difference in our overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within its server family for both power consumption and temperature. With direct reporting, immediate and accurate information can be made available and leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Typically, customers have over-cooled and over-powered. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information from its embedded Active Energy Manager component. The data can be thresholded with events generated automatically when measured values exceed expected values. Reports can be generated or the information can be visualized in an operations console.
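To make this concrete, here's a minimal sketch of the idea in Python. To be clear, this is not the ITM or Active Energy Manager API -- the names and thresholds are invented for illustration -- but it shows the basic pattern of generating events when measured values exceed expected values:

```python
from dataclasses import dataclass

# Hypothetical thresholds -- real values would come from the rack's power
# budget and the facility's cooling design, not from this sketch.
POWER_LIMIT_WATTS = 350.0
INLET_TEMP_LIMIT_C = 27.0

@dataclass
class Reading:
    server: str
    power_watts: float
    inlet_temp_c: float

def check_thresholds(reading: Reading) -> list[str]:
    """Return alert events for any measured value exceeding its expected value."""
    events = []
    if reading.power_watts > POWER_LIMIT_WATTS:
        events.append(f"POWER alert: {reading.server} drawing {reading.power_watts:.0f} W "
                      f"(limit {POWER_LIMIT_WATTS:.0f} W)")
    if reading.inlet_temp_c > INLET_TEMP_LIMIT_C:
        events.append(f"TEMP alert: {reading.server} inlet at {reading.inlet_temp_c:.1f} C "
                      f"(limit {INLET_TEMP_LIMIT_C:.1f} C)")
    return events

# Example: one server running hot, one within limits.
for r in [Reading("srv-01", 410.0, 24.5), Reading("srv-02", 290.0, 23.0)]:
    for event in check_thresholds(r):
        print(event)
```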
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
As I've watched more attention being focused on the "green datacenter", I've been amazed that I had not yet seen someone talk about a "Smarter Datacenter". A smarter datacenter would reflect the wide range of improvements that one can make within the datacenter, whether through improved processes, more efficient equipment, facilities improvements, or virtualization. While many of these areas are not even mentioned under energy management solutions, they are all part of making a datacenter smarter, and a "Smarter Datacenter is a Greener Datacenter".
So as we look at quantifying the energy impact of datacenter improvements, it isn't just about more efficient servers or improved cooling -- it's an aggregation of all we do as we work to improve datacenter efficiency, and thus reduce our overall energy impact.
Today was the beginning of a new era for IBM -- we've been working with our industry partners to improve the energy and operational efficiency of buildings, and today we announced the availability of a bundled software solution that allows us to "listen to the building, and hear what it is telling us". From there, we use our analytics to predict problems before they occur, or recognize them when they do, while providing a mashup-based dashboard to visualize the state of the monitored buildings.
Here's some press from the announcement:
IBM Unleashes Advanced Software Solution for Smarter Buildings
IBM formally introduces its Intelligent Building Management software today -- an advanced solution that's being put to work at Tulane University's School of Architecture, The Cloisters of the Metropolitan Museum of Art in New York, and the company's 35-building facility in Minnesota.
The software is designed to be an analytics and automation powerhouse that can help ramp up the environmental performance of any building, even ones that are 100 years old or more.
The product is the latest in a steady stream of solutions that IBM has unleashed in recent months to make the management of buildings, the energy and resources they use, and the transportation and virtual networks that connect them more efficient, more effective and more intelligent.
The software and its applications, which are being detailed today in an IBM Smarter Buildings Forum in New York, also are the results of the company's steadily increasing collaborative projects, partnerships and acquisitions -- all of which are aimed at positioning IBM as a dominant player in a nascent field that brings together IT, the built environment, vehicles and energy.
Here is an early look at the projects that will be featured during the forum:
While technology advancements in building management systems have made it possible to cull an immense amount of data on structures, the challenge has been to organize, analyze and present it swiftly to building owners and operators so they can proactively manage their properties -- as IBM Smarter Buildings Vice President David Bartlett said at GreenBiz Group's State of Green Business Forum this year.
The new software, which is supposed to be the most comprehensive product thus far in IBM's smarter buildings arsenal, is intended to address that need.
Earlier this week, IBM introduced its Intelligent Operations Center for Smarter Cities. The plug-and-play, smarter-cities-in-a-box solution is expected to deliver high-powered systems and network management capabilities to communities without the high price tag that is usually affixed to such technology.
Today is a great day for the datacenter. IBM and Emerson have announced a partnership which combines IBM's IT Service Management (ITSM) with Emerson's Trellis offering, which was recently recognized as the industry leader in Data Center Infrastructure Management (DCIM). Gartner has said that the DCIM market is an estimated $450 million today, and is expected to grow to $1.7 billion by 2016.
But why all the excitement about this announcement? Anyone who has been in this industry recognizes that the datacenter has often been operated as a series of seemingly disconnected silos. One team manages power distribution, another manages the cooling infrastructure, another manages the physical placement of machines into racks, and so on. When an operational problem occurs, we fall back into that siloed mentality, with "twelve people on a bridge call" trying to determine where the problem actually originated. There is little automated "root cause" analysis, and even less automated action across the silos. I recently heard of a major customer who had a chiller issue on a Sunday afternoon -- the IT team discovered the issue only when applications began to fail because the servers were overheating.
Why weren't the systems connected? Why didn't a chiller failure signal the IT team, indicate which racks would be impacted, and perhaps trigger an automated action to move the workloads or throttle down the servers until the chiller issue was resolved? The answer, unfortunately, is simply that the operations of the systems were not connected.
With the IBM/Emerson partnership, we've established a base system for interlocking power management, cooling management, and traditional IT management. We're now providing a system that enables automated awareness of each slot in a rack -- what is in the rack? What is its power draw? What applications are running on the server in that slot? We're now getting information that can be integrated and leveraged to turn the datacenter into a "smarter datacenter".
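As a thought experiment, here's a rough sketch in Python of what that cross-silo awareness could look like. None of this is the actual IBM or Emerson interface -- the topology maps and actions are invented -- but it illustrates the chiller-to-rack-to-application chain described above:

```python
# Hypothetical facility topology: which racks each chiller cools, and which
# applications run in each rack. In a real DCIM system this mapping would
# come from the infrastructure inventory, not a hard-coded dict.
CHILLER_TO_RACKS = {"chiller-2": ["rack-07", "rack-08"]}
RACK_TO_APPS = {"rack-07": ["billing"], "rack-08": ["web-frontend", "reports"]}

def handle_chiller_failure(chiller_id: str) -> None:
    """Cross-silo reaction: find the impacted racks, then throttle or migrate."""
    racks = CHILLER_TO_RACKS.get(chiller_id, [])
    if not racks:
        print(f"No racks mapped to {chiller_id}; notify facilities only.")
        return
    for rack in racks:
        apps = RACK_TO_APPS.get(rack, [])
        # Placeholder actions -- a real system would call its virtualization
        # manager to migrate workloads and its power manager to cap the servers.
        print(f"{rack}: migrate {apps}, then throttle servers pending chiller repair")

handle_chiller_failure("chiller-2")
```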
Lots of great possibilities ...
Last week I was on vacation in Orlando. The place where I stayed was highlighting all of the "greening" they had done: CFL bulbs everywhere, and now sponsoring recycling (even though we had to drive 1/2 mile to recycle 3 aluminum cans). Wow -- I should have felt so good about this place ...
Well -- SHOULD is the right word ... as nighttime came, I looked at my entrance door, and on all sides I could see light from the outside -- not just a little light either. So as my air conditioner ran and ran, cooling the Florida landscape, I had to ask: "is this place really serious about green?" So while it's great to talk about green, it's time to be serious about BEING green instead ...
As I speak on Smart Grid panels and work with members of our utility companies, I have been somewhat amazed at where the utilities are drawing their line of demarcation. The meter is the end of the world from the utilities' perspective, and they appear to have no desire to look beyond the meter -- into the home or commercial building -- to better manage power and to understand more about the specific power usage of the consumer. There are emerging companies that are providing intelligence on the consumer side of the meter, but the utilities continue to stand clear of that space. But why? Well, the reasons are many, from security concerns around collecting and managing that level of data to the fact that many utilities are still regulated and therefore have little real incentive to pursue more intelligence at the endpoint.
So what does this mean to the utility? I love parallels and I have to draw a parallel between the smart grid boundaries and what I have seen with my Internet Service Provider. Many years ago, the ISP was one's first point of access whenever one connected to the internet. The ISP provided email, and was the link to all services one wanted on the net. But that is no longer true. I personally never access my ISP's web portal, instead preferring other portals such as Google as my link to the web. My ISP is still there, but they now have no ability to obtain additional value from my connection through them because they have become simply a pipeline to the internet.
It appears the utilities are setting themselves up to become simply a "power provider" with minimal additional value to the consumer. For commercial buildings, solutions such as IBM's Intelligent Building Manager will be well positioned to talk to the "Smart Grid" while providing the intelligence needed to truly make a difference for the consumer. The utilities will continue to drive smart grid, but primarily on the producer side, for the benefits that the utilities will see as they modernize the power distribution network. There is immeasurable value in that for the producers, and for consumers in the form of better availability, but much of the hype around the smart grid's two-way interaction with consumers is just that -- hype ....
On Wednesday of this week, I visited my local Sam's Club and purchased a tub of spinach, as I do every week it seems. But this week was unique. When reading my email on Thursday evening, I received an email from Sam's Club saying:
Dear Sam's Club Member:
Today, we were notified that Taylor Farms has initiated a Recall of its 1 lb Spinach product due to the potential presence of E. coli.
Taylor Farms has asked us to recall any of this product with a “Best if Used By” of 02-24-13.
Our records reflect that you may have purchased the 1 lb Spinach product with a UPC number of 0003022304780 and a “Best if Used By” of 02-24-13.
What could have been a widespread incident instead became a well-organized, targeted recall. Sam's had not only tracked its supply chain, but had also associated the supply with the consumption. For all of you privacy folks: yes, Sam's did know that I purchased spinach and wine on Wednesday, but they were also able to notify me of a potential health threat immediately. Yet another example of what happens as we enable our infrastructure components to communicate.
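The mechanics behind a targeted recall like this are straightforward once purchases are associated with members. Here's a rough sketch (the data structures and field names are invented for illustration; Sam's actual systems are certainly far more involved):

```python
from dataclasses import dataclass

@dataclass
class Purchase:
    member_email: str
    upc: str
    best_if_used_by: str  # the date code as printed on the package

# Hypothetical transaction log joining members to the items they purchased.
purchases = [
    Purchase("jim@example.com", "0003022304780", "02-24-13"),
    Purchase("ann@example.com", "0001112223334", "03-01-13"),
]

RECALL_UPC = "0003022304780"
RECALL_DATE = "02-24-13"

# Notify only members whose purchases match the recalled UPC and date code.
affected = {p.member_email for p in purchases
            if p.upc == RECALL_UPC and p.best_if_used_by == RECALL_DATE}
for email in sorted(affected):
    print(f"Send recall notice to {email}")
```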
For some time I have been talking about the need for, and value of, introducing IT technologies into the operational space. The reality is that the operational space is becoming more IT-like in its technologies and implementations. Mechanical devices are being replaced by, or augmented with, intelligent processors and sensors. They are being connected (either physically or logically) to networks. These formerly isolated mechanical systems are now becoming connected "IT" endpoints. They are now more susceptible to network attacks. They are now software devices, and will need software updates -- whether for maintenance or for new capabilities. And with the man with the clipboard replaced, why limit readings to once a day? Why not once a second? Why not monitor the systems in near real time (both the endpoint and the collector)?
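To illustrate just how small the technical leap is, here's a minimal sketch of once-a-second polling in Python, with a simulated sensor standing in for the real device (in practice the read would be a Modbus, BACnet, or REST call to the instrumented endpoint):

```python
import random
import time

def read_sensor() -> float:
    """Stand-in for a networked sensor read -- a real device would be queried
    over Modbus, BACnet, or a REST API rather than simulated."""
    return 21.0 + random.uniform(-0.5, 0.5)

def poll(interval_s: float = 1.0, samples: int = 5) -> None:
    """Poll at roughly 1 Hz instead of once a day with a clipboard."""
    for _ in range(samples):
        value = read_sensor()
        print(f"{time.strftime('%H:%M:%S')} reading={value:.2f}")
        time.sleep(interval_s)

poll()
```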
The possibilities are endless and the challenges are many but the reality is that the convergence is happening and will drive a fundamental change in how we architect, operate, and maintain these systems.
I recently did a presentation at our Pulse Conference on Smarter Infrastructure -- what it is and how it will impact us.
The world is fast becoming smarter -- sensors everywhere, huge masses of new data sources, the need for rapid yet informed decision making, optimized processes -- the list goes on and on. How will we make this happen? How will we maximize our return? How will we turn these masses of new data into actionable information? Regardless of the industry you are in, and regardless of whether you are in the public or private sector, the evolution to a Smarter Infrastructure is real and will impact all aspects of our lives.
Here's a link to the video:
[Embedded video]
Building Information Modeling (BIM) is changing how "things" are designed. We're moving away from 2D designs, on real and digital paper, to 3D designs that we can look at, change, and "virtually construct". It's unfortunate that BIM includes the word "building", because so many people think of it as only for "buildings", when it really applies to any "thing" we want to construct. BIM can be used to represent the finished building, but it can equally be used to decompose a complex asset like a pump and virtually "look inside".
Even Wikipedia errs on the side of "buildings" with its definition of BIM saying:
Building Information Modeling (BIM) is a process involving the generation and management of digital representations of physical and functional characteristics of places. Building Information Models (BIMs) are files (often but not always in proprietary formats and containing proprietary data) which can be exchanged or networked to support decision-making about a place. Current BIM software is used by individuals, businesses and government authorities who plan, design, construct, operate and maintain diverse physical infrastructures, from water, wastewater, electricity, gas, refuse and communication utilities to roads, bridges and ports, from houses, apartments, schools and shops to offices, factories, warehouses and prisons, etc.
With the ability to digitally represent assets and even their subcomponents, the process of "handover" or "commissioning" can now change from one of complex manual (and error-prone) tasks to one of automated integration of the design and operations phases. In fact, if we approached this the way we should, we'd actually integrate the assets into the asset management system as they come online, and at commissioning the operator would have full knowledge of the asset's history and maintenance. We'd leverage the BIM models created to construct the building, maintain them during operations, and integrate them into the operations process.
We've enabled the import of the BIM model into Maximo and the ability to incrementally update the information as changes occur. It's now possible to leverage the BIM model throughout the lifecycle of the asset. We can visualize the asset in 3D, drill into it, rotate it, and look "inside" -- all through the power of BIM integration.
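Without reproducing the actual Maximo import (the structures below are invented for illustration), the incremental-update idea boils down to a diff between the BIM model's asset list and what the asset registry currently holds:

```python
# Hypothetical: assets extracted from a BIM model export, keyed by a stable
# element ID, versus what the asset management system currently holds.
bim_assets = {
    "pump-101": {"type": "pump", "floor": 2},
    "ahu-07":   {"type": "air-handler", "floor": 1},
}
registry = {
    "pump-101": {"type": "pump", "floor": 1},  # moved since the last import
}

def sync(bim: dict, reg: dict) -> None:
    """Apply only the changes: add new assets, update changed ones."""
    for asset_id, attrs in bim.items():
        if asset_id not in reg:
            reg[asset_id] = attrs
            print(f"added {asset_id}: {attrs}")
        elif reg[asset_id] != attrs:
            reg[asset_id] = attrs
            print(f"updated {asset_id}: {attrs}")

sync(bim_assets, registry)
```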
Nothing like titling a blog entry with three TLAs (that's three three-letter acronyms). But when I look at technologies driving change, these two will drive a 180-degree change in how disciplines like Smarter Buildings and Enterprise Asset Management evolve.
Building Information Modeling (BIM) allows us to define, operate upon, and visualize "anything" ... and visualize it in context of everything around it. The Internet of Things (IOT) will allow us to collect real time operational information about everything ... and aggregate that information so that I understand the relationship between the "things" and derive patterns about how the things interoperate.
So now what do I have? By integrating the ability to accurately visualize an asset -- drill in, rotate, look inside, and so on -- with an understanding of how things are operating now, how they operated in the past, and how they will operate in the future, all without ever having to physically touch the "thing", we're enabling an endless range of possibilities to improve operations, improve health and safety, and forever change how we do our jobs ...
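One way to picture the BIM-plus-IOT combination is as a join between live telemetry and model elements. Here's a hypothetical sketch -- not a product interface, and with an invented threshold -- of attaching sensor readings to a BIM element so an alert can lead straight to the 3D view of the asset:

```python
# Hypothetical BIM element metadata, plus a stream of sensor readings
# tagged with the element they belong to.
bim_elements = {"pump-101": {"name": "Chilled water pump", "floor": 2}}
telemetry = [
    {"element": "pump-101", "metric": "vibration_mm_s", "value": 7.8},
    {"element": "pump-101", "metric": "vibration_mm_s", "value": 8.4},
]

VIBRATION_LIMIT = 8.0  # invented threshold, for illustration only

# Attach each reading to its model element so an operator can click from an
# alert straight into the 3D view of the affected asset.
for sample in telemetry:
    meta = bim_elements.get(sample["element"], {})
    if sample["metric"] == "vibration_mm_s" and sample["value"] > VIBRATION_LIMIT:
        print(f"ALERT {meta.get('name')} (floor {meta.get('floor')}): "
              f"vibration {sample['value']} mm/s exceeds {VIBRATION_LIMIT}")
```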
Exciting times to come ....
How quickly things can change -- a year ago, it looked like the US would follow Europe and we'd be legislated into a cap-and-trade economy. There was considerable talk about the implications this would have on prices, etc. What a difference a year makes: the change in Washington was followed almost immediately by the collapse of the Chicago Climate Exchange, an entity in which Al Gore has invested. So what is the future of cap and trade? Short-term death is a near certainty, but longer term, only time will tell.
The concept of cap and trade is interesting, to say the least. The idea is that those that are inefficient in their processes can buy credits from those that have found ways to be more efficient -- carbon credits become a currency of sorts -- with the hope that the net result is an improvement in overall emissions and environmental impact.
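A toy example of the arithmetic (all numbers invented): under a shared cap, an efficient firm sells its surplus allowances to an inefficient one, and total emissions still land within the combined cap:

```python
# Invented numbers: each firm gets a 100-ton allowance under the cap.
ALLOWANCE_TONS = 100
CREDIT_PRICE = 15.0  # dollars per ton, hypothetical

emissions = {"firm_a": 80, "firm_b": 120}  # firm_a is the efficient one

surplus_a = ALLOWANCE_TONS - emissions["firm_a"]   # 20 tons to sell
deficit_b = emissions["firm_b"] - ALLOWANCE_TONS   # 20 tons to buy

traded = min(surplus_a, deficit_b)
print(f"firm_a sells {traded} tons to firm_b for ${traded * CREDIT_PRICE:.0f}")
# Total emissions (200 tons) stay within the combined cap (200 tons);
# the trade simply reallocates who may emit.
print(f"total emissions: {sum(emissions.values())} vs cap {2 * ALLOWANCE_TONS}")
```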
So let's see what comes next, and see how the US reacts as other countries continue on their legislative approach to cap and tax -- I mean, cap and trade.
As I watch the Jeopardy match between IBM's Watson and some of Jeopardy's smartest I am quite impressed with the breadth of contextual data munging (that's a technical term) that Watson is able to accomplish and the confidence factor for the results. The ability to draw together seemingly unrelated pieces of information, and determine the context and nuances around those contexts is quite impressive, and sets the stage for what we can expect as we look into our crystal ball for the future.
If one draws parallels between what is being accomplished on Jeopardy and the future of our Smarter Planet initiatives, the role of analytics in pulling together seemingly disparate data and deriving correlations and recommendations will provide a significant level of intelligence around which to make ongoing improvements, or even to drive the development of new technologies.
Now if we could only tell Watson that Toronto is not a US City :-)
I had a great visit to Mesa Verde, Colorado last week. I saw firsthand an early evolution of Smarter Buildings: over 800 years ago, people were constructing buildings that were energy efficient (OK, efficient for the materials they had). The buildings were initially constructed above ground, but over time they evolved to take advantage of the Earth's consistent ground temperature by being dug into the ground -- initially only a foot or so, but later up to 4 feet. The advantage was that the consistent ground temperature provided some level of consistency for the overall building temperature. The buildings also leveraged "free air cooling" through a hole dug adjacent to the building that tunneled to the lowest level of the pit, providing a fresh air source for the building.
While many believe the Anasazi lived for centuries in cave dwellings, they only lived there for a few decades -- after the "pit houses". But once again, they leveraged their surroundings, using the Earth's temperature, now combined with water sources and rock overhangs, as the basis for their buildings -- as well as for the protection that a cave dwelling offers from an access perspective.
Oh well -- Enough of my rambling for now ...