Today was the beginning of a new era for IBM - we've been working with our industry partners to improve the energy and operational efficiency of buildings, and today we announced the availability of a bundled software solution that allows us to "listen to the building, and hear what it is telling us". From there, we use our analytics to predict problems before they occur, or to recognize them as they occur, while providing a mashup-based dashboard to visualize the state of the monitored buildings.
Here's some press from the announcement:
IBM Unleashes Advanced Software Solution for Smarter Buildings
IBM formally introduces its Intelligent Building Management software today -- an advanced solution that's being put to work at Tulane University's School of Architecture, The Cloisters of the Metropolitan Museum of Art in New York, and the company's 35-building facility in Minnesota.
The software is designed to be an analytics and automation powerhouse that can help ramp up the environmental performance of any building, even ones that are 100 years old or more.
The product is the latest in a steady stream of solutions that IBM has unleashed in recent months to make the management of buildings, the energy and resources they use, and the transportation and virtual networks that connect them more efficient and more effective.
The software and its applications, which are being detailed today in an IBM Smarter Buildings Forum in New York, also are the results of the company's steadily increasing collaborative projects, partnerships and acquisitions -- all of which are aimed at positioning IBM as a dominant player in a nascent field that brings together IT, the built environment and vehicles.
GreenerBuildings.com Executive Editor Rob Watson is the kickoff speaker at the forum today and GreenBiz.com Senior Writer Adam Aston will provide on-scene coverage of the event.
Here is an early look at the projects that will be featured during the forum:
Smarter buildings can help owners and operators cut energy use by as much as 40 percent and reduce maintenance costs by 10 to 30 percent, according to IBM.
- Tulane University in New Orleans is using the software to transform the historic century-old Richardson Memorial Hall (pictured above), home of the Tulane School of Architecture, into what IBM is calling a "smarter-building living laboratory." Johnson Controls is a partner in the project.
- The Cloisters, the branch of the Metropolitan Museum that houses 3,000 works of European medieval art, is using IBM software and its wireless environmental sensor network, called the Lower-Power Mote, to preserve the collection.
- IBM's facility in Rochester, Minn., is realizing further energy savings using the Intelligent Building Management system. The complex of 35 interconnected buildings that make up the 3.2-million-square-foot manufacturing and development facility has undergone several waves of efficiency improvements since the site opened in 1956 with a half million square feet of workspace. The company said it's achieving further year-over-year incremental energy savings, as well as savings in equipment operating costs, with the new software.
While technology advancements in building management systems have made it possible to cull an immense amount of data on structures, the challenge has been to organize, analyze and present it swiftly to building owners and operators so they can proactively manage their properties -- as IBM Smarter Buildings Vice President David Bartlett said at GreenBiz Group's State of Green Business Forum this year.
The new software, which is supposed to be the most comprehensive product thus far in IBM's smarter buildings arsenal, is intended to address that need.
Earlier this week, IBM introduced its Intelligent Operations Center for Smarter Cities. The plug-and-play, smarter-cities-in-a-box solution is expected to deliver high-powered systems and network management capabilities to communities without the high price tag that is usually attached to such technology.
No, this isn't an answer from Jeopardy ... the recent announcement between IBM and Apple is a key indicator of where the enterprise world is headed. We're moving from fixed-location desktop systems to mobile systems. When we asked one of our transportation customers what they saw as their preferred device in the future, they indicated that tablets (aka iPads) will be the norm. But why? This is where the Internet of Things ties into the picture.
Much of the operational information we collect today (when we do collect it) is manual, delayed, or disconnected from the management system. As we move to an "everything is connected" world, we'll now have the data to drive analytics that recommend and deliver actionable information to these devices, so that actions can be taken before problems ever happen. That means the ability to aggregate information from a range of systems, easily and intuitively, and to make that information available to the appropriate person in near real time.
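As a rough sketch of that aggregation idea -- collect readings from disparate systems, fold them into one view, and surface anything outside expected bounds -- here is a minimal illustration. The system names, metrics, and limits are all assumptions for the example, not from any IBM product:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Reading:
    system: str   # e.g. "HVAC", "lighting" (hypothetical system names)
    metric: str   # e.g. "temp_f", "kw"
    value: float

def aggregate(readings: List[Reading]) -> Dict[Tuple[str, str], float]:
    """Fold readings from many systems into one view; later readings win."""
    latest: Dict[Tuple[str, str], float] = {}
    for r in readings:
        latest[(r.system, r.metric)] = r.value
    return latest

def actionable(latest: Dict[Tuple[str, str], float],
               limits: Dict[Tuple[str, str], Tuple[float, float]]) -> List[str]:
    """Return a human-readable action for every metric outside its expected range."""
    actions = []
    for (system, metric), value in latest.items():
        lo, hi = limits.get((system, metric), (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            actions.append(f"{system}/{metric} at {value} is outside [{lo}, {hi}]")
    return actions
```

In a real deployment the readings would stream in continuously and the actions would be pushed to a mobile device; here the point is simply that aggregation plus simple bounds already turns raw data into something a person can act on.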
So Apple's devices, coupled with IBM's enterprise and big data skills, coupled with the emerging Internet of Things' ability to collect data from "any thing", is a great combination ....
Last week I was on vacation in Orlando. The place where I stayed was highlighting all of the "greening" they had done. CFL bulbs everywhere, and now they sponsor recycling (even though we had to drive half a mile to recycle 3 aluminum cans). Wow -- I should have felt so good about this place ...
Well -- SHOULD is the right word ... as nighttime came, I looked at my entrance door, and on all sides I could see light from the outside -- not just a little light either. So as my air conditioner ran and ran, cooling the Florida landscape, I had to ask, "is this place really serious about green?" So while it's great to talk about green, it's time to be serious about BEING green instead ...
Nothing like titling a blog entry with three TLAs (that's three three-letter acronyms). But when I look at technologies driving change, these two will drive a 180-degree change in how disciplines like Smarter Buildings and Enterprise Asset Management evolve.
Building Information Modeling (BIM) allows us to define, operate upon, and visualize "anything" ... and visualize it in context of everything around it. The Internet of Things (IOT) will allow us to collect real-time operational information about everything ... and aggregate that information so that we understand the relationships between the "things" and derive patterns about how they interoperate.
So now what do we have? By integrating the ability to accurately visualize, drill in, rotate, and look inside with the ability to understand how things are operating now and how they operated in the past, and to predict how they will operate in the future -- all without ever having to physically touch the "thing" -- we're enabling an infinite realm of possibilities to improve operations, improve health and safety, and forever change how we do our jobs ...
Exciting times to come ....
I had the opportunity to speak at the 59th annual European Union of Electrical Wholesalers conference in Copenhagen, Denmark. I was there to talk about Smarter Buildings and to point out that a Smarter Building is determined by much more than just direct energy consumption. Overall operational efficiency, effective lease management, effective space management, project management, and maintenance procedures can all impact the "indirect" energy cost of the building. The message resonated well with the audience, and they appeared to agree with my thoughts.
But perhaps what was more intriguing was the parallel changes that need to occur in their industry as are occurring in the IT industry. They are moving from a model of "sell a component" to a model where they need to establish and maintain a long term relationship with the client through value-add services. The "things" themselves will evolve substantially as they take on intelligence, become connected, and become "alive" through new functionality provided through software/firmware updates to the object whether it be something as simple as a light, or as complex as an HVAC system.
Gartner says "The Internet of Things (IoT) is expected to grow to 26 billion things by the year 2020, representing an almost 30x increase over the 900 million things connected in 2009." While many of these "things" will be consumer devices, there will be a large number that are evolving building elements. The ability to connect and dynamically change has the potential to totally change how wiring is done, for example, with the introduction of "wireless network connected" switches instead of the traditional hardwired approaches used today.
The parallel to IT comes in the movement to continuing services and adaptation of systems. There will be considerable opportunity to manage, update, and improve efficiency through the acquisition of data from these "things", applying domain specific analytics, and then driving change dynamically. Truly intelligent -- and somewhat living "things" ... for IT, this means continuous delivery and Software as a Service -- for "thing" developers, this means making "things" have the ability to "become smart" and then driving continual change into those "things" as part of an evolving business model.
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not represent well the needs for improved energy management. Unfortunately, in most datacenters, the team responsible for cooling and power, and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of the power bill.
As a result, there is no natural incentive to reduce overall power consumption, unless some external factor like availability of power comes into play. This "green organizational dysfunction" results in wasted spending on energy, and operational inefficiencies, given that there is also limited integration between the multiple organizations responsible for the datacenter. Even when knowledge does exist within the IT organization with regard to power consumption, I have yet to see a datacenter where the IT team is measured in any way on power consumption - instead, availability and performance are the two measurements that matter.
So how can we expect energy-efficient datacenters if organizationally there is little focus, and no incentives are provided to reduce spending on energy? That's the challenge that organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and I am seeing an occasional "incentive" from the C-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions -- sometimes approaching 40% -- have been seen.
As I am sure many of you have heard by now, IBM is making a major investment in the Internet of Things (IOT). On Thursday April 9th, IBM will have a web streamed event around our announcement and what it will mean to our customers and the industry. Here's a link to the live event: IBM IOT Event Link.
With its strong capabilities in streaming and historical analytics, its IOT Foundation for data collection, its breadth of database capabilities, and its Bluemix and Cloud solutions, IBM is well positioned as a major player for the Internet of Things. Couple that with the industry expertise that IBM has built over the years in areas such as Asset Management with Maximo, Smarter Cities, and Smarter Buildings. As real-time information becomes available, so many aspects of how our clients do their jobs will forever change.
And the change will start with engineering. Engineering and operations will digitally integrate. Rich visualizations, coupled with augmented reality capabilities, will improve safety, reduce maintenance, and enable workers to better perform their roles.
It's an exciting time, and I am so glad to be in the middle of the IOT revolution.
Building Information Modeling (BIM) is changing how "things" are designed. We're moving away from 2D real and digital paper to 3D designs that we can look at, change, and "virtually construct". It's unfortunate that BIM is called "building", because so many people think of it as only for "buildings", but it really applies to any "thing" we want to construct. BIM can be used to represent the finished building, but can equally be used to decompose a complex asset like a pump to virtually "look inside".
Even Wikipedia errs on the side of "buildings" with its definition of BIM saying:
Building Information Modeling (BIM) is a process involving the generation and management of digital representations of physical and functional characteristics of places. Building Information Models (BIMs) are files (often but not always in proprietary formats and containing proprietary data) which can be exchanged or networked to support decision-making about a place. Current BIM software is used by individuals, businesses and government authorities who plan, design, construct, operate and maintain diverse physical infrastructures, from water, wastewater, electricity, gas, refuse and communication utilities to roads, bridges and ports, from houses, apartments, schools and shops to offices, factories, warehouses and prisons, etc.
With the ability to digitally represent assets and even their subcomponents, the process of "handover" or "commissioning" is now able to change from one of complex manual (and error prone) tasks to one of automated integration of the design and operations phases. In fact, if we approached this the way it should be, we'd actually integrate the assets into the asset management system as they come online, and at commissioning the operator would have full knowledge of the asset history and maintenance. We'd leverage the BIM models created to construct the building, maintain them during operations, and integrate them into the operations process.
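A minimal sketch of that commissioning flow -- assets from a BIM export registered into an operations registry as they come online, so the operator starts with the asset's history rather than re-keying data by hand. The registry class and BIM record format here are assumptions for illustration; this is not Maximo's actual API:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Asset:
    asset_id: str
    name: str
    location: str
    history: List[str] = field(default_factory=list)

class AssetRegistry:
    """Hypothetical operations-side registry populated from BIM export records."""

    def __init__(self):
        self._assets: Dict[str, Asset] = {}

    def commission(self, bim_record: dict) -> Asset:
        """Create the asset on first sight, or update it on later BIM changes."""
        asset = self._assets.get(bim_record["id"])
        if asset is None:
            asset = Asset(bim_record["id"], bim_record["name"], bim_record["location"])
            self._assets[bim_record["id"]] = asset
        # Every commissioning event is kept, so the operator inherits full history.
        asset.history.append(f"commissioned/updated at {bim_record['location']}")
        return asset
```

The point of the sketch is the shape of the integration: the design-phase model is the system of record, and operations inherits it incrementally instead of through a one-time, error-prone handover.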
We've enabled the import of the BIM model into Maximo and the ability to incrementally update the information as changes occur. It's now possible to leverage the BIM model throughout the lifecycle of the asset. We can visualize the asset in 3D, drill into it, rotate it, and look "inside", all through the power of BIM integration.
The non-tech media has finally caught wind of the Internet of Things (IOT) and is highlighting what happens when you don't secure your solution. They are showing cars that have their braking and steering systems controlled remotely. They are talking about home automation systems being hacked with lights and heating being remotely controlled by the hacker. They are talking about home appliances being turned on and off without the homeowner intervening. Just today USA Today featured an article on Congress wanting to hold special workgroups around the area of IOT security -- USA Today Article
So while this makes for great news stories, the need for security in any connected solution has always been critical. In this modern connected world, the ability to hack in and take control will happen unless we do something to prevent it. Any IOT solution requires security at the connection level to assure that the control "pipe" is not hijacked and used for unwanted purposes. IOT security is also needed at "the thing" to assure that it cannot be taken over and controlled by an undesired source, such as a hacker. A good IOT solution would also place bounds on what "the thing" is allowed to do, and define actions to take when "things" are driven outside of normal bounds. For example, who would preheat their oven for 8 hours? The "smart oven" would reset itself if preheated for too long.
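The oven example can be sketched as a device-side watchdog: the "thing" enforces its own sanity limits, so even a hijacked command stream cannot run it outside safe bounds. The class name, timer mechanics, and 60-minute limit are all assumptions for illustration, not any real product's behavior:

```python
class SmartOven:
    """Hypothetical bounded 'thing': local limits override remote commands."""

    MAX_PREHEAT_MINUTES = 60  # assumed safety bound for the example

    def __init__(self):
        self.heating = False
        self.minutes_heating = 0

    def preheat(self):
        """Start preheating -- could be triggered remotely, even maliciously."""
        self.heating = True
        self.minutes_heating = 0

    def tick(self, minutes: int = 1):
        """Called by the device's own timer; resets if preheating runs too long."""
        if self.heating:
            self.minutes_heating += minutes
            if self.minutes_heating > self.MAX_PREHEAT_MINUTES:
                self.heating = False  # the local bound wins, whatever the network says
```

The design point is that the bound lives on the device itself, not in the cloud service, so it still holds when the control channel is the thing that has been compromised.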
But there is also a new class of security needed for many of those "things": location. Is the "thing" where you thought it was? Has it been moved to a different location, so that the stoplight you thought you were managing is no longer where you thought it was? It might sound far-fetched, but think of the implications.
The net is that we have to be concerned with security with any intelligent and connected solution. Whether being hacked with a USB device introducing malware, hijacking the device over the network, or simply signing in with a weak password (perhaps the biggest concern) the impact is real and we have to assure whatever we build, that security is a major focus of the solution.
IBM last week reaffirmed its commitment to Watson and the value that Cognitive Computing will bring to our future. With IOT we're moving to an era of massive data, driven by orders of magnitude reduction in the cost of collecting and maintaining that data. But what do we have when we have lots of data -- well the answer is simple -- "lots of data". With billions of devices potentially connected and interconnected, traditional approaches to leveraging operational data will become unmanageable. Why collect all of this data if we're not going to do something of value with it?
So what's the answer? Well, Cognitive Computing will become a key aspect of the value for IOT. Cognitive Computing will allow us to draw insights that would not have been possible in the past. We'll find the root cause of failures faster, allowing us to further prevent failures through operational insights provided by Cognitive techniques. We'll be able to turn "lots of data" into truly actionable insights, and provide them to technicians in a prescriptive form.
Think about how Watson has transformed the medical field. Medical personnel are now presented with a list of diagnoses, each with a probability, based on analysis of the symptoms. What prevents us from applying these same techniques to inanimate "things"? Why not leverage the wealth of data being collected from "things" to feed a Cognitive system that produces a list of potential failures with a confidence factor for each? More importantly, what if a Cognitive system provided insights into maintenance -- where maintenance operations become individualized based on operational environments, potentially saving millions in unnecessary maintenance while also eliminating failures?
We're truly at an inflection point in the possibilities and opportunities to revolutionize our future.
As I speak on Smart Grid panels and work with members of our utility companies, I have been somewhat amazed at where the utilities are drawing their line of demarcation. The meter is the end of the world from the utilities' perspective, and they appear to have no desire to look beyond the meter -- into the home or commercial building -- to better manage power and to understand more about the specific power usage of the consumer. There are emerging companies that are providing intelligence on the consumer side of the meter, but the utilities continue to stand clear of that space. But why? Well, the reasons are many, from security concerns around collecting and managing that level of data to the fact that many utilities are still regulated and therefore have little real incentive to pursue more intelligence at the endpoint.
So what does this mean to the utility? I love parallels and I have to draw a parallel between the smart grid boundaries and what I have seen with my Internet Service Provider. Many years ago, the ISP was one's first point of access whenever one connected to the internet. The ISP provided email, and was the link to all services one wanted on the net. But that is no longer true. I personally never access my ISP's web portal, instead preferring other portals such as Google as my link to the web. My ISP is still there, but they now have no ability to obtain additional value from my connection through them because they have become simply a pipeline to the internet.
It appears the utilities are setting themselves up to become simply a "power provider" with minimal additional value to the consumer. For commercial buildings, solutions such as IBM's Intelligent Building Manager will be well positioned to talk to the "Smart Grid" while providing the intelligence needed to truly make a difference for the consumer. The utilities will continue to drive smart grid, but primarily for the producer side and the benefits that the utilities will see as they modernize the power distribution network. There is immeasurable value in that for the producers, and for consumers in the form of better availability, etc., but much of the hype around the smart grid's two-way interaction with consumers is just that -- hype ....
As I watch the Jeopardy match between IBM's Watson and some of Jeopardy's smartest I am quite impressed with the breadth of contextual data munging (that's a technical term) that Watson is able to accomplish and the confidence factor for the results. The ability to draw together seemingly unrelated pieces of information, and determine the context and nuances around those contexts is quite impressive, and sets the stage for what we can expect as we look into our crystal ball for the future.
If one draws parallels between what is being accomplished on Jeopardy and the future of our Smarter Planet initiatives, the role of analytics in pulling together seemingly disparate data and deriving correlations and recommendations from it will provide a significant level of intelligence around which to make ongoing improvements, or even to drive the development of new technologies.
Now if we could only tell Watson that Toronto is not a US City :-)
I found the CES announcement from GM and Audi that they are going to provide LTE connectivity in the cars to be very interesting. If you look back 15 years ago, GM provided its OnStar service leveraging analog cell service that was "embedded" in the car. Some of you may remember a few years ago, when the cellular carriers decided to sunset their analog offerings, and GM (and more importantly its customers) were stuck with mobile connections that no longer functioned. Given the lifecycle for a car versus the rapid evolution of mobile services, speeds, technologies, and protocols, why aren't we going to see this "technology sunsetting" issue repeat? How will consumers have flexibility in pricing and in carrier selection if the auto manufacturer controls that selection?
The model of tethered connectivity leveraging one's personal device connected to the car via Bluetooth has shown rapid success and acceptance. It provides the ability for consumers to have choice in providers, plans, and capabilities. What it doesn't provide is "lock in" to a specific carrier for the car manufacturer or carrier ...
It will be interesting to see how this all plays out .... my bet is that a lock in model will fail -- only time will tell ...
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately, this is exactly what it sounds like -- they walk the aisle with the back of their hand extended, and when they feel a warmer-than-normal area, that is an area to be looked at further. Well, yes, that's not exactly scientific, but it has worked for years. Likewise, power consumption was pre-determined from manufacturers' specs, which generally means it was grossly overstated.
But as we look at better optimizing our overall energy consumption, even a degree or two difference can make a big difference in our overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within their server family for both power consumption, as well as temperature reporting. With the direct reporting of this information, immediate and accurate information can be available and leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Typically customers have over-cooled and over-powered. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information from its embedded Active Energy Manager component. The data can be thresholded with events generated automatically when measured values exceed expected values. Reports can be generated or the information can be visualized in an operations console.
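The thresholding-with-events pattern described above can be sketched in a few lines. ITM for Energy Management and Active Energy Manager are the real products; this code is purely illustrative of the pattern, with made-up sensor names and limits:

```python
from typing import Dict, List

def check_thresholds(samples: Dict[str, float],
                     thresholds: Dict[str, float]) -> List[dict]:
    """Generate an event for every measurement exceeding its expected value.

    samples:    latest measured values keyed by sensor name
    thresholds: expected maximum for each sensor (sensors without a
                threshold are simply monitored, never alerted on)
    """
    events = []
    for sensor, value in samples.items():
        limit = thresholds.get(sensor)
        if limit is not None and value > limit:
            events.append({"sensor": sensor, "value": value, "limit": limit})
    return events
```

Fed with per-rack temperature and power readings, the resulting events are what would be forwarded to the operations console or reporting layer; the value is that the comparison runs on every sample instead of once a day with a clipboard.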
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
For some time I have been talking about the need for, and value of, the introduction of IT technologies into the operations space. The reality is that the operational space is becoming more IT-like in its technologies and implementations. Mechanical devices are being replaced by or augmented with intelligent processors and sensors. They are being connected (either physically or logically) to networks. These formerly isolated mechanical systems are now becoming connected "IT" endpoints. They are now more susceptible to network attacks. They are now software devices, and will need software updates -- whether for maintenance or for new capabilities. With the replacement of the man with the clipboard, why limit readings to once a day -- why not once a second? Why not monitor the systems in near real time (both the endpoint and the collector)?
The possibilities are endless and the challenges are many but the reality is that the convergence is happening and will drive a fundamental change in how we architect, operate, and maintain these systems.