Though we rarely think consciously about it, weather impacts every aspect of our lives. It impacts our mood, what we wear, where and when we shop, what we eat, when we sleep -- the list is endless ...
So why don't we use weather more in our analytics and insights services, to uncover the real relationships and determine how to impact sales, advertising, and more? Through the just-announced availability of a range of weather APIs in IBM's Bluemix, anyone can now develop apps and services that integrate weather. This is just the start of a continuing rollout of services and solutions that integrate weather -- predictive, current, and historical -- into our solutions.
Here's more info on the new announcement: https://console.ng.bluemix.net/catalog/services/weather-company-data-for-ibm-bluemix/
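To make the idea concrete, here's a minimal sketch of the kind of relationship these APIs make easy to explore: correlating daily weather with sales. The figures below are invented for illustration -- they are not real observations, and this is not actual Bluemix API output.

```python
# A minimal sketch of tying weather data to sales analytics.
# The daily figures below are illustrative, not real observations.

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical week of data: daily high (deg F) and iced-drink sales.
highs = [61, 68, 75, 83, 88, 79, 66]
sales = [120, 150, 180, 240, 270, 210, 140]

r = pearson(highs, sales)
print(round(r, 2))  # strongly positive for this made-up week
```

In a real service, the `highs` series would come from a weather API call and `sales` from the retailer's own records; the analytics step stays this simple.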
Yesterday, IBM and Cisco delivered a significant announcement around the Internet of Things. The announcement talked about Analytics at the Edge, but perhaps more important was that it set the direction for "Analytics Everywhere". When I look at the Internet of Things and the connectivity and information processing models it will entail, it is a combination of nearly every model we have seen throughout the history of IT. There will be use cases that require "a mothership" as the control point and master information source. There are scenarios where much of the processing can be handled "at the edge", with decisions and responses driven in isolation at the edge. There are scenarios that will require peer communications between "things" that are co-located or perhaps located across the globe from each other. There will be "cloud to cloud" information interchange requirements ... and security challenges at levels we have never imagined.
The "Internet of Things" is about business transformation, and the flexibility of being able to effectively and efficiently process information in whatever space is appropriate will create new opportunities. Yes, we have seen some of these scenarios "in prior lives", but we have never seen the ability to develop and deliver the capabilities at the level of cost effectiveness we can today, nor have we ever considered the level of instrumentation that we can now do. Sensors are inexpensive, and they enable a breadth of information that would have been unimaginable only a few years ago. Connectivity is nearly ubiquitous and inexpensive. Data storage costs are at levels where you can afford to keep data you aren't sure you will ever need, but might. And analytics capabilities are becoming simple to deliver and widely available -- with Watson Analytics being a great example of that evolution. #watsonIOT #IOT
I suspect many of you are scratching your heads today, saying "IBM bought the Weather Channel???". Well, first off, we bought "The Weather Company", which includes an amazing cloud-based data ingestion and analytics platform; a very successful B2B business, including a solution that makes all of my airline flights smoother by providing plane-to-plane awareness of bumpy air; a proven weather prediction capability with some of the highest accuracy in the world; and a B2C business whose smartphone app is among the most used apps in existence today.
And when we think of weather, it's much more than knowing it will be sunny tomorrow. It's detailed micro-forecasts that let us know the specific weather for a specific locale at any time of the day. It's the ability to be pre-warned of impending weather events like extreme winds, lightning, or hail storms. It's heat indexes, wind chill, dew points, pollen indexes, UV indexes, etc etc etc. All of these elements of weather, when coupled with the awareness that the Internet of Things will bring as everything becomes connected, create so many new opportunities.
Now let's think about all of this data in a world that has the Cognitive Insight capability that IBM Watson provides and will grow to provide ... the possibilities are truly endless.
As is always the case with CES, exciting announcements are made. Today a new standard for Wi-Fi was announced that should be available in 2018. Here's an article on it: Wi-Fi HaLow
The new standard focuses on improved range (up to 2x) and lower battery consumption. I have seen some articles focused on larger bandwidth hogs like file transfers, but as we look at IOT solutions, we're going to see a need for efficient connectivity and the exchange of small bursts of information. We could see apps that may have looked at BLE but can now leverage Wi-Fi HaLow instead.
Many have also positioned the technology more around the Smarter Home, but it would apply equally to a range of offerings as IOT drives widespread connectivity needs.
Lots of possibilities -- another great space to watch.
IBM last week reaffirmed its commitment to Watson and the value that Cognitive Computing will bring to our future. With IOT we're moving to an era of massive data, driven by orders-of-magnitude reductions in the cost of collecting and maintaining that data. But what do we have when we have lots of data? Well, the answer is simple -- "lots of data". With billions of devices potentially connected and interconnected, traditional approaches to leveraging operational data will become unmanageable. Why collect all of this data if we're not going to do something of value with it?
So what's the answer? Well, Cognitive Computing will become a key aspect of the value of IOT. Cognitive Computing will allow us to draw insights that would not have been possible in the past. We'll find the root cause of failures faster, allowing us to further prevent failures through operational insights provided by Cognitive techniques. We'll be able to turn "lots of data" into truly actionable insights, and provide them to technicians in a prescriptive form.
Think about how Watson has transformed the medical field. Medical personnel are now presented with a list of diagnoses, each with a probability, based on analysis of the symptoms. What prevents us from applying these same techniques to inanimate "things"? Why not leverage the wealth of data being collected from "things" to feed a Cognitive system that produces not only a list of potential failures with a confidence factor for each, but, more importantly, insights into maintenance -- where maintenance operations become individualized based on operational environments, potentially saving millions in unnecessary maintenance while also eliminating failures?
We're truly at an inflection point in the possibilities and opportunities to revolutionize our future.
As I am sure many of you have heard by now, IBM is making a major investment in the Internet of Things (IOT). On Thursday April 9th, IBM will have a web streamed event around our announcement and what it will mean to our customers and the industry. Here's a link to the live event: IBM IOT Event Link.
With its strong capabilities in streaming and historical analytics, its IOT Foundation for data collection, its breadth of database capabilities, and its Bluemix and Cloud solutions, IBM is well positioned as a major player in the Internet of Things. Couple that with the industry expertise that IBM has built over the years in areas such as Asset Management with Maximo, Smarter Cities, and Smarter Buildings. As real-time information becomes available, so many aspects of how our clients do their jobs will forever change.
And the change will start with engineering. Engineering and operations will digitally integrate. Rich visualizations, coupled with augmented reality capabilities, will improve safety, reduce maintenance, and enable workers to better perform their roles.
It's an exciting time, and I am so glad to be in the middle of the IOT revolution.
The non-tech media has finally caught wind of the Internet of Things (IOT) and is highlighting what happens when you don't secure your solution. They are showing cars that have their braking and steering systems controlled remotely. They are talking about home automation systems being hacked with lights and heating being remotely controlled by the hacker. They are talking about home appliances being turned on and off without the homeowner intervening. Just today USA Today featured an article on Congress wanting to hold special workgroups around the area of IOT security -- USA Today Article
So while this makes for great news stories, the need for security in any connected solution has always been critical. In this modern connected world, the ability to hack in and take control will happen unless we do something to prevent it. Any IOT solution requires security at the connection level to assure that the control "pipe" is not hijacked and used for unwanted purposes. IOT security is also needed at "the thing" to assure that it cannot be taken over and controlled by an undesired source, such as a hacker. A good IOT solution would also place bounds on what "the thing" is allowed to do, and take action when "things" are controlled outside of normal bounds. For example, who would preheat their oven for 8 hours? The "smart oven" would reset itself if preheated for too long.
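The "bounded behavior" idea above can be sketched in a few lines: a device-side guard that resets a preheat left running too long, regardless of what commands arrive over the network. The class name, the limit, and the firmware-loop structure here are all illustrative assumptions, not any real product's design.

```python
# Sketch of device-side bounds: the oven enforces its own safety limit
# even if a hijacked command leaves it preheating indefinitely.
# Names and limits are illustrative, not from any real product.

MAX_PREHEAT_MINUTES = 60

class SmartOven:
    def __init__(self):
        self.preheating = False
        self.preheat_minutes = 0

    def start_preheat(self):
        self.preheating = True
        self.preheat_minutes = 0

    def tick(self, minutes=1):
        """Called periodically by the device's firmware loop."""
        if self.preheating:
            self.preheat_minutes += minutes
            if self.preheat_minutes > MAX_PREHEAT_MINUTES:
                # Nobody preheats for 8 hours: fail safe and shut off.
                self.preheating = False

oven = SmartOven()
oven.start_preheat()
oven.tick(8 * 60)       # a hijacked session left it preheating for 8 hours
print(oven.preheating)  # prints False: the local guard has shut it off
```

The point is that the bound lives on "the thing" itself, so it holds even when the control channel is compromised.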
But there is also a new class of security needed for many of those "things". Location -- is the "thing" where you thought it was? Has it been moved, so that the stoplight you thought you were managing is no longer where you thought it was? It might sound far-fetched, but think of the implications.
The net is that we have to be concerned with security in any intelligent and connected solution. Whether it's being hacked by a USB device introducing malware, having the device hijacked over the network, or simply someone signing in with a weak password (perhaps the biggest concern), the impact is real, and we have to ensure that, whatever we build, security is a major focus of the solution.
Nothing like titling a blog entry with three TLAs (that's three three-letter acronyms). But when I look at technologies driving change, these two will drive a 180-degree change in how disciplines like Smarter Buildings and Enterprise Asset Management evolve.
Building Information Modeling (BIM) allows us to define, operate upon, and visualize "anything" ... and visualize it in context of everything around it. The Internet of Things (IOT) will allow us to collect real time operational information about everything ... and aggregate that information so that I understand the relationship between the "things" and derive patterns about how the things interoperate.
So now what do we have? By integrating the ability to accurately visualize, drill in, rotate, and look inside with the ability to understand how things are operating now and have operated in the past, and to predict how they will operate in the future -- without ever having to physically touch the "thing" -- we're enabling an infinite realm of possibilities to improve operations, improve health and safety, and forever change how we do our jobs ...
Exciting times to come ....
I attended the second meeting of the Raleigh Internet of Things group yesterday. It was amazing to see how many people from so many disciplines were interested in meeting for an exchange session on the Internet of Things. The excitement about, and the possibilities of, what IOT will mean to all of our lives is huge.
Yesterday's meeting had presentations from 4 different presenters representing both large and small companies. Over 100 people attended. What is clear is that IOT will be an entrepreneur's paradise. Everything will become connected and interactions between things that we never have imagined would occur WILL occur.
One example from yesterday's meeting was from Big Belly Solar. The solution is a connected trashcan that can report when it needs to be emptied. Connectivity is provided by a 3G connection, power is supplied by solar, and the ROI for a trashcan that costs 5x what a traditional can costs has been shown to be less than a year. The savings come from personnel and truck-roll cost reductions, and a side effect is immediate awareness of an "out of the normal" usage pattern, such as an overflowing trashcan needing attention.
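A back-of-envelope calculation shows how a sub-year payback on a 5x-cost can is plausible. All of the dollar figures and pickup frequencies below are my own illustrative assumptions, not Big Belly's actual numbers.

```python
# Back-of-envelope payback for a connected trashcan, using
# illustrative numbers (not Big Belly's actual figures).

traditional_cost = 900.0                           # assumed cost of a standard can
connected_cost = 5 * traditional_cost              # the "5x" premium product
extra_capital = connected_cost - traditional_cost  # premium to recover

# Assume pickups drop from daily to on-demand (~2x/week),
# saving ~5 truck visits per week at an assumed $20 per visit.
weekly_savings = 5 * 20.0

payback_weeks = extra_capital / weekly_savings
print(payback_weeks)  # 36.0 weeks under these assumptions, i.e. under a year
```

The savings scale with route density and labor cost, which is why the connected can pays off fastest in busy urban deployments.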
Think of the possibilities -- so much fun to come ...
As we look through the history of user interaction with a computing device, we have seen numerous transitions in how we interact. There have been attempts at many alternative approaches -- many of which never made it, and some which made it and then quickly disappeared. Many of you won't remember the "light pen", for example, which was used to select data on a screen by touching the screen with a specialized pen. The mouse was by far one of the greatest interaction inventions, but even it was replaced by the touchpad, and by gestures as a means to replace specific clicks and keystrokes. The touchscreen will eliminate those across all devices before we know it.
The mouse and the touchpad are indeed interaction models that I expect to see disappear over the next few years. The movement to touchscreen and hybrid devices, and the need to create visualizations that can be manipulated by touch, will end the era of the mouse. We'll move to touchscreen devices and eye-movement recognition for many of our applications.
So what does that mean to us? It means that we have to develop our applications with the reality that a finger may be the "selection tool" of choice. Gestures will move us from place to place. As we know, that means larger "targets", more space and larger fonts in our text selection areas, and intuitive logic for our gestures (and of course, lots of cloths to clean those screens).
The rumors are that Apple will release an 11-inch iPad .... watch as this size device or larger becomes the "desktop" of choice .... One thing is for sure about the space we're working in .... you can never get bored .... and if you do, you will be irrelevant before you know it.
No, this isn't an answer from Jeopardy ... the recent announcement between IBM and Apple is a key indicator of where the enterprise world is headed. We're moving from fixed-location desktop systems to mobile systems. When we asked one of our transportation customers what they saw as their preferred device of the future, they indicated that tablets (aka iPads) will be the norm. But why? This is where the Internet of Things ties into the picture.
Much of the operational information we collect today (when we do collect it) is manual, delayed, or disconnected from the management system. As we move to an "everything is connected" world, we'll have the data to drive analytics that recommend and push actionable information to these devices, so that actions can be taken before problems ever happen. We'll have the ability to aggregate information from a range of systems, easily and intuitively, and make that information available to the appropriate person in near real time.
So Apple's devices, coupled with IBM's enterprise and Big Data skills and the Internet of Things' emerging ability to collect data from "any thing", make a great combination ....
There was a great article in Forbes recently talking about the Internet of Things and the impact it will have on retailers. With solutions like IRIS from Lowes, connectivity to a wide range of "things" in the home is already here. While starting around energy, with thermostats and lighting control, they are expanding their capabilities. Just recently, Orbit announced an IRIS-enabled (Z-Wave) water timer, for example. Lowes also introduced its MyLowes card a year or so ago. While the public messaging was around helping you track your returns, the reality is that Lowes knows what you bought and when you bought it.
Think about the possibilities of combining the knowledge of what you purchased with the ability to connect and communicate. Lowes would be able to set the watering timer daily based on the type of plants you bought, the weather conditions, and the locale of your home. They might even add a moisture sensor to the plant to further complement the solution.
Look at all of the systems that can gather knowledge, interact, and be accessed by a human-driven smartphone today -- why not by in-depth analytics and recommendation engines tomorrow?
The real value in the connectivity and the knowledge will come from combining that knowledge. While everyone is focused on Nest and what Google is doing, Lowes is on a path towards a much deeper, real knowledge of your home and everything about it, while adding real value for the consumer.
We're so early in the journey still. Many more players will emerge, and new technologies will be spawned, but the potential is endless.
I had the opportunity to speak at the 59th annual European Union of Electrical Wholesalers conference in Copenhagen, Denmark. I was there to talk about Smarter Buildings and to point out that a Smarter Building is determined by much more than just direct energy consumption. Overall operational efficiency, effective lease management, effective space management, project management, and maintenance procedures can all impact the "indirect" energy cost of the building. The message rang well with the audience, and they appeared to agree with my thoughts.
But perhaps what was more intriguing was the parallel changes that need to occur in their industry as are occurring in the IT industry. They are moving from a model of "sell a component" to a model where they need to establish and maintain a long term relationship with the client through value-add services. The "things" themselves will evolve substantially as they take on intelligence, become connected, and become "alive" through new functionality provided through software/firmware updates to the object whether it be something as simple as a light, or as complex as an HVAC system.
Gartner says "The Internet of Things ('IoT') is expected to grow to 26 billion things by the year 2020, representing an almost 30x increase over the 900 million things connected in 2009." While many of these "things" will be consumer devices, a large number will be evolving building elements. The ability to connect and dynamically change has the potential to totally change how wiring is done, for example, with the introduction of wireless, network-connected switches instead of the traditional hardwired approaches used today.
The parallel to IT comes in the movement to continuing services and adaptation of systems. There will be considerable opportunity to manage, update, and improve efficiency through the acquisition of data from these "things", applying domain specific analytics, and then driving change dynamically. Truly intelligent -- and somewhat living "things" ... for IT, this means continuous delivery and Software as a Service -- for "thing" developers, this means making "things" have the ability to "become smart" and then driving continual change into those "things" as part of an evolving business model.
I was googling today to see how different groups defined the Internet of Things (IOT). It was interesting how all of the definitions seemed to miss what the Internet of Things really will be and what it will mean to all of us. The focus seemed to be more around connectivity than around the impact that IOT will have.
I thought of an analogy that really seems to drive home the potential impact on all of us. Remember back in the "olden days" when we would mail a letter, mail an order, deliver an invoice, or write a memo for an admin to deliver to a coworker? How long was the turnaround time for a simple communication -- days, or sometimes weeks?
Then along came email. We can now exchange information in seconds and get responses instantly. Complete information exchanges that previously took an extended period of time now happen in an instant.
So where are we with "things" today? We're in the pre-email phase. We either don't collect information or we collect it with visual readings of meters, documented on paper, and placed into a binder on the cabinet shelf. Analytics are non-existent or historical in nature.
Think of what will happen when "things" are connected and can send information at whatever interval they need to. Think about what can happen when that data can be immediately analyzed to determine operational efficiency and to drive actions that resolve any issues -- all in near real time. We've talked about refrigerators and washing machines, but to me those make really neat demos; they don't drive the real value we'll see as "everything" becomes connected and can send data that can be aggregated to determine trends and root causes in ways that were never before possible.
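The "analyze immediately, act immediately" idea can be sketched with a simple rolling-average check that flags a reading far outside recent history the moment it arrives. The window size, tolerance, and readings below are made up for illustration.

```python
# Sketch of turning streamed "thing" data into an immediate action:
# a rolling-average check that flags a reading far outside the norm.
# Window, tolerance, and readings are illustrative assumptions.

from collections import deque

class AnomalyDetector:
    def __init__(self, window=5, tolerance=0.5):
        self.history = deque(maxlen=window)
        self.tolerance = tolerance  # allowed fractional deviation

    def observe(self, value):
        """Return True if a reading deviates sharply from recent history."""
        if len(self.history) == self.history.maxlen:
            avg = sum(self.history) / len(self.history)
            if abs(value - avg) > self.tolerance * avg:
                return True  # out of bounds: alert now, not next month
        self.history.append(value)
        return False

detector = AnomalyDetector()
readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 45.0]  # last one is a fault
alerts = [detector.observe(r) for r in readings]
print(alerts)  # only the final reading trips the alert
```

Contrast this with the clipboard-and-binder world: the same fault might sit unread on a shelf for months.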
Also recognize the impact that this electronic communication has had on traditional mail delivery and on industries that were targeted at "paper data collection". We'll likely see new industries emerging, and expectations around "instantaneous information exchange" as the norm.
The transition to IOT will happen quickly, and the impact will be huge. And while the analogy to email is a strong one on the positive side, we'll also see many of the issues and concerns we see (and hate) with email as well. Security will be critical, both in assuring that the device we're talking to is the device we think we're talking to, and in assuring we're not overwhelmed with "unused" data. To be successful, we must assure that data is turned into "actionable information".
Is it all in a name? -- while IBM has traditionally referred to "it" as Smarter Infrastructure or Smarter Planet, the industry has spoken and "the Internet of Things" has emerged. The Internet of Things is really a great descriptor of the revolution that is happening. Things are connecting. Things are becoming intelligent. Things are interacting. Things are ... or will be ... everywhere and doing everything ...
But what are "things" -- the interesting reality is that we only know about a small percentage of "things" today. Many of the "things" we'll manage don't even exist today. Many of the "things" we'll manage will be evolutions of things we have today.
We've talked about internet-connected refrigerators and internet-connected washing machines for years now ... so why hasn't every one of these appliances become mainstream? Why aren't people lined up at the store to get the latest, greatest connected device? It all comes down to perceived value for the consumer and business cases for the provider. Is it worth $10 a month to know your refrigerator will fail in 12 years? Probably not ... Do we need a washing machine that goes to the internet to find out what the settings should be for a specific fabric? Probably not, when a consumer today never even uses the additional settings on the machine they have.
So why is everyone so excited? The real value of IOT isn't going to be a single appliance or a single solution. It's going to be in the interaction of systems and the analytics that can process cross-discipline streams of information to produce insights that improve our lifestyles, our safety, our efficiency, etc etc etc.
We're starting to see the Internet of Things talked about daily, it seems. Now we need to find the real use cases that drive the value needed to make it real. IOT will happen -- and is happening ... we're 2 inches into the 100-mile journey ... it's going to be a great ride.
Building Information Modeling (BIM) is changing how "things" are designed. We're moving away from 2D real and digital paper to 3D designs that we can look at, change, and "virtually construct". It's unfortunate that BIM is named for "building", because so many people think of it as only for "buildings", but it really applies to any "thing" we want to construct. BIM can be used to represent the finished building, but can equally be used to decompose a complex asset like a pump to virtually "look inside".
Even Wikipedia errs on the side of "buildings" with its definition of BIM saying:
Building Information Modeling (BIM) is a process involving the generation and management of digital representations of physical and functional characteristics of places. Building Information Models (BIMs) are files (often but not always in proprietary formats and containing proprietary data) which can be exchanged or networked to support decision-making about a place. Current BIM software is used by individuals, businesses and government authorities who plan, design, construct, operate and maintain diverse physical infrastructures, from water, wastewater, electricity, gas, refuse and communication utilities to roads, bridges and ports, from houses, apartments, schools and shops to offices, factories, warehouses and prisons, etc.
With the ability to digitally represent assets and even their subcomponents, the process of "handover" or "commissioning" is now able to change from one of complex manual (and error prone) tasks to one of automated integration of the design and operations phases. In fact, if we approached this the way it should be, we'd actually integrate the assets into the asset management system as they come online, and at commissioning the operator would have full knowledge of the asset history and maintenance. We'd leverage the BIM models created to construct the building, maintain them during operations, and integrate them into the operations process.
We've enabled the import of the BIM model into Maximo and the ability to incrementally update the information as changes occur. It's now possible to leverage the BIM model throughout the lifecycle of the asset. We can visualize the asset in 3D, drill into it, rotate it, and look "inside" -- all through the power of BIM integration.
I found the CES announcements from GM and Audi that they are going to provide LTE connectivity in their cars very interesting. If you look back 15 years, GM provided its OnStar service leveraging analog cell service that was "embedded" in the car. Some of you may remember, a few years ago, when the cellular carriers decided to sunset their analog offerings, and GM (and more importantly its customers) were stuck with mobile connections that no longer functioned. Given the lifecycle of a car versus the rapid evolution of mobile services, speeds, technologies, and protocols, why won't this "technology sunsetting" issue repeat? How will consumers have flexibility in pricing and in carrier selection if the auto manufacturer controls that selection?
The model of tethered connectivity leveraging one's personal device connected to the car via Bluetooth has shown rapid success and acceptance. It provides the ability for consumers to have choice in providers, plans, and capabilities. What it doesn't provide is "lock in" to a specific carrier for the car manufacturer or carrier ...
It will be interesting to see how this all plays out .... my bet is that a lock in model will fail -- only time will tell ...
As we look at a Smarter Infrastructure, we first see an abundance of new information sources, some providing huge volumes of data. While many of us will do with that data what we did with our "clipboard data" -- simply place it into our "filing cabinet" -- others will realize the importance of the data as we apply analytics and turn "data" into "actionable information". Analytics is a very overloaded term, associated with everything from basic KPIs to complex models. The reality is that any analytics, when applied appropriately, can make a huge impact on how efficiently and effectively we operate, and can generate huge potential savings in operational costs and personnel time.
IBM today unveiled a new analytics offering aimed at improving operational efficiency of complex assets. The solution, which can be tied to an asset management system like Maximo to maintain asset history, and drive asset associated operations, is a great step forward in the application of more complex analytics.
A colleague of mine wrote a great blog entry that helps one understand the role of analytics -- in terms many of us can understand -- here's a pointer to that blog:
I recently did a presentation at our Pulse Conference on Smarter Infrastructure -- what is it and how it will impact us.
The world is fast becoming smarter -- sensors everywhere, huge masses of new data sources, the need for rapid yet informed decision making, optimized processes -- the list goes on and on. How will we make this happen? How will we maximize our return? How will we turn these masses of new data into actionable information? Regardless of the industry you are in, and regardless of whether you are in the public or private sector, the evolution to a Smarter Infrastructure is real and will impact all aspects of our lives.
Here's a link to the video:
On Wednesday of this week, I visited my local Sam's Club and purchased a tub of spinach, as I do every week, it seems. But this week was unique. When reading my email on Thursday evening, I received an email from Sam's Club saying:
Taylor Farms Pacific Inc – 1 lb Spinach Recall
Dear Sam's Club Member:
Today, we were notified that Taylor Farms has initiated a recall of its 1 lb Spinach product due to the potential presence of E. coli. Taylor Farms has asked us to recall any of this product with a “Best if Used By” of 02-24-13.
Our records reflect that you may have purchased the 1 lb Spinach product with a UPC number of 0003022304780 and a “Best if Use By” of 02-24-13.
What could have been a widespread incident instead became a well-organized, targeted recall. Sam's had not only tracked its supply chain, but had also associated the supply with the consumption. For all of you privacy folks: yes, Sam's did know that I purchased spinach and wine on Wednesday, but it also was able to notify me of a potential health threat immediately. Yet another example of what happens as we enable our infrastructure components to communicate.
Today is a great day for the datacenter. IBM and Emerson have announced a partnership combining IBM's IT Service Management (ITSM) with Emerson's Trellis offering, recently recognized as the industry leader in Data Center Infrastructure Management (DCIM). Gartner has said that the DCIM market is an estimated $450 million market today, expected to grow to $1.7 billion by 2016.
But why all the excitement about this announcement? Anyone that has been in this industry recognizes that the datacenter has often been operated as a series of seemingly disconnected silos. One team manages power distribution, another manages the cooling infrastructure, another manages the physical placement of machines into racks, etc etc etc. When an operational problem occurs, we fall back into that siloed mentality with "twelve people on a bridge call" trying to determine where the problem actually originated. There is little automated "root cause" analysis and even less automated action across the silos. I recently heard of a major customer who had a chiller issue on a Sunday afternoon -- the IT team discovered the issue when the applications began to fail because the servers were failing from overheating.
Why weren't the systems connected? Why didn't a chiller failure signal the IT team and indicate which racks would be impacted and perhaps automate an action to move the workloads or throttle down the servers until the chiller issue was resolved? The answer unfortunately is simply because the operations of the systems were not connected.
With the IBM-Emerson partnership, we've established a base system for interlocking power management, cooling management, and traditional IT management. We're now providing a system that enables automated awareness of each slot in a rack: What is in the rack? What is its power draw? What applications are running in that slot? We're now getting information that can be integrated and leveraged to turn the datacenter into a "smarter datacenter".
Lots of great possibilities ...
For some time I have been talking about the need for, and value of, introducing IT technologies into the operations space. The reality is that the operational space is becoming more IT-like in its technologies and implementations. Mechanical devices are being replaced by, or augmented with, intelligent processors and sensors. They are being connected (either physically or logically) to networks. These formerly isolated mechanical systems are now becoming connected "IT" endpoints. They are now more susceptible to network attacks. They are now software devices, and will need software updates -- whether for maintenance or for new capabilities. And with the man with the clipboard replaced, why limit readings to once a day -- why not once a second? Why not monitor the systems in near real time (both the endpoint and the collector)?
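A once-a-second reading loop is trivial to sketch. This is illustrative only: `read_sensor` stands in for whatever protocol the real endpoint speaks (SNMP, Modbus, a REST call), and the values it returns are made up.

```python
import random
import time

def read_sensor() -> float:
    # Stand-in for a real device read; returns a fake temperature in Celsius.
    return 20.0 + random.random()

def poll(interval_seconds: float, samples: int) -> list:
    """Collect `samples` readings, one every `interval_seconds` seconds."""
    readings = []
    for _ in range(samples):
        readings.append(read_sensor())
        time.sleep(interval_seconds)
    return readings

# Three quick readings -- in practice you'd stream these to a collector
# continuously rather than returning a list.
print(poll(0.01, 3))
```

Compared to a daily clipboard reading, the only real costs of this model are network traffic and storage -- both of which, as noted above, are now cheap.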
The possibilities are endless and the challenges are many but the reality is that the convergence is happening and will drive a fundamental change in how we architect, operate, and maintain these systems.
As I sit in Sao Paulo, Brazil and look across the city, all I see is a maze of buildings stretching as far as the eye can see. Are these buildings "Smarter"? My wager is that most (all?) of them are not. There are many ways to make a building smarter. Most articles I have read over the last few years have focused on "energy consumption". While energy consumption is an important lever for reducing the operational cost of a building, there are many additional ways to reduce costs while improving tenant satisfaction.
Through the acquisition of Tririga, IBM entered the Integrated Workplace Management System (IWMS) marketspace. IWMS is an industry term for the broader management of all aspects of the building, from financials and portfolio management to basic operations such as office key management. Now we can manage our portfolio and understand how to improve it. Is a lease about to expire, forcing a decision on whether to renew? Are we managing office space as efficiently as possible? And the building energy management solution that has shown significant success at Tulane University is now part of that same solution family.
So while energy management is definitely important, it's not the only way to make your buildings, and building management, smarter -- and Smarter means better --
I had a great visit to Mesa Verde, Colorado last week. I saw firsthand an evolution of Smarter Buildings, and how over 800 years ago people were constructing buildings that were energy efficient (OK, efficient for the materials they had). The buildings were initially constructed above ground, but over the period of the development they evolved into buildings that leveraged the Earth's consistent ground temperature, dug into the ground at first only a foot or so but later up to 4 feet. The advantage was that the consistent ground temperature provided some level of consistency for the overall building temperature. The buildings also leveraged "free air cooling" through a hole dug adjacent to the building that tunneled to the lowest level of the pit, providing a fresh air source for the building.
While many believe the Anasazi lived for centuries in cave dwellings, they only lived there for a few decades -- after the "pit houses". But once again, they leveraged their surroundings, now combining the Earth's temperature with water sources and rock overhangs as the basis of the buildings -- as well as the protection that a cave dwelling offers from an access perspective.
Oh well -- Enough of my rambling for now ...
Over the past couple of months, I have had the opportunity to talk with customers throughout the world. The common theme I see is how little we have really done to make our buildings truly smarter. Whether it's lack of awareness or simply lack of caring, we're wasting enormous amounts of energy, and manpower, as we continue to manage buildings the same way we have for decades. We have so much more capability that we can now take advantage of, yet in many cases, our capabilities go untapped.
Today was the beginning of a new era for IBM -- we've been working with our industry partners to improve the energy and operational efficiency of buildings, and today we announced the availability of a bundled software solution that allows us to "listen to the building, and hear what it is telling us". From there we use our analytics to predict problems before they occur, or recognize problems as they occur, while providing a mashup-based dashboard to visualize the state of the monitored buildings.
Here's some press from the announcement:
IBM Unleashes Advanced Software Solution for Smarter Buildings

IBM formally introduces its Intelligent Building Management software today -- an advanced solution that's being put to work at Tulane University's School of Architecture, The Cloisters of the Metropolitan Museum of Art in New York, and the company's 35-building facility in Minnesota.

The software is designed to be an analytics and automation powerhouse that can help ramp up the environmental performance of any building, even ones that are 100 years old or more. The product is the latest in a steady stream of solutions that IBM has unleashed in recent months to make the management of buildings, the energy and resources they use, and the transportation and virtual networks that connect them more efficient and more effective.

The software and its applications, which are being detailed today in an IBM Smarter Buildings Forum in New York, also are the result of the company's steadily increasing collaborative projects, partnerships and acquisitions -- all of which are aimed at positioning IBM as a dominant player in a nascent field that brings together IT, the built environment, and vehicles.

GreenerBuildings.com Executive Editor Rob Watson is the kickoff speaker at the forum today and GreenBiz.com Senior Writer Adam Aston will provide on-scene coverage of the event.

Smarter buildings can help owners and operators cut energy use by as much as 40 percent and reduce maintenance costs by 10 to 30 percent, according to IBM. Here is an early look at the projects that will be featured during the forum:

- Tulane University in New Orleans is using the software to transform the historic century-old Richardson Memorial Hall, home of the Tulane School of Architecture, into what IBM is calling a "smarter-building living laboratory." Johnson Controls is a partner in the project.

- The Cloisters, the branch of the Metropolitan Museum that houses 3,000 works of European medieval art, is using IBM software and its wireless environmental sensor network, called the Lower-Power Mote, to preserve the collection.

- IBM's facility in Rochester, Minn., is realizing further energy savings using the Intelligent Building Management system. The complex of 35 interconnected buildings that make up the 3.2-million-square-foot manufacturing and development facility has undergone several waves of efficiency improvements since the site opened in 1956 with a half million square feet of workspace. The company said it's achieving further year-over-year incremental energy savings as well as savings in equipment operating costs with the new software.

While technology advancements in building management systems have made it possible to cull an immense amount of data on structures, the challenge has been to organize, analyze and present it swiftly to building owners and operators so they can proactively manage their properties -- as IBM Smarter Buildings Vice President David Bartlett said at GreenBiz Group's State of Green Business Forum this year. The new software, which is supposed to be the most comprehensive product thus far in IBM's smarter buildings arsenal, is intended to address that need.

Earlier this week, IBM introduced its Intelligent Operations Center for Smarter Cities. The plug-and-play, smarter-cities-in-a-box solution is expected to deliver high-powered systems and network management capabilities to communities without the high price tag usually attached to such technology.
As I speak on Smart Grid panels and work with members of our utility companies, I have been somewhat amazed at where the utilities are drawing their line of demarcation. The meter is the end of the world from the utilities' perspective, and they appear to have no desire to look beyond the meter -- into the home or commercial building -- to better manage power and to understand more about the specific power usage of the consumer. There are emerging companies providing intelligence on the consumer side of the meter, but the utilities continue to stand clear of that space. Why? The reasons are many, from security concerns around collecting and managing that level of data to the fact that many utilities are still regulated and therefore have little real incentive to pursue more intelligence at the endpoint.
So what does this mean to the utility? I love parallels and I have to draw a parallel between the smart grid boundaries and what I have seen with my Internet Service Provider. Many years ago, the ISP was one's first point of access whenever one connected to the internet. The ISP provided email, and was the link to all services one wanted on the net. But that is no longer true. I personally never access my ISP's web portal, instead preferring other portals such as Google as my link to the web. My ISP is still there, but they now have no ability to obtain additional value from my connection through them because they have become simply a pipeline to the internet.
It appears the utilities are setting themselves up to become simply a "power provider" with minimal additional value to the consumer. For commercial buildings, solutions such as IBM's Intelligent Building Manager will be well positioned to talk to the "Smart Grid" while providing the intelligence needed to truly make a difference for the consumer. The utilities will continue to drive smart grid, but primarily for the producer side and the benefits that the utilities will see as they modernize the power distribution network. There is immeasurable value in that for the producers, and for consumers in the form of better availability, but much of the hype around the smart grid enabling two-way interaction with consumers is just that -- hype ...
As I watch the Jeopardy match between IBM's Watson and some of Jeopardy's smartest, I am quite impressed with the breadth of contextual data munging (that's a technical term) that Watson is able to accomplish, and the confidence factor for the results. The ability to draw together seemingly unrelated pieces of information, and determine the context and nuances around those contexts, is quite impressive, and sets the stage for what we can expect as we look into our crystal ball for the future.
If one draws parallels between what is being accomplished on Jeopardy and the future of our Smarter Planet initiatives, the role of analytics in pulling together seemingly disparate data and determining correlations and recommendations from it will provide a significant level of intelligence around which to make ongoing improvements, or even to drive the development of new technologies -
Now if we could only tell Watson that Toronto is not a US City :-)
How quickly things can change -- a year ago it looked like the US would follow Europe and we'd be legislated into a cap-and-trade economy. There was considerable talk about the implications this would have on prices, etc ... What a difference a year makes ... the change in Washington was followed almost immediately by the collapse of the Chicago Carbon Exchange, an entity in which Al Gore has invested. So what is the future of cap and trade? Short-term death is a near certainty, but longer term, only time will tell.
The concept of cap and trade is interesting, to say the least. The idea is that those who are inefficient in their processes can get credits from those who have found ways to be more efficient -- carbon credits become a currency of sorts -- and the hope is that the net worldwide result is an improvement in overall emissions and environmental impact.
So let's see what comes next, and see how the US reacts as other countries continue on their legislative approach to cap and tax -- I mean cap and trade.
As I look at many of today's buildings, I am somewhat amazed at how "unsmart" many (most??) of them are. Lights glow bright next to south-facing windows on a sunny afternoon -- heat and air conditioning run simultaneously in the same room -- fans blow and blow and blow for no apparent reason -- and that's just to name a few of the many energy-wasting items I have seen in just the last week.
Why would any building owner allow that to happen? There are many reasons, but two key ones are a lack of any awareness of how much energy is being consumed that need not be, and, in too many cases, a lack of any responsibility for the energy bill -- and therefore no direct interest in reducing that cost.
But what is changing? We're seeing an interest in instrumenting buildings -- collecting information from seemingly disparate systems, bringing that data together into a consolidated form, leveraging that data to take actions, and using analytics to predict future trends. It's not rocket science that is needed -- it's basic blocking and tackling -- and the financial return, and perhaps more importantly the ecological return, is something we all should start to care about. Money spent on energy is an expense to a company or individual ... that same money could be used to expand the company and create new jobs by improving the bottom line ...
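The "bringing the data together" step above can be sketched in a few lines: readings from separate building systems are pivoted into one consolidated record per zone, which analytics can then work against. The system and zone names here are made up for illustration.

```python
def consolidate(feeds: dict) -> dict:
    """Pivot per-system feeds {system: {zone: value}} into per-zone records."""
    zones = {}
    for system, readings in feeds.items():
        for zone, value in readings.items():
            zones.setdefault(zone, {})[system] = value
    return zones

# Two disparate systems reporting independently, keyed by building zone.
feeds = {
    "hvac_temp_c": {"lobby": 21.0, "floor2": 24.5},
    "lighting_kw": {"lobby": 3.2, "floor2": 5.8},
}
print(consolidate(feeds))
# Each zone now carries readings from every system, side by side.
```

Once the data sits in one record per zone, spotting waste (heat and lights both running hard in the same room) becomes a simple comparison rather than a walk through the building.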
As I watch more attention being focused on the "green datacenter", I was amazed that I had not yet seen someone talk about a "Smarter Datacenter". A smarter datacenter would reflect the wide range of improvements one can make within the datacenter, whether improved processes, more efficient equipment, facilities improvements, or virtualization. While many of these areas are not even mentioned under energy management solutions, they are all part of making a datacenter smarter, and a "Smarter Datacenter is a Greener Datacenter".
So as we look at quantifying the energy impact of datacenter improvements, it isn't just about more efficient servers or improved cooling -- it's an aggregation of all we do as we work to improve datacenter efficiency, and thus reduce our overall energy impact.
Last week I was on vacation in Orlando. The place where I stayed was highlighting all of the "greening" they had done. CFL bulbs everywhere -- now sponsoring recycling (even though we had to drive half a mile to recycle 3 aluminum cans). Wow -- I should have felt so good about this place ...
Well -- SHOULD is the right word ... as nighttime came, I looked at my entrance door, and on all sides I could see light from the outside -- not just a little light either -- so as my air conditioner ran and ran, cooling the Florida landscape, I had to ask, "is this place really serious about Green?" So while it's great to talk about green, it's time to be serious about BEING green instead ...
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately this is exactly what it sounds like -- they walk the aisle with the back of a hand extended, and when they feel a warmer-than-normal area, that is an area to be looked at further. Not exactly scientific, but it has worked for years. Likewise, power consumption was pre-determined from manufacturers' specs, which generally means it was grossly overestimated.
But as we look at better optimizing our overall energy consumption, even a degree or two difference can make a big difference in our overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within their server family for both power consumption, as well as temperature reporting. With the direct reporting of this information, immediate and accurate information can be available and leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Typically customers have over-cooled and over-powered. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
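A minimal sketch of the delta-detection idea: with frequent, accurate readings, a one-degree drift is caught as it develops, so cooling setpoints need less safety margin. The baseline, threshold, and readings below are invented for illustration.

```python
def detect_drift(readings: list, baseline: float, delta: float) -> bool:
    """True if any reading drifts more than `delta` degrees from baseline."""
    return any(abs(r - baseline) > delta for r in readings)

# Coarse sampling (the "back of the hand" world): by the time the drift is
# noticed it is already large, so operators over-cool as insurance.
hourly_check = [22.0, 25.1]

# Fine-grained sampling from direct server measurement: the same drift is
# caught while it is still barely above one degree.
per_second_feed = [22.0, 22.4, 23.1]

print(detect_drift(per_second_feed, baseline=22.0, delta=1.0))  # True
```

The smaller the detectable delta, the smaller the buffer a datacenter has to carry "just in case" -- which is exactly where the energy savings come from.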
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information from its embedded Active Energy Manager component. The data can be thresholded with events generated automatically when measured values exceed expected values. Reports can be generated or the information can be visualized in an operations console.
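The thresholding pattern described here can be sketched simply: compare measured values against expected values, and emit an event for each violation. This is a hedged illustration of the pattern only -- in ITM the thresholds and events are configured in the product, not hand-coded, and the event shape below is invented.

```python
def threshold_events(measurements: dict, limits: dict) -> list:
    """Emit one event per metric whose measured value exceeds its limit."""
    events = []
    for metric, value in measurements.items():
        limit = limits.get(metric)
        if limit is not None and value > limit:
            events.append({"metric": metric, "value": value, "limit": limit})
    return events

events = threshold_events(
    {"power_watts": 450.0, "inlet_temp_c": 21.5},   # measured from the server
    {"power_watts": 400.0, "inlet_temp_c": 27.0},   # expected/allowed values
)
print(events)  # one event: power_watts exceeded its expected value
```

An events list like this is what feeds the operations console and reports the post describes: the data is collected once, then thresholded, visualized, and reported from the same feed.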
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not represent well the needs for improved energy management. Unfortunately, in most datacenters, the team responsible for cooling and power, and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of the power bill.
As a result, there is no natural incentive to reduce overall power consumption, unless some external factor like availability of power comes into play. This "green organizational dysfunctionality" results in wasted spending on energy and operational inefficiencies, given that there is also limited integration between the multiple organizations responsible for the datacenter. Even when knowledge does exist within the IT organization with regard to power consumption, I have yet to see a datacenter where the IT team is measured in any way on power consumption -- instead, availability and performance are the two measurements that matter.
So how can we expect energy-efficient datacenters if organizationally there is little focus, and no incentives are provided to reduce spending on energy? That's the challenge that organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and an occasional "incentive" from the C-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions -- sometimes approaching 40% -- have been seen.