The Internet of Things - Rambling Thoughts by Jim Fletcher
Last week I was on vacation in Orlando. The place where I stayed was highlighting all of the "greening" they had done. CFL bulbs everywhere -- now sponsoring recycling (even though we had to drive half a mile to recycle three aluminum cans). Wow -- I should have felt so good about this place ...
Well -- SHOULD is the right word ... as nighttime came, I looked at my entrance door, and on all sides I could see light from the outside -- not just a little light either. So as my air conditioner ran and ran, cooling the Florida landscape, I had to ask, "Is this place really serious about green?" While it's great to talk about green, it's time to be serious about BEING green instead ...
Well, despite months of desire and a total lack of spare time, I finally became an official blogger today. Thanks to Jeff Jenkins for his help in getting this going.
Over the last two years, as I have pioneered the energy management space for Tivoli, I have seen leading organizations begin to recognize that the historical organizational structure around datacenters does not serve the needs of improved energy management well. Unfortunately, in most datacenters, the team responsible for cooling and power and the team responsible for IT (servers, applications, storage, etc.) report into different lines of the business. Even more unfortunately from an energy management perspective, neither organization is responsible for paying the power bill, and in most organizations, neither team is even aware of the power bill.
As a result, there is no natural incentive to reduce overall power consumption, unless some external factor like availability of power comes into play. This "green organizational dysfunctionality" results in wasted spending on energy and operational inefficiencies, given that there is also limited integration between the multiple organizations responsible for the datacenter. Even when knowledge about power consumption does exist within the IT organization, I have yet to see a datacenter where the IT team is measured in any way on power consumption -- instead, availability and performance are the two measurements that matter.
So how can we expect energy efficient datacenters if organizationally there is little focus, and no incentives are provided to reduce spending on energy? That's the challenge that organizations need to address. I am seeing an emergence of limited discussions between these multiple teams, and an occasional "incentive" from a C-level exec to begin looking at how to reduce energy costs, but only occasionally. Instead, most energy reduction today is coming from "tangential" changes such as virtualization.
For those customers who have focused on the entire energy consumption lifecycle, significant cost reductions -- sometimes approaching 40% -- have been achieved.
Many customers continue to measure the temperature of their server racks using the "back of the hand" method. Unfortunately this is exactly what it says -- they walk the aisle with the back of their hand extended, and when they feel a warmer than normal area, that is an area to be looked at further. Well, yes, that's not exactly scientific, but it has worked for years. Likewise, power consumption was pre-determined from manufacturers' specs, which generally means it was grossly overstated.
But as we look at better optimizing our overall energy consumption, even a degree or two difference can make a big difference in our overall energy efficiency. The "back of the hand" method cannot provide that level of accuracy, so newer methods need to be implemented. Over the last few years, IBM has introduced direct measurement within their server family for both power consumption, as well as temperature reporting. With the direct reporting of this information, immediate and accurate information can be available and leveraged.
With the availability of more accurate information in a timely manner, datacenters can reduce their "energy buffer". Typically, customers have over-cooled and over-powered. With the ability to detect even small deltas quickly and accurately, these buffers can be reduced, and therefore overall energy consumption can be reduced.
But how does one get access to this information? Tivoli's ITM for Energy Management collects this server information from its embedded Active Energy Manager component. The data can be thresholded with events generated automatically when measured values exceed expected values. Reports can be generated or the information can be visualized in an operations console.
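The thresholding idea is straightforward to picture. Here's a minimal sketch in Python of comparing measured power and temperature values against expected limits and emitting events on exceedance -- the metric names, thresholds, and function names are invented for illustration and are not the actual Tivoli ITM or Active Energy Manager interfaces:

```python
# Illustrative sketch of threshold-based event generation for server
# power and temperature readings. All names and limits are hypothetical,
# not the real Tivoli / Active Energy Manager APIs.

THRESHOLDS = {"power_watts": 450.0, "inlet_temp_c": 27.0}

def check_reading(server, metric, value):
    """Return an event dict when a measured value exceeds its threshold."""
    limit = THRESHOLDS.get(metric)
    if limit is not None and value > limit:
        return {"server": server, "metric": metric,
                "value": value, "threshold": limit}
    return None

# Three sample readings: two exceed their limits and generate events.
events = [e for e in (
    check_reading("srv-01", "power_watts", 480.0),
    check_reading("srv-01", "inlet_temp_c", 24.5),
    check_reading("srv-02", "inlet_temp_c", 29.1),
) if e is not None]
```

In practice the events would feed an operations console or report rather than a Python list, but the pattern -- measure, compare to expected, alert on the delta -- is the same.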
Having accurate and detailed information is just one element of an effective overall datacenter energy strategy -- but a very important one for sure.
fletchjibm 06000088KB 1.198 Visualizações
I suspect many of you are scratching your heads today, saying "IBM bought the Weather Channel???". Well first off, we bought "The Weather Company" which includes an amazing cloud-based data ingestion and analytics platform, a very successful B2B business that includes a solution that makes all of my airline flights smoother by providing plane to plane awareness of bumpy air, a proven weather prediction capability that has some of the highest accuracy in the world, and a B2C business that includes a smartphone app that is the most used app in existence today.
And when we think of weather, it's much more than knowing it will be sunny tomorrow. It's detailed micro-forecasts that allow us to know the specific weather for a specific locale at any time of the day. It's the ability to be pre-warned of impending weather events like extreme winds, lightning, or hail storms. It's heat indexes, windchill, dew points, pollen indexes, UV indexes, etc. etc. etc. All of these elements of weather, when coupled with the awareness that the Internet of Things will bring as everything becomes connected, create so many new opportunities.
Now let's think about all of this data in a world that has the Cognitive Insight capability that IBM Watson provides and will grow to provide ... the possibilities are truly endless.
Over the past couple of months, I have had the opportunity to talk with customers throughout the world. The common theme I see is how little we really have done to make our buildings truly smarter. Whether it's lack of awareness or simply lack of caring, we're wasting vast amounts of energy and manpower as we continue to manage buildings the same way we have for decades. We have so much more capability that we can now take advantage of, yet in many cases, our capabilities go untapped.
Here's a short video from the VERGE conference I spoke at in London earlier this year - enjoy -- Smarter Buildings VERGE Event
IBM last week reaffirmed its commitment to Watson and the value that Cognitive Computing will bring to our future. With IOT we're moving to an era of massive data, driven by orders of magnitude reduction in the cost of collecting and maintaining that data. But what do we have when we have lots of data -- well the answer is simple -- "lots of data". With billions of devices potentially connected and interconnected, traditional approaches to leveraging operational data will become unmanageable. Why collect all of this data if we're not going to do something of value with it?
So what's the answer? Well, Cognitive Computing will become a key aspect of the value for IOT. Cognitive Computing will allow us to draw insights that would have not been possible in the past. We'll find the root cause for failures faster, allowing us to further prevent failures through operational insights provided by Cognitive techniques. We'll be able to turn "lots of data" into truly actionable insights, and provide them to technicians in a prescriptive form.
Think about how Watson has transformed the medical field. Medical personnel are now presented with a list of diagnoses, each with a probability, based on analysis of the symptoms. What prevents us from applying these same techniques to inanimate "things"? Why not leverage the wealth of data being collected from "things" to feed a Cognitive system that produces a list of potential failures, each with a confidence factor? More importantly, what if a Cognitive system provided insights into maintenance -- where maintenance operations become individualized based on operational environments -- potentially saving millions in unnecessary maintenance, while also eliminating failures?
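To make the "list of potential failures with a confidence factor" concrete, here's a deliberately tiny sketch in Python. It scores candidate failures by how much of each failure's symptom signature appears in the observed sensor evidence -- the failure modes, symptoms, and scoring rule are all invented for illustration, a toy stand-in for what a real Cognitive system would learn from data:

```python
# Toy sketch: rank candidate failures by the fraction of each failure's
# symptom signature present in the observed evidence. The signatures
# below are invented for illustration, not from any real system.

def rank_failures(observations, signatures):
    """Return (failure, confidence) pairs, highest confidence first."""
    scores = {}
    for failure, symptoms in signatures.items():
        matched = sum(1 for s in symptoms if s in observations)
        scores[failure] = matched / len(symptoms)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

signatures = {
    "bearing wear":    {"vibration_high", "temp_rising", "noise"},
    "pump cavitation": {"vibration_high", "flow_drop"},
    "seal leak":       {"pressure_drop", "flow_drop"},
}

# Observed: high vibration and a drop in flow.
ranked = rank_failures({"vibration_high", "flow_drop"}, signatures)
```

A real system would weight symptoms, learn signatures from historical failure data, and attach prescriptive maintenance actions to each candidate, but the output shape -- a ranked list with confidence factors -- is the point.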
We're truly at an inflection point in the possibilities and opportunities to revolutionize our future.
As I am sure many of you have heard by now, IBM is making a major investment in the Internet of Things (IOT). On Thursday April 9th, IBM will have a web streamed event around our announcement and what it will mean to our customers and the industry. Here's a link to the live event: IBM IOT Event Link.
With its strong capabilities in streaming and historical analytics, its IOT Foundation for data collection, its breadth of database capabilities, and its Bluemix and Cloud solutions, IBM is well positioned as a major player for the Internet of Things. Couple that with the industry expertise that IBM has built over the years in areas such as Asset Management with Maximo, Smarter Cities, and Smarter Buildings. As real-time information becomes available, so many aspects of how our clients do their jobs will forever change.
And the change will start with the engineering. Engineering and operations will digitally integrate. Rich visualizations, coupled with augmented reality capabilities will improve safety, reduce maintenance, and enable workers to better perform their roles.
It's an exciting time, and I am so glad to be in the middle of the IOT revolution.
There was a great article in Forbes recently talking about the Internet of Things and the impact it will have on retailers. With solutions like IRIS from Lowe's, connectivity to a wide range of "things" in the home is already here. While starting around energy with thermostats and lighting control, they are expanding their abilities. Just recently Orbitz announced an IRIS-enabled (Z-Wave) water timer, for example. Lowe's also introduced their MyLowes card a year or so ago. While the public messaging was around helping you track your returns, the reality is that Lowe's knows what you bought and when you bought it.
Think about the possibilities of combining the knowledge of what you purchased with the ability to connect and communicate. Lowe's would be able to set the watering timer daily based on the type of plants you bought, the weather conditions, and the locale of your home. They might even add a moisture sensor to the plant to further complement the solution.
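As a thought experiment, the daily watering decision might combine those three inputs roughly like the Python sketch below. Everything here -- the plant categories, rain and moisture cutoffs, and run times -- is invented for illustration; it's the shape of the rule, not a real product's logic:

```python
# Illustrative sketch: decide today's watering run time from purchased
# plant type, the local rain forecast, and a soil moisture reading.
# All categories and numeric cutoffs are hypothetical.

def watering_minutes(plant_type, rain_forecast_mm, soil_moisture_pct):
    """Return minutes of watering for today (0 means skip)."""
    base = {"succulent": 0, "shrub": 10, "annual": 20}.get(plant_type, 15)
    if rain_forecast_mm >= 5:      # meaningful rain expected: skip
        return 0
    if soil_moisture_pct >= 60:    # soil already wet enough: skip
        return 0
    return base

# A dry day, dry soil, and annuals purchased last weekend -> 20 minutes.
today = watering_minutes("annual", rain_forecast_mm=0, soil_moisture_pct=30)
```

The interesting part is where the inputs come from: the plant type from the purchase history, the forecast from a weather service, and the moisture reading from the connected sensor -- exactly the combination of knowledge the retailer uniquely holds.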
Look at all of the systems that can gather knowledge and interact. They can be accessed by a human-driven smartphone today -- so why not by in-depth analytics and recommendation engines tomorrow?
The real value in the connectivity and the knowledge will come from the combination of knowledge. While everyone is focused on Nest and what Google is doing, Lowe's is on a path toward a much deeper, real knowledge of your home and everything about it, while adding real value for the consumer.
We're so early in the journey still. Many more players will emerge, and new technologies will be spawned, but the potential is endless.
Building Information Modeling (BIM) is changing how "things" are designed. We're moving away from 2D real and digital paper to 3D designs that we can look at, change, and "virtually construct". It's unfortunate that BIM is called "building" because so many people think of it as only for "buildings", but it really applies to any "thing" we want to construct. BIM can be used to represent the finished building, but can equally be used to decompose a complex asset like a pump to virtually "look inside".
Even Wikipedia errs on the side of "buildings" with its definition of BIM saying:
Building Information Modeling (BIM) is a process involving the generation and management of digital representations of physical and functional characteristics of places. Building Information Models (BIMs) are files (often but not always in proprietary formats and containing proprietary data) which can be exchanged or networked to support decision-making about a place. Current BIM software is used by individuals, businesses and government authorities who plan, design, construct, operate and maintain diverse physical infrastructures, from water, wastewater, electricity, gas, refuse and communication utilities to roads, bridges and ports, from houses, apartments, schools and shops to offices, factories, warehouses and prisons, etc.
With the ability to digitally represent assets and even their subcomponents, the process of "handover" or "commissioning" is now able to change from one of complex manual (and error prone) tasks to one of automated integration of the design and operations phases. In fact, if we approached this the way it should be, we'd actually integrate the assets into the asset management system as they come online, and at commissioning the operator would have full knowledge of the asset history and maintenance. We'd leverage the BIM models created to construct the building, maintain them during operations, and integrate them into the operations process.
We've enabled the import of the BIM model into Maximo and the ability to incrementally update the information as changes occur. It's now possible to leverage the BIM model throughout the lifecycle of the asset. We can visualize the asset in 3D, drill into it, rotate it, and look "inside", all through the power of BIM integration.
For some time I have been talking about the need for, and value of, the introduction of IT technologies into the operations space. The reality is that the operational space is becoming more IT-like in its technologies and implementations. Mechanical devices are being replaced by, or augmented with, intelligent processors and sensors. They are being connected (either physically or logically) to networks. These formerly isolated mechanical systems are now becoming connected "IT" endpoints. They are now more susceptible to network attacks. They are now software devices and will need software updates -- whether for maintenance or for new capabilities. With the replacement of the man with the clipboard, why limit readings to once a day -- why not once a second? Why not monitor the systems in near real time (both the endpoint and the collector)?
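The once-a-second idea can be sketched in a few lines of Python. This is a hypothetical polling collector, not any real product's protocol: it samples each endpoint on a fixed interval and also tracks when each endpoint last responded, so the collector side can flag stale endpoints -- monitoring the collector's view of health, not just the endpoint itself:

```python
import time

# Hypothetical sketch of a high-frequency polling collector replacing
# the daily clipboard reading. read_sensor is a stand-in for whatever
# transport the device actually exposes.

def read_sensor(endpoint):
    """Placeholder: return the current reading for an endpoint."""
    return 21.5

def poll(endpoints, samples=3, interval=0.01, stale_after=5.0):
    """Sample each endpoint repeatedly; flag endpoints gone quiet."""
    last_seen = {}
    readings = []
    for _ in range(samples):
        now = time.monotonic()
        for ep in endpoints:
            value = read_sensor(ep)
            if value is not None:
                last_seen[ep] = now          # endpoint is alive
                readings.append((ep, value))
        time.sleep(interval)
    stale = [ep for ep in endpoints
             if time.monotonic() - last_seen.get(ep, 0) > stale_after]
    return readings, stale

readings, stale = poll(["chiller-1", "pump-3"])
```

A production collector would push readings to a time-series store and raise events on the stale list, but even this toy shows the shift: continuous measurement plus health of the measurement path itself.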
The possibilities are endless and the challenges are many but the reality is that the convergence is happening and will drive a fundamental change in how we architect, operate, and maintain these systems.
Though we really don't think consciously about it, weather impacts every aspect of our life. It impacts our mood, what we wear, where and when we shop, what we eat, when we sleep -- the list is endless ...
So why don't we use weather more in our analytics and insights services to determine the real relationships and how to impact sales, ads, etc.? Through the just-announced availability of a range of weather APIs in IBM's Bluemix, anyone can now develop apps and services that integrate weather. This is just the start of a continued availability of services and solutions that integrate weather -- predictive, current, and historical -- into our solutions.
Here's more info on the new announcement: https://console.ng.bluemix.net/catalog/services/weather-company-data-for-ibm-bluemix/
Yesterday, IBM and Cisco delivered a significant announcement around the Internet of Things. The announcement talked about Analytics at the Edge, but perhaps what is more important was that it set the direction for "Analytics Everywhere". When I look at the Internet of Things and the connectivity and information processing models it will entail, it is a combination of most every model we have seen throughout the history of IT. There will be use cases that require "a mothership" as the control point, and master information source. There are scenarios where much of the processing can be handled "at the edge" with decisions and responses driven in isolation at the edge. There are scenarios that will require peer communications between "things" that are co-located or perhaps located across the globe from each other. There will be "cloud to cloud" information interchange requirements ... and security challenges at levels we have never imagined.
The "Internet of Things" is about business transformation, and the flexibility of being able to effectively and efficiently process information at whatever place is appropriate will create new opportunities. Yes, we have seen some of these scenarios "in prior lives", but we have never seen the ability to develop and deliver the capabilities at the level of cost effectiveness we can today, nor have we ever considered the level of instrumentation that we can now do. Sensors are inexpensive, and they enable a breadth of information that would have been unimaginable only a few years ago. Connectivity is nearly ubiquitous and inexpensive. Data storage costs are at levels where you can afford to keep data you aren't sure you will ever need, but might. And analytics capabilities are becoming simple to deliver and broadly available -- with Watson Analytics being a great example of that evolution. #watsonIOT #IOT
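The "analytics at the edge" pattern is easy to sketch. In this hypothetical Python example (the thresholds and summary fields are invented for illustration), raw readings are summarized locally and only the summary plus any anomalies cross the network to the central side -- that reduction is the whole point of processing at the edge:

```python
# Illustrative edge-side processing: summarize raw readings locally and
# forward only the summary and the anomalies to the central system.
# The anomaly threshold and summary fields are hypothetical.

def edge_process(readings, anomaly_above=80.0):
    """Return (summary, anomalies) -- only these cross the network."""
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }
    anomalies = [r for r in readings if r > anomaly_above]
    return summary, anomalies

# Four raw samples stay at the edge; one anomaly and a 3-field summary
# are all that would be transmitted upstream.
summary, anomalies = edge_process([42.0, 55.5, 91.2, 60.1])
```

The same decision shows up in every topology mentioned above: whether the peer, the edge node, or the "mothership" does the work, each hop should carry insight rather than raw volume.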
As is always the case with CES, exciting announcements are made. Today a new standard for Wi-Fi was announced that should be available in 2018. Here's an article on it: Halow Wifi
The new standard focuses on improved range (up to 2x) and lower battery consumption. I have seen some articles focused on larger bandwidth hogs like file transfers, but as we look at IOT solutions, we're going to see a need for efficient connectivity and the exchange of small bursts of information. We could see apps that may have looked at BLE but can now leverage Wi-Fi HaLow instead.
Many have also positioned the technology more around the Smarter Home, but it would equally apply to a range of offerings, as IOT will drive widespread connectivity needs.
Lots of possibilities -- another great space to watch.
The non-tech media has finally caught wind of the Internet of Things (IOT) and is highlighting what happens when you don't secure your solution. They are showing cars that have their braking and steering systems controlled remotely. They are talking about home automation systems being hacked with lights and heating being remotely controlled by the hacker. They are talking about home appliances being turned on and off without the homeowner intervening. Just today USA Today featured an article on Congress wanting to hold special workgroups around the area of IOT security -- USA Today Article
So while this makes for great news stories, the need for security in any connected solution has always been critical. In this modern connected world, the ability to hack in and take control will happen unless we do something to prevent it. Any IOT solution requires security at the connection level to assure that the control "pipe" is not hijacked and used for unwanted purposes. IOT security is also needed at "the thing" to assure that it cannot be taken over and controlled by an undesired source, such as a hacker. A good IOT solution would also place bounds on what "the thing" is allowed to do, and take action when "things" are controlled outside of normal bounds. For example, who would preheat their oven for 8 hours? The "smart oven" would reset itself if preheated for too long.
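The oven example amounts to a device-side guard that enforces a bound no matter what the (possibly hijacked) controller requests. A minimal sketch in Python, with a limit invented purely for illustration:

```python
# Illustrative device-side bound: the "thing" enforces its own limit on
# how long preheat may run, independent of any remote command stream.
# The 45-minute limit is a hypothetical value for illustration.

MAX_PREHEAT_MINUTES = 45

def preheat_guard(minutes_preheating, heating_on):
    """Return the safe heating state given how long preheat has run."""
    if heating_on and minutes_preheating > MAX_PREHEAT_MINUTES:
        return False  # bound exceeded: force the oven off
    return heating_on

# An 8-hour (480-minute) preheat -- the scenario from the text -- is
# forced off by the device itself, whatever the network says.
state = preheat_guard(480, True)
```

The key design point is where the check lives: on the device, below the network interface, so a compromised control channel can command the oven but cannot override its bounds.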
But there is also a new class of security needed for many of those "things". Location -- is the "thing" where you thought it was? Has it been moved to a different location, so that the stop light you thought you were managing is now no longer where you thought it was? It might sound far-fetched, but think of the implications.
The net is that we have to be concerned with security in any intelligent and connected solution. Whether it's being hacked with a USB device introducing malware, having the device hijacked over the network, or simply someone signing in with a weak password (perhaps the biggest concern), the impact is real, and we have to assure that whatever we build, security is a major focus of the solution.
Nothing like titling a blog entry with three TLAs (that's three three-letter acronyms). But when I look at technologies driving change, these two will drive a 180-degree change in how disciplines like Smarter Buildings and Enterprise Asset Management evolve.
Building Information Modeling (BIM) allows us to define, operate upon, and visualize "anything" ... and visualize it in context of everything around it. The Internet of Things (IOT) will allow us to collect real time operational information about everything ... and aggregate that information so that I understand the relationship between the "things" and derive patterns about how the things interoperate.
So now what do I have? By integrating the ability to accurately visualize, drill in, rotate, look inside, etc. etc. etc., with the ability to understand how things are operating now and have operated in the past, and to predict how they will operate in the future, without ever having to physically touch the "thing", we're enabling an infinite realm of possibilities to improve operations, improve health and safety, and forever change how we do our jobs ...
Exciting times to come ....