Software Defined Environments Q&A: Featuring Subject Matter Expert – Matt Hogstrom
IBM Software Defined
Matt: Software-defined at its core means programmatic access to infrastructure. Over the past few years, the industry has shifted from traditional IT to an API-driven consumption model of infrastructure. At IBM, we are considering the consumers as well as the providers of programmatic infrastructure, and we use the term Software Defined Environment (SDE) to refer to the architecture.
Matt: SDE is not hype at all. I think it’s a rapidly evolving reality. In fact, software-defined grows out of the industry’s realization of how the consumption model has changed. When we talk about the attributes of SDE, one of the most significant is velocity. Quite honestly, customers expect to consume infrastructure resources on demand. Today the velocity of IT consumption in a business is much more instantaneous. The second attribute is uniformity. Today consumers are less concerned about the particular devices on the back-end. They are not concerned whether it’s a converged storage infrastructure or a V7000 back-end; they are more concerned that they simply get block storage, whether it’s 20 terabytes or 50 terabytes. An SDE brings uniformity by automating, standardizing and integrating the end-to-end IT infrastructure. The third attribute is really more focused on IT providers: rather than building unique systems, networks and storage infrastructures for individual applications like we used to do in the past, infrastructure providers are looking at how to take large pools of resources and put them together in a way that they can be delivered programmatically to consumers. So from that standpoint, it’s changing the way IT providers think about how they engineer systems on the back-end.
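That uniformity can be pictured with a small sketch: the consumer states intent ("I need 20 terabytes of block storage") and a thin provisioning layer decides which back-end fulfills it. This is a hypothetical Python illustration; the class names and back-end names are invented, not any IBM API.

```python
# Hypothetical sketch: a uniform provisioning facade over heterogeneous
# storage back-ends. The consumer states intent (block storage, size);
# the provider decides whether a V7000 or a converged pool serves it.

class BlockVolume:
    def __init__(self, size_tb, backend):
        self.size_tb = size_tb
        self.backend = backend  # invisible to the consumer in practice

class StoragePool:
    """Aggregates different back-end devices behind one API."""
    def __init__(self, backends):
        # backends: name -> free capacity in TB
        self.backends = dict(backends)

    def provision(self, size_tb):
        # Pick any back-end with enough free capacity; the consumer
        # never specifies (or sees) which device is chosen.
        for name, free in self.backends.items():
            if free >= size_tb:
                self.backends[name] = free - size_tb
                return BlockVolume(size_tb, name)
        raise RuntimeError("no capacity for %s TB" % size_tb)

pool = StoragePool({"v7000-rack1": 40, "converged-pool": 100})
vol = pool.provision(20)   # "I need 20 TB of block storage"
```

The point of the sketch is the shape of the interface, not the allocation policy: the request names a capability and a size, never a device.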
Matt: That’s not correct. There is always an assumption that there is hardware somewhere that’s going to host these containers, whether they are operating systems, hypervisors, etc. That implies the hardware is there. However, the difference is that there is less concern about how that hardware is hosted and more concern about how I consume it. So again, it goes back to the idea that an SDE automatically orchestrates all the infrastructure resources (compute, storage, networking and management) to meet workload requirements in real time. So we can say that in a Software Defined Environment, infrastructure matters.
Matt: The concept of SDE consumption is fairly mature. I think the movement is well underway. One area which is not fully settled is whether the consumption APIs have to be uniform or not. Many vendors, including IBM, have their own sets of APIs for developing and deploying applications. In order to be as effective and efficient as possible, I believe having some kind of uniform way of accessing these different pools of resources is going to be important. We think OpenStack is one of the means of leveling those consumption models. We are also seeing customers ask themselves questions about how to take large pools of data and analyze them to gain insights. Companies are in various stages of embracing and exploiting SDE, but they are all looking at it as a way to complete tasks faster and more nimbly than they have in the past.
Matt: Today, customers are more concerned about the nature, utilization and placement of data, because data at its very core is one of the most significant parts of SDE. Data is not something that is easily moved from one location to another. As such, data is a significant consideration in how the software-defined consumption model is going to work, because the farther you get away from the data, the more latency you introduce into a running workload. Determining the value, sensitivity and placement of the data is one of the challenges for organizations moving to SDE. I think we have moved into the next chapter of SDE, where we are going to see a lot more innovation around data location, both in terms of performance as well as legal and other restrictions on placement.
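That "farther from the data, more latency" rule can be reduced to a toy placement decision: run the workload in the region closest to its data. The region names and latency numbers below are made up purely for illustration.

```python
# Hypothetical sketch: place a workload to minimize round-trip latency
# to its data. Region names and latencies (ms) are illustrative only.

LATENCY_MS = {
    ("us-east", "us-east"): 1,    # compute co-located with the data
    ("us-east", "us-west"): 70,
    ("us-east", "eu-west"): 90,
}

def place_workload(data_region, candidate_regions):
    """Pick the compute region with the lowest latency to the data."""
    return min(
        candidate_regions,
        key=lambda r: LATENCY_MS[(data_region, r)],
    )

best = place_workload("us-east", ["us-west", "eu-west", "us-east"])
# Co-location wins: moving compute to the data beats moving the data.
```

In a real scheduler the cost function would also weigh the legal and regulatory placement restrictions mentioned above, not just latency.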
Matt: Yes, absolutely. Software Defined Environments are optimized to deliver the agility, efficiency and performance needed for today’s workloads. Cloud is itself an SDE. Software Defined Environments help cloud service providers offer the most efficient and scalable cloud solutions, even for analytics. Let me quote an example here. Recently I was listening to a customer talking about data and its sources. They were looking at taking data from a lot of sources, like closed-circuit TVs, and using that data in real time: analyzing it for facial recognition, for security purposes, determining locations, and even deciding how to associate discounts with customers. That’s a tremendous amount of data; it comes from a particular locality, then they stream it in and run analytics on it. That’s a very unique kind of data analysis application, so it’s not all about applications that involve web UIs. Analytics is about how quickly I can take that data and act on it, which makes SDE a critical part of how they structure their environment. In mobile, SDE serves the next wave of transactions and requests, as the number of devices worldwide is exploding, and there is value in looking at that data and drawing tons of information out of it. SDE for social helps in getting sentiment analysis: understanding what customers are thinking, what is happening in the world, and how we can make an opportunity out of data. So I believe you can’t really do any of the CAMS (cloud, analytics, mobile, social) workloads without SDE. If you don’t have that, your ability to react in a timely fashion is going to be limited. As a result, your actions become irrelevant, because by the time you get something stood up and running, it’s too late and the opportunity is missed.
Matt: From the security standpoint, it’s in the evaluation of data and how sensitive it is. As new applications come online, new data sources are identified, and the processing of those data sources is evaluated. We are going to see some interesting innovations in the storage space in terms of being able to tag and identify data, identify its location geographically, and move the data from one place to another. So the security concerns come down to where the data is located, how I manage it over time, how I identify it, and how I maintain an audit trail of it from a governance standpoint. I think that continues to be an aspect of how data evolves over time.
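Tagging data with its sensitivity and geography, and checking a proposed placement against those tags while keeping an audit trail, might look roughly like this. The classification scheme and field names are invented for illustration, not any particular product's model.

```python
# Hypothetical sketch: tag data sets with sensitivity and geography,
# then check a proposed placement against governance restrictions,
# recording every decision for the audit trail.

from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    sensitivity: str        # e.g. "public", "confidential", "regulated"
    home_region: str        # where the data legally resides
    allowed_regions: tuple  # where copies may be placed

def placement_allowed(ds, target_region, audit_log):
    ok = target_region in ds.allowed_regions
    # Record every decision so governance can reconstruct what happened.
    audit_log.append((ds.name, target_region, ok))
    return ok

log = []
records = DataSet("customer-records", "regulated", "eu-west", ("eu-west",))
placement_allowed(records, "us-east", log)   # disallowed: leaves the home region
placement_allowed(records, "eu-west", log)   # allowed: stays in region
```

The design point is that the policy check and the audit record are one operation, so the trail cannot drift out of sync with the decisions actually made.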
Matt: At IBM, we have traditional ways of delivering compute, network and storage on-premise, as well as means of delivering them programmatically via APIs in a hosted environment like SoftLayer. We acknowledge that with the industry shift we need to be able to provide those IT resources through either channel, whether as on-premise or off-premise solutions. One of the significant efforts we have made to help unify these is standardizing on OpenStack as our consumption model. We believe open is a critical aspect of SDE: in order to consume and use resources across different types of environments, you have to have a common language, and we believe OpenStack is the language to serve this up. We’re also focusing heavily on delivering services as-a-service first on the public web via SoftLayer, in addition to providing packages that live on-premise. So we’re changing the way we deliver code, and we are also following the trend toward improving velocity by moving software releases much more aggressively than we did in the past, when the market was traditional IT. We are now moving toward an aggressive delivery model that gets faster and faster in terms of how we deliver new capabilities.
Matt: IBM was the impetus behind OpenDaylight.
Matt: I think the interesting lesson learned as we move forward is that SDE is all about a programmatic infrastructure, and we very often associate programmatic infrastructure with virtualization, which has been a critical technology over the past couple of decades for getting efficiency out of the environment. One evolving area which is very interesting is the use of container-based deployments and packaging, like Docker. The underlying technology has been around for many years. I think the ability to leverage containers as a deployment mechanism really helps customers with two significant advantages: 1) Containers are much faster than traditional virtual machine deployments, so you can get greater density of applications on a particular physical footprint, which drives efficiency and reductions in your capex and opex. 2) They are smaller and lighter weight; they start and shut down faster. And so it’s very consistent with the over-arching idea in SDE of focusing on the velocity of consuming infrastructure. Containers also change the way we think about hypervisors. Maybe hypervisors become less relevant going into the future as we move to containers and lighter-weight consumption models, because our goal is to move faster, to be nimbler, to react to the market quicker, to get insights faster, and any new technology that helps us do that effectively and efficiently is going to be embraced. I think the container deployment model is one of the next waves of SDE as we go forward!
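The density argument can be put in rough numbers: a VM carries a full guest OS per instance, while a container shares the host kernel, so the same physical footprint hosts many more container instances. A back-of-the-envelope sketch, where all the memory figures are illustrative placeholders rather than measurements:

```python
# Hypothetical back-of-the-envelope: how many instances fit on one host?
# All overhead figures are illustrative placeholders, not benchmarks.

HOST_RAM_GB = 256
APP_RAM_GB = 2                 # what the application itself needs

VM_OVERHEAD_GB = 2             # guest OS + hypervisor bookkeeping per VM
CONTAINER_OVERHEAD_GB = 0.05   # shared kernel; thin per-container runtime

def instances_per_host(overhead_gb):
    """Instances that fit when each carries the app plus its overhead."""
    return int(HOST_RAM_GB // (APP_RAM_GB + overhead_gb))

vm_density = instances_per_host(VM_OVERHEAD_GB)               # 64
container_density = instances_per_host(CONTAINER_OVERHEAD_GB) # 124
# The density gap on the same footprint is where the capex/opex
# reduction comes from; start-up speed is a separate, second win.
```

Real density also depends on CPU, I/O and isolation requirements, but the memory arithmetic alone shows why per-instance overhead dominates the comparison.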