IBM recently announced Elastic Storage on Cloud, a software defined storage-as-a-service offering on IBM SoftLayer that provides organizations with high performance data management and storage in the cloud, as well as seamless data transfer between their on-premises infrastructure and the cloud. Optimized for technical computing and analytics workloads, Elastic Storage on Cloud offers more storage capability at lower cost by meeting additional resource demands without the need to purchase or manage in-house infrastructure. With on-demand access to elastic storage resources, organizations working on high performance computing and data analytics (such as customer data mining across massive data sets, seismic data processing, risk management and financial analysis, weather modeling and scientific research) are able to quickly adapt to changing business needs and get their products or research out the door faster.
Elastic Storage on Cloud, born in IBM Research labs, is part of IBM’s Software Defined Environment and Storage solution, which enables clients not just to manage but to exploit the increasing growth of data in a variety of forms generated by countless devices, sensors, business processes, and social networks. The software includes a set of capabilities that automatically manage data locally and globally, providing breakthrough speed in data access, easy administration and the ability to scale... [Continue Reading]
Just a year after its acquisition, SoftLayer has become the galvanizing force behind IBM's rapid acceleration to cloud leadership. In July 2013, IBM bought SoftLayer for $2 billion and has continued to invest considerably in its cloud portfolio. Clients from all over the world are moving to the IBM Cloud: over the last 12 months alone, SoftLayer has gained 6,000 new customers.
In addition to new clients, more than 1,000 business partners have signed on to offer their services on SoftLayer. Organizations embracing SoftLayer range from leading global players such as Avnet, Arrow Electronics and Ingram Micro to cloud-based services and solution providers like Mirantis, Assimil8, Silverstring, Clipcard, SilverSky, and Cnetric Enterprise Solutions. Large U.S. enterprise buyers (those with more than 1,000 employees) ranked IBM as the service provider they believe would be most effective in delivering IaaS for private and/or public clouds.
SoftLayer will also play a key role in IBM’s new cloud offerings designed to tackle big data, delivering IBM's rich data and analytics portfolio to clients faster, more effectively and efficiently. One of the most recent innovations is Codename: Elastic Storage, a new software-defined storage-as-a-service offering built on SoftLayer. It provides organizations... [Continue Reading]
Red Hat Inc., a leading provider of open source solutions, recently announced the latest version of its flagship Enterprise Linux product: Red Hat Enterprise Linux 7. Positioned as the foundation for open hybrid clouds, Red Hat Enterprise Linux 7 enables Red Hat partners to more easily extend their technologies across physical, virtual and cloud environments without requiring any fundamental changes.
Red Hat Enterprise Linux 7 is significant for both Red Hat and IBM, and most importantly, for our joint clients. IBM and Red Hat have been partners for 15 years, building on a shared commitment to open computing, a common focus on enterprise quality computing, and active participation in joint open source community projects such as Linux, KVM and OpenStack. We are now in an age of massive innovation, with new workloads like big data, mobile and social driving radical change in applications, and the new delivery models of private, public and hybrid clouds driving similar change in how those applications are delivered.
Linux sits at the center of all of these new workloads, serving as an enabler for cloud computing and providing the link between traditional systems of record and the new systems of engagement. The combination of Red Hat Enterprise Linux 7 and IBM systems and software delivers the flexibility, stability and new technologies needed to make this a reality. Red Hat Enterprise Linux 7 offers the flexibility of physical,... [Continue Reading]
I am writing this blog as part of a series in which I am exploring open cloud-inspired approaches such as OpenStack, DevOps and open standards. The idea is to look at how these approaches can enable IT to adapt to the shift to user and customer engagement via social and mobile applications. In my earlier post, I looked at the tectonic impact that social and mobile applications (described by Geoffrey Moore and others as systems of engagement) are having on IT infrastructures and organizations, and at how the requirement for agility in delivering these applications is putting pressure on IT operations and developers in many of the organizations I work with.
In this post, I will talk about the top four innovations that I believe are key for IT organizations to successfully utilize IaaS to deliver application services in this new landscape and to reconcile the opposing objectives of operations and developers. Let’s take a look:
1. Management of pets and cattle. Today, the approach that many IT departments use to run applications and systems is financially unsustainable. An analogy I like is thinking of IT systems as pets rather than cattle. Pets are treated with care and nursed back to health if ill—an approach that is applicable for customer relationship management (CRM), enterprise resource planning (ERP), database systems of record and applications where data protection,... [Continue Reading]
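The operational difference behind the analogy can be sketched in a few lines of Python. This is a hypothetical illustration (the class and function names are invented for the example, not taken from any IBM tooling): a "pet" server is nursed back to health in place, while a "cattle" instance is never repaired, just destroyed and re-provisioned from a known-good template.

```python
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    healthy: bool = True

def handle_failure_pet(server: Server) -> Server:
    # Pet model: diagnose and nurse the same instance back to health
    # (in practice: log in, inspect, patch, restore from backup).
    server.healthy = True
    return server

def handle_failure_cattle(server: Server) -> Server:
    # Cattle model: never repair; terminate and re-provision an
    # identical instance from a known-good image.
    return Server(name=f"{server.name}-replacement", healthy=True)

crm_db = Server("crm-db-01", healthy=False)   # system of record: treat as a pet
web_node = Server("web-42", healthy=False)    # stateless web tier: treat as cattle

assert handle_failure_pet(crm_db).name == "crm-db-01"                # same instance survives
assert handle_failure_cattle(web_node).name == "web-42-replacement"  # instance is disposable
```

The cattle model is what makes large clouds financially sustainable: replacement is cheap and automatable, so no instance is worth manual care.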
Leading technology innovation happens in many environments, from university research labs to Silicon Valley startups, to customers who push the limits of technology to adapt it to their business models, and even the government. We should congratulate the National Institute of Standards and Technology (NIST) for taking an early leadership position, in collaboration with the industry, in standardizing the definitions around cloud computing as the technology was making inroads into the US Federal Government. IBM is an active participant in defining and driving private and hybrid cloud standards adoption and in evolving the NIST definition into an implementable reference architecture that considers not only the what and why of cloud, but also the how of operational integration with existing enterprise systems, aligned to Information Technology Infrastructure Library (ITIL) and IT Service Management (ITSM) processes.
IBM constantly evolves and refines the Cloud Computing Reference Architecture (CCRA) based on changing regulatory and compliance needs, building on solid security and privacy frameworks. The IBM Cloud Computing Reference Architecture is intended to be used as a blueprint and guide for architecting cloud implementations, driven by the functional and non-functional requirements of the respective cloud implementation. The CCRA defines the... [Continue Reading]
Today, organizations likely face the same challenges as many of our large, complex accounts. Specifically, they would like to be in a position to anticipate market changes and shifts in customer sentiment or preferences while continuing to outpace not only the competition, but also disruptions in their space.
Companies employ strategies to deliver business value by leveraging the following technologies to engage customers:
Mobile – MDM and MADP (Mobile Device Management and Mobile Application Development Platform)
Big data – including NoSQL, which is sometimes read as "not only SQL"
The goal is to access applications and data from anywhere, globally. No matter the size of the enterprise, companies want to be nimble (if not the most nimble, at least nimble enough to be able to quickly respond to global business trends as they develop).
To do this, organizations need to tap into vast amounts of both structured and unstructured data to provide a competitive edge. The ability to instantly access information at the right time to make effective decisions means that organizations need to be able to manage larger volumes and greater variety of data at a velocity that allows them to stay ahead of trends. The goal is to move beyond intuition and instinct to gather and act upon information of all types (volume and variety), as... [Continue Reading]
There is a lot of hoopla out there about the cloud, especially since dozens of startups have gone public over the last year or two, and many more are in the queue. Here are what I believe to be the top five cloud predictions for the coming years:
1. More application availability on the cloud
With most new software being built for cloud from the outset, it is predicted that by 2016 over a quarter of all applications (around 48 million) will be available on the cloud (Global Technology Outlook: Cloud 2014: A More Disruptive Phase).
This makes sense when you consider that about 56 percent of enterprises consider cloud to be a strategic differentiator, and approximately 58 percent of enterprises spend more than 10 percent of their annual budgets on cloud services. The Everest Group, in their recent Enterprise Cloud Adoption Survey, further argues that cloud adoption enables operational excellence and accelerated innovation.
2. Increased growth in the market for cloud
According to Gartner, the cloud is here, and it is accelerating globally. Based on its forecast for 2011-2017, Gartner expects cloud adoption to hit $250 billion by 2017. In the fourth quarter of 2013, we saw this prediction supported by enterprises worldwide that were increasingly relying on cloud to develop, market and sell products, manage supply chains and more.
In the same forecast, Gartner also suggested that the worldwide... [Continue Reading]
Today at IBM Edge 2014, the premier event for infrastructure innovation, IBM Systems & Technology Group’s Vice President & Chief Marketing Officer, Surjit Chana, will host a special session called Edge Talks: Innovation that Impacts our World. The session will explore how great ideas can help organizations move ahead. During Edge Talks (the opening session of the Executive Edge at Edge 2014), Chana, along with several eminent speakers from TED Talks®, will discuss innovation that impacts our world and explore major global issues like food supply, health and wellness, and security. The session will also highlight the thought-provoking, bold solutions that resulted from daring to think differently and act differently.
The eminent speakers from TED Talks will be:
Ron Finley, the renegade gardener who transformed a Los Angeles food desert one urban garden at a time. (Twitter: @RonFinleyHQ )
John Wilbanks, who will address the convergence of technologies that capture personal data and their uneasy connection with privacy laws and ethics. (Twitter: @Wilbanks )
Peter W. Singer, the author of “Cybersecurity and Cyberwar: What Everyone Needs to Know.” (Twitter: @PeterWSinger )
I am sure you don’t want to miss this session! Join us today, May 19, 2014, at 3 PM PT in the Venetian Ballroom (Level 2) and see how thinking big and daring to be bold with... [Continue Reading]
When IT professionals talk about software defined data centers, the emphasis is typically on virtual servers — how they're created, provisioned, and maintained. But a truly smart software defined data center should be optimized as comprehensively as possible, across all resources. And among those, one of the most critical is storage.
How does the software defined data center pitch usually go? "Fluid allocation of resources in real time, to fulfill changing business workloads."
In the case of storage, making that fluid allocation happen means taking into account many factors. A short list would include virtualizing all storage, tracking the costs of storage utilization over time, determining the speed of reading/writing data to/from different storage tiers, reducing wasted storage as completely as possible, continually assessing how storage is used by different services and adapting in parallel, performing capacity management and planning tasks... and of course, reducing as much as possible total operational costs and management complexity.
That's quite a set of goals. And in a software defined storage (SDS) environment, those goals must mostly be accomplished automatically for the data center to be as agile as possible in scaling to unexpected requirements. For any cloud model — public, private, or hybrid — these are major factors already, and they're becoming more important by the day. As data... [Continue Reading]
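To make one of those factors concrete (choosing the cheapest storage tier that still meets a workload's speed requirement), here is a deliberately simplified Python sketch. The tier names, IOPS figures and prices are invented for the example; a real SDS controller would weigh many more dimensions, and do so continuously.

```python
# Hypothetical tier catalog: (name, max IOPS, cost per GB-month in $)
TIERS = [
    ("ssd",     50_000, 0.50),
    ("sas-hdd",  5_000, 0.10),
    ("archive",    200, 0.01),
]

def cheapest_tier(required_iops: int) -> str:
    """Pick the lowest-cost tier that still meets the IOPS requirement."""
    candidates = [(cost, name) for name, iops, cost in TIERS if iops >= required_iops]
    if not candidates:
        raise ValueError("no tier can satisfy the workload")
    return min(candidates)[1]

assert cheapest_tier(20_000) == "ssd"      # only SSD delivers 20k IOPS
assert cheapest_tier(1_000) == "sas-hdd"   # mid tier is cheaper and sufficient
assert cheapest_tier(100) == "archive"     # cold data lands on the cheapest tier
```

The point of "fluid allocation" is that decisions like this are made per workload, in software, and revisited as usage patterns change, rather than fixed at purchase time.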
Recently, OpenDaylight became the first open source project ever to earn a Best of Interop 2014 award for its significant contributions to advancing IT. Interop, the most respected networking industry trade show, awarded OpenDaylight the Best of Interop Grand Prize, selected from nine category winners across key facets of IT. This caps off a year of accomplishments that have taken OpenDaylight from nascent underdog to one of the most important open source projects in the world.
One year ago at the Open Networking Summit , we announced OpenDaylight—a cross-industry consortium tasked with building an open source community and platform for Software-Defined Networking (SDN) solutions. OpenDaylight is the culmination of IBM's efforts to build an open platform that can provide the base for one of the three pillars of Software Defined Environments (SDE) alongside Software-Defined Storage (SDS) and Software-Defined Compute (SDC). The consortium launched with other major industry players including Brocade, Cisco, Citrix, Ericsson, Juniper, Microsoft and Red Hat.
While some people were enthusiastic from the beginning, others were understandably skeptical. Many of the companies involved were networking equipment vendors who might oppose a truly open platform that could rapidly bring sweeping changes to how we build networking infrastructure. During the announcement, Inder Gopal, IBM's Vice President of Network Development... [Continue Reading]
In my previous blog, I talked about the significance of multitenancy and how YARN (yet another resource negotiator) is an important development for organizations deploying Hadoop environments. Here I will discuss some of the most advanced and effective multitenant distributed computing solutions offered by IBM that help our customers build a truly multitenant infrastructure. In addition to standard capabilities in Hadoop, IBM offers a solution called IBM Platform Symphony, profiled by Gartner Research here. IBM Platform Symphony was designed from day one to support multitenancy, with all the capabilities that this entails, including security isolation, service level guarantees, chargeback accounting and dynamic resource sharing subject to policy. While IBM Platform Symphony supports your chosen Hadoop distribution alongside IBM’s own Hadoop offerings, we take a broader view. IBM Platform Computing supports not only Hadoop workloads but a huge catalog of additional applications as well: service-oriented applications, batch workloads, process-oriented workloads, MPI applications and various long-running service frameworks.
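To illustrate what "dynamic resource sharing subject to policy" means in practice, here is a toy proportional-share allocator in Python. It is not Platform Symphony's actual algorithm, just a sketch of the general idea: each tenant is guaranteed a share of the cluster, and capacity a tenant is not using is lent to tenants with unmet demand.

```python
def allocate(total_slots: int, shares: dict, demand: dict) -> dict:
    """Proportionally allocate slots by share, lending idle capacity to busy tenants."""
    # First pass: each tenant gets min(its proportional entitlement, its demand).
    entitlement = {t: int(total_slots * s) for t, s in shares.items()}
    alloc = {t: min(entitlement[t], demand.get(t, 0)) for t in shares}
    # Second pass: hand leftover slots to tenants that still have unmet demand.
    leftover = total_slots - sum(alloc.values())
    for t in sorted(shares, key=shares.get, reverse=True):
        extra = min(leftover, demand.get(t, 0) - alloc[t])
        alloc[t] += extra
        leftover -= extra
    return alloc

shares = {"risk": 0.5, "reporting": 0.3, "research": 0.2}
# The risk team is nearly idle, so its guaranteed half of the cluster is lent out:
result = allocate(100, shares, demand={"risk": 10, "reporting": 80, "research": 40})
assert result == {"risk": 10, "reporting": 70, "research": 20}
```

A production scheduler layers preemption, priorities and SLA guarantees on top of this basic pattern, but the economics are the same: idle guaranteed capacity is never wasted.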
IBM has been delivering multitenant distributed computing solutions for over ten years. We are proud that 12 of the top 20 global investment banks have deployed IBM Platform Symphony to manage their multitenant infrastructure. If you are struggling with... [Continue Reading]
Among the tech topics that generated the most buzz at the recently concluded Red Hat Summit in San Francisco, cloud, software defined infrastructures and open source stood out. Leading experts in the industry shared valuable insights on the vast opportunity, business value and competitive advantages of these technologies.
In one of the discussions, Scott Firth, Director of IBM Software Defined Environments (SDE), delivered insights on the many facets of cloud, software defined and open source, including their respective value propositions, their implications for IT infrastructure and IBM’s next moves around these technologies. The discussion was led by SiliconANGLE’s John Furrier and Wikibon’s Stu Miniman inside theCUBE from the floor of Red Hat Summit 2014. Here are some key excerpts from the conversation:
♦ The discussion started with Scott’s comments on IBM’s strategic decision to invest in Linux back in 1999, when it was still in its infancy, and on IBM’s outlook on open source technologies today
♦ Scott (with IBM for more than 30 years) highlighted some milestones of the long-standing IBM-Red Hat alliance, from solutions for Linux applications running on thousands of Linux virtual machines on the mainframe to performing data analytics on Power Systems and Intel-based systems.
♦ On the cloud and open source front, Scott... [Continue Reading]
Time is running out to register for IBM’s Edge 2014, the premier event for infrastructure innovation, taking place May 19-23 at the Venetian in Las Vegas. Edge 2014 will deliver unparalleled technical education on the latest technologies and trends defining this new era, such as cloud, big data, software defined environments, mobility, security and more. The event offers an exclusive opportunity to network and participate in forward-thinking discussions with IBM product developers, executives, industry experts and more than 5,500 of your peers.
At Edge 2014, we will showcase the latest IBM enterprise-level technologies via hands-on labs, demos and workshops. Leading technology experts from across the globe will share knowledge and best practices through more than 550 sessions in 14 exclusive technical tracks.
Download the Technical Edge guide to get the full session descriptions and track information.
As an IT leader, you’re invited to attend the Edge 2014 infrastructure innovation event (early bird registration ends April 20th). Register today and be a part of this incredible opportunity. We look forward to your participation. For more updates, visit ibm.com/edge or follow @IBMEdge on Twitter.
IBM Systems & Technology Group... [Continue Reading]
In today’s world, incessant data growth is challenging traditional storage and data management solutions. These outdated systems lack performance and are expensive to administer and scale, and business outcomes suffer as a result. To respond quickly and cost-effectively to the staggering growth of data, organizations need a scalable, high-performance, reliable and collaborative storage and data management infrastructure.
IBM’s General Parallel File System (GPFS) is a proven, scalable, high-performance data and file management solution that is widely used across multiple industries. GPFS provides simplified data management and integrated information lifecycle tools capable of managing petabytes of data and billions of files, arresting the growing cost of managing ever-increasing amounts of data. GPFS removes data-related bottlenecks by providing parallel access to data, eliminating single-filer choke points and hot spots.
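The core idea of parallel access (many readers fetching different stripes of the same file concurrently, so no single server becomes a choke point) can be sketched in plain Python. This is only an illustration of striping with local threads, not GPFS's actual client protocol, where stripes live on different storage servers:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def read_stripe(path: str, offset: int, length: int) -> bytes:
    """Read one stripe of a file independently of the others."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

def parallel_read(path: str, stripe_size: int = 4) -> bytes:
    """Read a file as concurrent stripe-sized chunks and reassemble them in order."""
    size = os.path.getsize(path)
    offsets = range(0, size, stripe_size)
    with ThreadPoolExecutor() as pool:
        stripes = pool.map(lambda off: read_stripe(path, off, stripe_size), offsets)
    return b"".join(stripes)

# Demo on a small temporary file:
with open("demo.dat", "wb") as f:
    f.write(b"abcdefghijkl")
assert parallel_read("demo.dat") == b"abcdefghijkl"
os.remove("demo.dat")
```

In a parallel file system the stripes are distributed across many disks and servers, so aggregate bandwidth grows as nodes are added instead of saturating one filer.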
IBM GPFS ensures end-to-end data availability, reliability and integrity, and simplifies data management at large scale by providing a single namespace that can be scaled easily on demand by simply adding compute or storage resources. Traditionally deployed for high performance computing (HPC), GPFS has since gained market share in non-HPC areas such as healthcare and finance, supporting highly available scale-out relational databases, helping... [Continue Reading]
The vision of the software defined infrastructure (SDI) is to deliver virtualized capabilities across the entire set of resources required by an application so they can be deployed automatically and quickly, with little to no human intervention. Storage is one of the major building blocks in accomplishing this vision, so it is critical that storage hardware and software architectures adapt effectively, allowing storage to be provisioned in response to the dynamically changing requirements of the SDI. Flash technology is positioned as a key enabler for these new storage architectures and, with the right combination of hardware and software, facilitates efficient, cost-effective and high-performance storage services delivery. Flash-based storage improves I/O performance and efficiency for many applications, such as database acceleration, server and desktop virtualization, and cloud environments. Flash storage has become a way to compress data, reduce power consumption and increase performance, making it a superior enabler of virtualization and a perfect fit for the SDI vision.
Recognizing the growing importance of flash in a software defined infrastructure, IBM is offering end-to-end technical education sessions on flash technologies at Edge 2014, May 19-23 at the Venetian, Las Vegas.
At Edge 2014 – the premier event for infrastructure innovation,... [Continue Reading]
There's a lot of buzz about cluster file systems being the ultimate Software Defined Storage (SDS). I couldn't agree more. I like to think of cluster file systems as the Swiss army knife of storage -- one cluster file system product can deliver many styles of storage -- high IOPS, high streaming bandwidth, or cost-effective capacity -- with high availability and high elasticity at the same time.
Cluster file systems can leverage cost-effective, commodity storage-rich servers, and the wealth of Hard Disk Drive (HDD) and Solid-State Drive (SSD) storage options they offer, to create enterprise-grade storage. IBM's General Parallel File System (GPFS), a mature, robust, award-winning product well known for performance and scaling in some of the world's largest supercomputer installations, makes a great software defined storage subsystem for many, many applications.
IBM recently published an online book titled Software Defined Storage for Dummies, which shows the dramatic capabilities of GPFS-as-SDS and many real-world applications. Software Defined Storage For Dummies, IBM Platform Computing Edition not only examines data storage and management challenges but also explains how software defined storage delivers an innovative solution for high-performance, cost-effective storage using IBM’s GPFS.
It’s... [Continue Reading]
Cloud has emerged as the growth engine for business. A recent survey of enterprises showed that those who adopt and leverage cloud computing for competitive advantage on average grow twice as fast and double their profit. Building an effective cloud environment requires a disciplined approach. Standards (both formal and informal) are rapidly evolving to ensure portability, interoperability (ISO/IEC JTC1/SC-38, OASIS TOSCA, IEEE P2302 etc.), and manageability of the cloud environment. IBM is well-positioned to offer end-to-end cloud solutions based on open architectures that deliver both interoperability and value to our customers across the world. Recently, IBM has made major moves in the cloud. From acquiring SoftLayer last year to its commitment to invest $1.2 billion, IBM Cloud is creating exceptional opportunities for enterprises to transform business models, supply chains, and their interactions with customers and partners. Innovation around IBM's open cloud architecture is dramatically changing the entire digital fabric inside and outside of the data center. One example is Codename: Bluemix, an open standards based Platform as a Service. Introduced at Pulse 2014, Bluemix is a new IBM PaaS offering based on Cloud Foundry: an open-standards, cloud-based platform for building, managing and running apps of all types (web, mobile, big data, new smart devices).
We are seeing a lot of innovations by IBM in the... [Continue Reading]
Following the successful conclusion of Pulse 2014, we’re ready to host yet another grand event: Edge 2014. Taking place May 19-23 at the Venetian Las Vegas, IBM Edge 2014 brings together IT professionals and practitioners from all industries to sharpen their expertise, discover the latest technologies, and share best practices in infrastructure innovation. The premier global event for infrastructure innovation will focus on simplifying IT infrastructure and accelerating performance to deliver business value using IBM’s cloud, big data, mobility and security solutions.
Let’s take a look at some of the key highlights of Edge2014:
550 expert technical sessions across 14 tracks
Exciting technology innovations and announcements
Latest and greatest business partner education
A comprehensive and expanded showcase Solution Center giving you access to the latest storage, System x and PureSystems solutions from IBM and our sponsors
Top-notch entertainment for over 5,500 attendees
Wait, we have more in store for you...
The top 5 reasons to attend Edge2014:
Be the first to learn about the latest IBM technology innovations and their business implications
Be the first to see newest IBM solutions and live demos at the Solution Center
Hear real-life customer case studies and best practices tailored to CxO and Line of Business (LoB) issues and... [Continue Reading]
Many organizations are wrestling with the economics of cloud computing . This is especially true in High Performance Computing (HPC) and analytics where applications often demand clustered, scaled-out infrastructure. These types of workloads are often “spiky” or unpredictable and the costs associated with infrastructure can be substantial.
As a few examples:
A life sciences firm may need compute capacity only at particular stages in the drug development lifecycle
An engineering firm’s workload may vary depending on its active contract portfolio or the specific nature of its projects
An insurance firm may require large amounts of computing power to meet regulatory reporting obligations but only for brief periods at month or quarter end
Provisioning infrastructure to meet periodic peaks is costly. Ideas like peak-shaving, outsourcing and hybrid clouds are not new, but organizations seeking to leverage public Infrastructure-as-a-Service (IaaS) offerings can run into a variety of technical and business challenges:
How to guarantee quality-of-service (QoS) in multitenant environments
Data management and security
How to manage, meter and throttle the usage of variable cost resources
How to manage commercial software licenses
How to ensure that local assets are fully utilized before tapping assets in the cloud
These business... [Continue Reading]
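The last challenge in that list, ensuring local assets are fully utilized before tapping the cloud, is straightforward to express as a placement policy. Here is a minimal Python sketch; the capacities and job counts are invented for the example, and a real burst scheduler would also weigh data locality, licensing and QoS:

```python
def place_jobs(jobs: int, local_free: int, cloud_limit: int) -> dict:
    """Place jobs on free local slots first; burst the overflow to cloud, up to a cap."""
    local = min(jobs, local_free)
    cloud = min(jobs - local, cloud_limit)
    deferred = jobs - local - cloud          # queued until capacity frees up
    return {"local": local, "cloud": cloud, "deferred": deferred}

# Quarter-end spike: 500 jobs, 300 free local slots, cloud burst capped at 150 instances.
assert place_jobs(500, local_free=300, cloud_limit=150) == \
       {"local": 300, "cloud": 150, "deferred": 50}
# Normal day: everything fits on-premises, so no variable cloud cost is incurred.
assert place_jobs(120, local_free=300, cloud_limit=150) == \
       {"local": 120, "cloud": 0, "deferred": 0}
```

Because already-purchased local capacity is effectively free at the margin, filling it first before paying per-hour cloud rates is what makes peak-shaving economical.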
Businesses large and small are turning to the hybrid cloud because it unites the best of all worlds: public cloud, private cloud and dedicated servers working together in any combination. A hybrid cloud combines on-premises cloud infrastructure with cloud service provider infrastructure, drawing a firm’s overall capacity from their combined capabilities. While a hybrid approach promises cost savings and significant gains in IT and business flexibility, concerns remain around how to manage, secure and integrate on-premises infrastructure with cloud services in a hybrid cloud architecture…
With these imperatives in mind, we bring you an exclusive video debate session where top IT experts will discuss when hybrid clouds are and are not the be-all and end-all, and whether the infrastructure supporting them really matters. On Tuesday, March 11th from 11 am-12 pm EDT, the fifth installment of our What’s Next for IT Infrastructure Video Debate will also cover developments in cloud, data, security and many other topics and trends that influence your IT infrastructure.
The debate session will be moderated by Kevin Jackson , founder of GovCloud and author of GovCloud: Cloud Computing for the Business of Government . Among the panelists, we will have Steve Strutt - CTO for Cloud Computing, IBM UK and Ireland, Laura DiDio - Principal of ITIC, and Michael A. Salsburg -... [Continue Reading]