January 9, 2020 By Eric Herzog 4 min read

Now that 2019 has ended, we anticipate incredible storage advancements to come in 2020. Storage is the essential foundation for all your applications, workloads, and data sets. If your storage is not reliable, resilient, performant, and flexible, the value of your most critical business asset, your data, decreases dramatically. Read on to see what is coming your way in 2020 to optimize that essential data foundation: your storage.

1. Storage for hybrid multicloud

The role of data has changed. With the advent of hybrid multicloud environments, businesses across the globe continue to see exponential growth in the amount and types of data they produce. Their long-term success depends on their ability to optimize these vast oceans of data seamlessly across cloud configurations.

In 2020, storage will need to support exponential data growth across your entire estate, which most likely spans a hybrid multicloud deployment. To prepare for the hybrid multicloud world, be sure your enterprise can easily and transparently move data from on-premises systems to your various cloud providers and back. The “cloudification” of storage will continue as enterprises select the right storage for the right job, no matter the cloud environment. Hybrid multicloud storage is already a reality across all types of data and storage environments, and it will stay top-of-mind as enterprise business data continues its incredible expansion.

2. Acceleration and optimization of containers

As data and hybrid cloud deployments grow, so, too, should investment in containers. For businesses developing in hybrid multicloud environments, containers make data portable and easy to move across the organization. Some organizations already run thousands of containers, creating a new “virtualization” and development layer.

At first, only developers used virtualization, but its use spread across almost all businesses with the advent of VMware and other server virtualization platforms in the data center. Today, we’re seeing the same thing happen with containers. They are no longer just for DevOps; they span your data centers and clouds. As containers sweep into enterprise data center and cloud deployments, the discussion now includes optimizing containers for persistent storage on your primary storage, both on premises and spanning a hybrid multicloud deployment.

With containers moving further into the primary storage arena, the question of how you will deliver modern data protection for your container environments becomes ever more critical. In 2020 we expect a rapid expansion of the marriage of storage and modern data protection to container environments. As you adopt containers, make sure you can optimize them for primary storage across all protocols (file, block, and object) and that you can deliver the best in modern data protection for your containerized primary storage.
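To make the persistent-storage discussion concrete, here is a minimal sketch of how a containerized application typically requests block or file storage in Kubernetes: a PersistentVolumeClaim manifest, built as a plain Python dict. The claim name and storage class name are illustrative, not tied to any specific vendor or product.

```python
import json

def make_pvc(name, storage_class, size_gi, access_mode="ReadWriteOnce"):
    """Build a Kubernetes PersistentVolumeClaim manifest as a dict.

    The storage class would map to a file or block backend exposed by
    your storage vendor's CSI driver (names here are hypothetical).
    """
    return {
        "apiVersion": "v1",
        "kind": "PersistentVolumeClaim",
        "metadata": {"name": name},
        "spec": {
            "accessModes": [access_mode],
            "storageClassName": storage_class,
            "resources": {"requests": {"storage": f"{size_gi}Gi"}},
        },
    }

# A claim for 100 GiB of block storage for a hypothetical database pod
pvc = make_pvc("orders-db-data", "block-premium", 100)
print(json.dumps(pvc, indent=2))
```

Applied to a cluster, a claim like this is how a container gets durable storage that outlives the container itself, which is exactly the persistence question raised above.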

3. Storage for AI

AI workloads are growing dramatically, and storage has become a critical foundation for AI success. The key to storage for AI is to have a single, vast repository of storage that is easily tied to AI and to your machine learning and deep learning assets. This single repository, which can span physical data centers and cloud configurations, must be able to support exabytes of data. Using a sophisticated AI data pipeline that encompasses data ingestion, organization, analysis, machine learning, deep learning, and archival, the modern AI business will see tremendous benefits as AI becomes commonplace across enterprises. As you search for your AI technology partner, it’s worth noting (humble bragging) that according to IDC, IBM is #1 in global market share for AI, including services, software, and physical infrastructure[1].
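The pipeline stages named above (ingest, organize, analyze, archive) can be sketched as a chain of simple functions over a shared repository. This is a toy illustration of the stage-by-stage flow, not any particular product's pipeline; the record format and the in-memory "store" are stand-ins for real data and a real capacity tier.

```python
def ingest(raw):
    # Pull records from edge and cloud sources into the repository,
    # dropping empty entries and stray whitespace.
    return [r.strip() for r in raw if r.strip()]

def organize(records):
    # Tag and index the data so downstream ML tools can find it.
    return [{"id": i, "text": r} for i, r in enumerate(records)]

def analyze(catalog):
    # Lightweight feature extraction ahead of model training.
    for item in catalog:
        item["length"] = len(item["text"])
    return catalog

def archive(catalog, store):
    # Cold data lands on a cheap capacity tier (here: a dict).
    store.update({item["id"]: item for item in catalog})
    return store

store = {}
archive(analyze(organize(ingest([" sensor-1 ", "", "sensor-2"]))), store)
```

In a real deployment each stage would read from and write to the single shared repository, so data never has to be copied between silos as it moves from ingestion to training to archive.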

4. Cyber-resilient storage

As you refine your organization’s security strategy in 2020, remember that storage is a key element of your overall cyber resilience capabilities. It is not a question of “if” you will suffer a security attack, but “when.” Most security strategies today center on preventing breaches and, when one occurs, remediating the attack. However, as CIOs across the world will tell you, those attacks may take hours, days, or even weeks to remedy.

Since you are unlikely to keep your systems entirely free of data theft, data corruption, malware, and ransomware, your storage infrastructure is essential to limiting the impact of cyber attacks on your company and its data. From encryption at rest and in flight, to air gapping, to malware and ransomware detection, to rapid recovery from a cyber incident, to multifaceted administrative controls, your storage infrastructure must deliver the right technologies to create a holistic cyber security strategy for your corporate data.
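One common heuristic behind ransomware detection at the storage layer is entropy analysis: encrypted data looks statistically random, so a sudden jump in per-file entropy across many files is a warning sign. This is a generic illustration of that idea, not a description of any specific vendor's detection engine, and the threshold is an assumed value for the sketch.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte; encrypted data approaches 8.0."""
    if not data:
        return 0.0
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    # Heuristic only: compressed media also scores high, so real
    # detectors combine entropy with I/O-pattern and rename signals.
    return shannon_entropy(data) > threshold

plain = b"quarterly sales report, region east\n" * 100  # typical text
scrambled = os.urandom(4096)  # stands in for ransomware output
```

Here `looks_encrypted(plain)` is false while `looks_encrypted(scrambled)` is true, which is the kind of before/after shift a storage array can watch for as writes stream through it.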

5. End-to-end NVMe

In late 2018 and through 2019, the latest high-performance storage interface technology, NVMe, started to be incorporated into storage array solutions. For example, IBM Storage delivered its first NVMe storage array in July of 2018. As 2019 progressed, the market heard more and more about NVMe over data center fabrics: Fibre Channel, Ethernet, and InfiniBand. Throughout 2019 there was a lot of “kicking the tires” on the various versions of NVMe over Fabrics, but not much actual user deployment. 2020 will likely be the inaugural year in which NVMe over Fabrics sees real enterprise deployments.

6. Storage Class Memory

Storage Class Memory (SCM) is in its early phase of use as IBM and other technology partners begin shipping SCM to early adopters. The advantage is high-speed persistent storage for computing across complex data sets where performance is critical to your business operations. While 2019 saw lots of early announcements about SCM, 2020 will see real deployments by end users. A common deployment scenario will likely be SCM in a hybrid configuration within an all-flash array.

Given the high price of SCM, creating a hybrid array that leverages SCM, all-flash storage, and AI-based automated tiering (such as the Easy Tier feature in IBM Spectrum Virtualize) will provide the most cost-effective yet performant option for your high-performance applications. Easy Tier places the most frequently accessed data on the top tier (SCM), while data not being used gets shifted to less expensive storage.
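The heat-based placement idea can be sketched in a few lines: count accesses per extent, and let the hottest extents occupy the small, expensive SCM tier while everything else stays on flash. This is a toy model of heat-based tiering in general, loosely inspired by but not a description of Easy Tier's actual algorithm; the capacity limit and extent names are made up.

```python
from collections import Counter

class TieredPool:
    """Toy two-tier (SCM + flash) placement policy based on access heat."""

    def __init__(self, scm_capacity: int):
        self.scm_capacity = scm_capacity  # max extents on the fast tier
        self.heat = Counter()             # access count per extent

    def access(self, extent: str) -> None:
        self.heat[extent] += 1

    def scm_extents(self) -> set:
        # The hottest extents, up to capacity, live on SCM;
        # everything else is served from the flash tier.
        return {e for e, _ in self.heat.most_common(self.scm_capacity)}

pool = TieredPool(scm_capacity=2)
for extent, hits in [("a", 50), ("b", 5), ("c", 30), ("d", 1)]:
    for _ in range(hits):
        pool.access(extent)
# Extents "a" and "c" are hottest, so they occupy the SCM tier.
```

A production tiering engine would also age out old heat statistics and throttle migrations, but the core economics are the same: a small amount of SCM absorbs most of the I/O if placement follows the heat.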

In Conclusion

2019 was a year of incredible development across the storage landscape. As 2020 comes into view, we see little slowdown in storage development. With storage being the essential foundation for your enterprise data, applications, and workloads, the storage industry is not standing still. New and incredibly beneficial storage technologies will emerge and deliver substantial value to your data and business. Want to know more details? Watch the video.

[1] IDC, IDC Market Share: Worldwide Artificial Intelligence Market Shares, 2018: Steady Growth — POCs Poised to Enter Full-Blown Production (Doc # US45334719, July 2019)
