Why cloud ought to be part of your archival and tiering strategy

Storage is a perennial top challenge for IT teams.

Capacity demands continue to increase, while budgetary pressures make it hard to sustain even an insufficient status quo. Ideally, new storage would be devoted to processes that create business value, but far too much of it is consumed by stagnant data on primary systems and a myriad of inefficient secondary platforms.

Organizations of all sizes should be rethinking their broader storage strategies, focusing on achieving a flexible and durable “retention repository,” which can be a key future-proofing tool in a variety of solution scenarios:

  • Backups. These are the foundation of most data management and data protection strategies. Organizations should look for durable and efficient repositories for “protection storage.”
  • Archives. While backups create previous versions in preparation for recovery, archives often provide the “copy of last resort” for regulatory compliance, eDiscovery preparedness and operational reference. That said, not all data needs to be preserved. As such, archival solutions usually retain only a subset of production data based on the business or governance implications of that data. Archives are associated with a variety of implementation strategies, specifically:
    • “Warm” or “active” archives provide near-production levels of access performance to predominantly dormant data that needs to be immediately available upon request.
    • “Cold” or “deep” archives typically have the longest retention requirements and the lowest expectations for responsiveness during retrieval, although many organizations still expect retrieval in minutes to hours for even the coldest data.
  • Storage tiering. Archival strategies are typically driven by regulatory considerations or the operational governance needs of the data itself. However, many organizations simply want to manage all of their data within a unified storage stack that automatically and transparently moves data across different grades of storage (tiers) based on access patterns, so that the most recent, frequent, or critical data is on the fastest, or “hottest” storage tier, and other data seamlessly moves and is stored on more scalable and less expensive “colder” tiers.
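The tier-selection logic described above can be sketched in a few lines. The thresholds and tier names below are hypothetical, chosen purely for illustration; production tiering engines typically weigh access frequency, data criticality, and cost models in addition to recency.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds for illustration only; real policies are
# vendor-specific and tunable.
WARM_AFTER = timedelta(days=30)
COLD_AFTER = timedelta(days=180)

def choose_tier(last_accessed: datetime, now: datetime) -> str:
    """Pick a storage tier based on time since the data was last accessed."""
    age = now - last_accessed
    if age >= COLD_AFTER:
        return "cold"   # e.g., tape or cloud archive storage
    if age >= WARM_AFTER:
        return "warm"   # e.g., capacity-oriented object storage
    return "hot"        # e.g., SSD-backed primary storage

now = datetime(2020, 6, 1)
print(choose_tier(datetime(2020, 5, 25), now))  # accessed a week ago -> hot
print(choose_tier(datetime(2020, 3, 1), now))   # dormant ~3 months -> warm
print(choose_tier(datetime(2019, 10, 1), now))  # dormant ~8 months -> cold
```

A real implementation would run such a policy continuously and move data transparently, so applications keep a single namespace while cold data drains to cheaper tiers.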

It is important to note that these strategies and solution scenarios are rarely, if ever, achievable with disk-based architectures alone. Organizations should embrace a broader strategy that combines disk (HDD and SSD) with complementary media, including tape and cloud.

And speaking of cloud, ESG’s annual IT spending intentions research shows that a repository for backup and archive data is consistently the most commonly cited use case for cloud-based infrastructure among survey respondents.

Even so, not all clouds are equally suitable for housing what could be the copy of last resort for data required for regulatory compliance, eDiscovery, or operational retention and usage mandates. Three key aspects to consider are durability/availability, agility, and security.

I go into much more detail on this topic in a new ESG paper, in which I explain the storage challenges organizations are facing, the characteristics of cloud-based storage solutions to consider when trying to address those challenges, and how the IBM Cloud Object Storage service addresses the requirements.

Read the whitepaper here.

