Accessing the data you need on YOUR terms


It’s often said that it takes more effort to win a new client than to retain an existing one. Today, however, even retaining clients takes far more effort than it used to, especially with competitive offerings just a click away.

With that in mind, you need to react quickly to keep your clients engaged and interested in your offerings. That means your business and development teams must move faster, using current data to understand what clients need and how they want to engage with your business. Your sales and marketing teams will typically need to be involved as well.

Now add these challenges: the data is constantly being updated, the volumes are large, and access has to go through your IT organization. Imagine that you have a large data set (multiple terabytes) and want a copy of it for non-operational business use. Just to make a single copy, you would typically need to:

  • Check with your storage administrator to make sure you have the capacity to make the copy.
  • Make a request (or a purchase) to obtain an allocation of the storage and have it assigned to your host.
  • Speak with your OS admin to make sure they can see the capacity and access it.
  • Have a copy of your production environment placed on the new storage (which may require wait time while the copy is being created).
  • Speak with your DBA (if the data is a database) and have them configure and activate the copy so it is ready to receive queries.

In many organizations, this entails at least three separate requests to three different groups of people, each with their own challenges, pressures and change control processes, on top of the cost of keeping spare storage capacity. Going through this process to get the data you need can take days or weeks, when your needs are far more urgent.

Now imagine the same scenario, except that you can get a copy of the data you need within minutes, regardless of its size, using spare capacity in your existing controller. In fact, you don’t need like-for-like capacity; you just need enough to store the changes made to the source while the copies are active. What’s more, if you want multiple copies for different groups within your organization (sales, analysts, developers), that can be done too, again without needing 100 percent of the source capacity for each copy.
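These space-efficient copies behave like copy-on-write snapshots. A minimal Python sketch (illustrative only, not how the product is implemented) shows why a copy is available instantly and consumes capacity only for blocks that change on the source:

```python
class Snapshot:
    """A point-in-time view of a volume; stores only preserved blocks."""
    def __init__(self, source):
        self.source = source
        self.delta = {}  # block index -> original data, saved on source writes

    def read(self, i):
        # Unchanged blocks are served from the source; preserved ones from delta.
        return self.delta.get(i, self.source.blocks[i])

    def capacity_used(self):
        return len(self.delta)  # only changed blocks consume space


class Volume:
    def __init__(self, blocks):
        self.blocks = list(blocks)
        self.snapshots = []

    def snapshot(self):
        # "Making the copy" is just bookkeeping, so it completes in an instant.
        s = Snapshot(self)
        self.snapshots.append(s)
        return s

    def write(self, i, data):
        # Copy-on-write: preserve the original block for any snapshot
        # that has not already saved it, then overwrite in place.
        for s in self.snapshots:
            s.delta.setdefault(i, self.blocks[i])
        self.blocks[i] = data


vol = Volume(["a", "b", "c", "d"])
snap = vol.snapshot()        # copy ready immediately, zero extra space
vol.write(1, "B")            # production keeps changing
print(snap.read(1))          # "b" -- the copy still sees its point in time
print(snap.capacity_used())  # 1 block consumed, not 4
```

Real array-based snapshots work at a far larger block granularity, but the capacity math is the same: the copy costs only the deltas accumulated while it is active.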

The key technology making this happen is IBM Spectrum Copy Data Management. Combined with your existing supported disk technology (or with IBM Spectrum Virtualize if your disk is not supported directly), it lets your users, within limits that you define, create copies of business applications on their own terms whenever they need them.

With this technology, IT manages the infrastructure and controls who uses it and how much they can use, while users create copies whenever they want within those limits. IT can even enforce that the copy a user receives has been “masked” so that private information is not meaningful.
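Masking typically rewrites sensitive fields so they keep their shape but lose their meaning. A toy Python rule (hypothetical; real masking policies are defined centrally by IT, not hard-coded like this) illustrates the idea:

```python
def mask_value(value, keep=4):
    """Replace all but the last `keep` characters with '*'.

    Illustrative masking rule only; a production policy would vary
    by field type (card numbers, names, national IDs, and so on).
    """
    if len(value) <= keep:
        return "*" * len(value)
    return "*" * (len(value) - keep) + value[-keep:]


row = {"name": "Jane Doe", "card": "4111111111111111"}
masked = {k: mask_value(v) for k, v in row.items()}
print(masked["card"])  # ************1111
```

A developer or analyst working with the masked copy can still test joins, formats and report layouts, because the field lengths and structure survive even though the private values do not.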

When users create a copy, they don’t need to understand all the steps occurring behind the scenes. They can create their copy and start using it without being, or speaking to, a DBA, systems admin, storage admin or network admin. In fact, if pushing a button to make a copy is still too complex, the system can create it automatically at the same time each day or week.
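The scheduled-refresh idea can be sketched in a few lines. The helper below only computes the next daily run slot; the copy job it would trigger (named `trigger_copy` in the comment) is hypothetical, standing in for whatever the platform exposes:

```python
from datetime import datetime, timedelta

def next_refresh(now, hour=2, minute=0):
    """Return the next daily refresh slot (default 02:00) after `now`."""
    run = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if run <= now:
        run += timedelta(days=1)  # today's slot has passed; use tomorrow's
    return run

# A scheduler loop would sleep until next_refresh(...) and then invoke
# the (hypothetical) copy job, e.g. trigger_copy("sales-analytics-db").
print(next_refresh(datetime(2024, 5, 1, 14, 30)))  # 2024-05-02 02:00:00
```

The point is that once the copy operation is a single callable action, attaching it to a schedule removes even the button press from the user’s workflow.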

According to analysts, making copies of data (copy data management) is a $50 billion problem for organizations. If making copies of data is a pain point for your organization, learn more about the IBM Spectrum Copy Data Management offering.
