
Backup to the cloud: Good idea or recipe for disaster?


Backup has always been a challenge in the enterprise, and the recent growth in data volumes has made it harder; that growth shows no sign of slowing down. The cloud has emerged as a popular alternative for data backup, especially in the small and medium business (SMB) market.

Backing up to a cloud-based solution does, however, create some challenges that need to be considered before treating it as a serious alternative for your backup needs. Let us take a closer look at the challenges you might face when evaluating a cloud backup solution.

Data policy

First of all, you will need to consider local data rules. Are there policies that constrain where data, or certain types of data, may be stored or transmitted? Certain types of data require distinct protection depending on jurisdictional policies. For example, personally identifiable information is subject to regulations within the European Union member states and in many other countries and US states, and this type of data requires extra controls mandated by the regulatory authorities. You should also consider the data policies, especially the security practices, of the provider to determine whether a public or private solution best fits the needs of your data. In a multi-tenant environment, for example, you will want to know whether there is any risk that your data can be exposed to other tenants, either by accident or by malice.

After the data policy issues are sorted out, you must consider the other requirements that you might have for the solution, from both the technical and the business side.

Business perspective

From the business perspective, there are all sorts of requirements and considerations that need to be factored into the equation.

Moving backup, or any other IT function for that matter, to the cloud often comes down to cost: can someone else, using other technologies or other means of reducing the total cost of the service, provide the backup more cheaply? Reducing this cost, or moving it from a capital expenditure (CAPEX) to an operational expenditure (OPEX) in your budget, can change how the cost is calculated. The elastic, metered nature of the cloud introduces a new factor into the cost picture: flexibility. That flexibility can come with a challenge, however: is the cost predictable, or are there factors that can create an unpredictable cost scenario?

There are currently several ways of pricing a public cloud backup solution on the market. The dominant three are:

  • Cost calculated per GB streamed into the backup provider's data center, with client-side compression either turned on or turned off. This model does not reflect the compression ratios that can be achieved by the client software, because the only metric available is the data volume streamed or stored. The streamed volume can also be calculated from the physical storage used on the backup infrastructure.
  • Cost calculated from the source data volume backed up. This model considers only the source data volume and does not factor in compression in a way that is visible to the client.
  • A mix of the two previous models.

In addition, there is usually a cost related to data retention, because retention needs vary greatly between customers. Bear in mind that in some cases the customer cannot influence the retention interval, because this parameter is set by the provider. It is important that the supplier can provide the retention schedules you need, because a deviation from your requirements can have large implications.

Because there are different cost models, the price models can be difficult to compare, and the one that looks more expensive might actually be the cheaper one. As an exercise, assume the following situation. Provider A charges for the amount of data stored on the backup solution, with a compression ratio of 2:1. Provider B charges for the amount of source data backed up. Assume the charging period is four weeks (approximately one month), Provider A charges USD 1 per GB, and Provider B charges USD 2 per GB. The data volume is 10 TB (10,000 GB); one full backup is performed, followed by a daily incremental backup (here, 750 GB stored per day after compression) for the rest of the month. The calculation then becomes:

                      Provider A   Provider B
Week 1                 9,500 GB    10,000 GB
Week 2                 5,250 GB
Week 3                 5,250 GB
Week 4                 5,250 GB
Total volume          25,250 GB    10,000 GB
Total cost per month  USD 25,250   USD 20,000

As this calculation shows, even though Provider A initially seems like the cheaper one, Provider B is actually cheaper. Run an exercise like this with your company's actual numbers to determine which provider is the best fit for you.
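The comparison above can be reproduced with a short script. Note that the per-GB rates are taken from the worked example, while the 750 GB/day stored incremental for Provider A is inferred from the weekly totals rather than stated by any real offering:

```python
# Sketch of the cost comparison above; all figures are illustrative.
SOURCE_GB = 10_000               # 10 TB of source data
RATE_A = 1.0                     # USD per GB stored (Provider A)
RATE_B = 2.0                     # USD per GB of source data (Provider B)
COMPRESSION = 2                  # 2:1 compression on Provider A's side
DAILY_INCREMENTAL_STORED = 750   # GB stored per incremental backup

# Provider A: one compressed full backup, then daily incrementals.
full_stored = SOURCE_GB // COMPRESSION               # 5,000 GB
week1 = full_stored + 6 * DAILY_INCREMENTAL_STORED   # 9,500 GB
other_week = 7 * DAILY_INCREMENTAL_STORED            # 5,250 GB each
stored_a = week1 + 3 * other_week                    # 25,250 GB
cost_a = stored_a * RATE_A

# Provider B: flat charge on the source data volume.
cost_b = SOURCE_GB * RATE_B

print(f"Provider A: {stored_a:,} GB stored -> USD {cost_a:,.0f}")
print(f"Provider B: {SOURCE_GB:,} GB source -> USD {cost_b:,.0f}")
```

Swapping in your own change rate, compression ratio, and per-GB prices turns this into a quick comparison tool for real quotes.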

You also need to consider the contracts that the provider offers:

  • What are the SLAs that are offered? What is covered within these SLAs? Consider whether the proposed SLAs are sufficient for your needs or whether additional agreements need to be put in place.
  • How is the data handled if a provider goes out of business or if there is a conflict between you and the provider? Are you able to retrieve the data backed up in these cases?
  • What reports are available, and how will the provider deliver them?

Technical perspective

On the technical side, there are the requirements and constraints that you are already familiar with if you have an existing backup solution running (and I sincerely hope that you do).

What are your recovery time objective (RTO) and recovery point objective (RPO) requirements? In a traditional backup solution these might be hard to achieve due to network bandwidth and similar limitations. In a public cloud scenario, they might be even harder to achieve, because WAN and Internet bandwidth is, in most cases, a more limited resource than a local network. Is it possible to reduce or even remove this limitation by installing a caching device or introducing WAN optimization?
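As a rough sanity check on an RTO, you can estimate the minimum time a full restore would take across a link. The 10 TB data set, 500 Mbit/s link, and 80% efficiency below are illustrative assumptions, not figures from any provider:

```python
# Back-of-the-envelope restore time: data volume divided by usable
# bandwidth. Real restores are slower (protocol overhead, contention,
# restore-side processing), so treat the result as a lower bound.

def restore_hours(data_gb: float, link_mbit_s: float,
                  efficiency: float = 0.8) -> float:
    """Hours to transfer `data_gb` over a `link_mbit_s` link, assuming
    only `efficiency` of the raw bandwidth is usable in practice."""
    data_bits = data_gb * 8 * 1e9            # decimal GB -> bits
    usable_bits_per_s = link_mbit_s * 1e6 * efficiency
    return data_bits / usable_bits_per_s / 3600

# Restoring 10 TB (10,000 GB) over a 500 Mbit/s Internet link:
hours = restore_hours(10_000, 500)
print(f"~{hours:.0f} hours")  # about 56 hours, i.e. over two days
```

If the result exceeds your RTO, that is a strong argument for a local cache, WAN optimization, or a hybrid design that keeps recent backups on-premises.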

In addition, there are other technical requirements such as software licensing, maintenance of software releases, and so on, but these are the same as in a traditional backup scenario, so I will not go into further detail on them.

IBM and the cloud backup

IBM offers a backup solution under the IBM SmartCloud brand, IBM SmartCloud Managed Backup, available either as a standard offering or as a custom-tailored one, with the option of installing part or all of the required backup infrastructure on your premises. For further information, see the IBM SmartCloud website.

Conclusion

If you manage your requirements and business needs correctly, backing up your data to a cloud-based solution can provide you with a flexible solution that performs as expected. However, if you do not manage these requirements properly and simply assume that a cloud-based solution is the same as a traditional one, you are heading into a possible hornet's nest.
