October 12, 2021 By Chris Saul 3 min read

It’s clear that businesses are embracing hybrid cloud. Indeed, a recent report from the IBM Institute for Business Value states that 97% of businesses are piloting, implementing or integrating cloud in their operations.1

However, as hybrid cloud environments become the norm, businesses must contend with additional IT complexity, public cloud costs and threats from cyberattacks and other data-destructive events.

Today, IBM® is announcing new capabilities and integrations designed to help organizations reduce IT complexity, deploy cost-effective solutions and improve data and cyber resilience for hybrid cloud environments.

Extending hybrid cloud storage simplicity to Microsoft Azure

One way to reduce hybrid cloud complexity is to ensure consistent function, APIs, management, and user interface across on-premises and public cloud platforms. This can be accomplished with software-defined storage (SDS). That’s where IBM Spectrum® Virtualize for Public Cloud comes in. It’s the cloud-based counterpart of the software in IBM FlashSystem® and SAN Volume Controller.

IBM Spectrum Virtualize for Public Cloud provides the same storage functionality in the cloud as you find on-premises, which makes it easy to implement hybrid cloud storage scenarios such as disaster recovery, cloud DevOps, and data migration. And since it provides this same function across clouds, it also makes it easy to use multiple clouds or to move from cloud to cloud.

Now we’re extending our cloud support to Microsoft Azure in addition to IBM Cloud® and Amazon Web Services (AWS).

On Azure, IBM Spectrum Virtualize for Public Cloud supports IBM Safeguarded Copy, which automatically creates isolated, immutable snapshot copies designed to be inaccessible to software – including malware – and which can be used to quickly recover on-premises or cloud data in the event of a data-destructive event.
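To make the pattern concrete, here is a minimal, purely illustrative Python sketch of the idea behind Safeguarded Copy: periodic point-in-time copies that are never exposed to hosts and cannot be removed before their retention period expires. The class names, fields and policy values are hypothetical and are not the Spectrum Virtualize API.

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta
    from typing import List

    # Illustrative model only: immutable, isolated, retention-governed copies.
    @dataclass
    class SafeguardedSnapshot:
        volume: str
        taken_at: datetime
        retention: timedelta
        host_mappable: bool = False   # isolated: never mapped to a host

        def expired(self, now: datetime) -> bool:
            return now >= self.taken_at + self.retention

    @dataclass
    class SnapshotPolicy:
        interval: timedelta = timedelta(hours=6)
        retention: timedelta = timedelta(days=7)
        snapshots: List[SafeguardedSnapshot] = field(default_factory=list)

        def run_cycle(self, volume: str, now: datetime) -> None:
            # Take a new point-in-time copy on schedule...
            self.snapshots.append(SafeguardedSnapshot(volume, now, self.retention))
            # ...and allow removal only of copies whose retention has elapsed.
            self.snapshots = [s for s in self.snapshots if not s.expired(now)]

    policy = SnapshotPolicy()
    policy.run_cycle("prod-volume-01", datetime.utcnow())

The point of the model is the invariant it enforces: no copy can be mapped to a host or deleted early, so a clean copy is always available for recovery.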

Expediting Turbonomic integration for automated operations

IBM recently acquired Turbonomic, an Application Resource Management (ARM) and Network Performance Management (NPM) software provider. With this acquisition, IBM is the only company able to provide customers with AI-powered automation capabilities that span from AIOps to application and infrastructure observability.

IBM and Turbonomic plan to deliver rapid benefits for customers of IBM FlashSystem by improving application performance awareness and automation.

  • Turbonomic will collect information from IBM FlashSystem storage, including capacity, IOPS and latency for each storage array.
  • Turbonomic’s analysis engine combines FlashSystem data, virtualization data and application data to continuously automate non-disruptive actions and ensure applications get the storage performance they require, as illustrated in the conceptual sketch below.
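The following is a simplified, hypothetical Python sketch of that feedback loop: gather per-array metrics, compare them with application requirements and recommend non-disruptive placement actions. The metric names, thresholds and decision rule are illustrative only and do not represent Turbonomic’s actual analysis engine.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ArrayMetrics:
        name: str
        capacity_used_pct: float   # % of usable capacity consumed
        iops: float                # current IOPS load
        latency_ms: float          # observed response time

    @dataclass
    class AppDemand:
        app: str
        array: str
        latency_slo_ms: float      # what the application requires

    def recommend_actions(arrays: List[ArrayMetrics],
                          demands: List[AppDemand]) -> List[str]:
        by_name = {a.name: a for a in arrays}
        actions = []
        for d in demands:
            a = by_name[d.array]
            if a.latency_ms > d.latency_slo_ms or a.capacity_used_pct > 85:
                # Pick the least-loaded array as a move target.
                target = min(arrays, key=lambda x: (x.latency_ms, x.capacity_used_pct))
                if target.name != a.name:
                    actions.append(f"move {d.app} volumes: {a.name} -> {target.name}")
            else:
                actions.append(f"{d.app}: no action (performance within SLO)")
        return actions

    arrays = [ArrayMetrics("array-a", 88.0, 120_000, 1.8),
              ArrayMetrics("array-b", 41.0, 30_000, 0.4)]
    demands = [AppDemand("orders-db", "array-a", latency_slo_ms=1.0)]
    for action in recommend_actions(arrays, demands):
        print(action)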

This can eliminate unnecessary over-provisioning and safely increase density without sacrificing performance. On average, customers can increase density by 30% without any application performance impact.2

For environments using Instana®, Red Hat® OpenShift®, or any major hypervisor (such as VMware vSphere) with IBM FlashSystem, Turbonomic will observe the entire stack from application to array. This enables all operations teams to quickly visualize and automate corrective actions to mitigate performance risk caused by resource congestion, while safely increasing density.

Other key storage enhancements

We’re continuing to enhance the data and cyber resilience capabilities of our storage platform to help customers combat the threat of ransomware and other data-destructive events. Enhancements include:

  • IBM Spectrum Protect Plus offers a suite of enhancements specifically designed for Red Hat OpenShift and Kubernetes to support data protection for containerized environments. These include Red Hat certification, support for OpenShift workloads deployed on Azure and direct backup to S3 object storage.
  • IBM Spectrum Protect now supports replicating backup data to additional data protection servers. Additionally, it can now use object storage for long-term data retention to reduce the cost of backup.
  • IBM Spectrum Scale global data fabric gains a new high-performance S3 object interface, so cloud-native S3 applications can deliver faster results without the latency typically associated with object storage (a brief access sketch follows this list). Additionally, a new NVIDIA GPUDirect Storage (GDS) interface enables NVIDIA applications to run up to 100% faster with IBM Spectrum Scale.
  • IBM Elastic Storage® System 3200 now includes a 38TB IBM FlashCore® Module. This new FlashCore Module is double the size of the previous largest option, doubling the capacity of an ESS 3200 to 912TB in only 2 rack units.
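As a reference point for the S3 interface mentioned above, here is a minimal sketch of how a cloud-native application reads and writes through any S3-compatible endpoint using the standard boto3 client. The endpoint URL, bucket name and credentials are placeholders, not values from this announcement.

    import boto3

    # Point the standard S3 client at an S3-compatible endpoint
    # (placeholder URL and credentials for illustration only).
    s3 = boto3.client(
        "s3",
        endpoint_url="https://scale-s3.example.com",
        aws_access_key_id="ACCESS_KEY",
        aws_secret_access_key="SECRET_KEY",
    )

    BUCKET = "analytics-results"   # hypothetical bucket name

    # Write an object, then read it back - the same calls an application
    # would use against any S3-compatible object store.
    s3.put_object(Bucket=BUCKET, Key="run-001/summary.json",
                  Body=b'{"status": "complete"}')
    obj = s3.get_object(Bucket=BUCKET, Key="run-001/summary.json")
    print(obj["Body"].read().decode())

Because the interface is S3-compatible, existing applications written this way can target the new interface without code changes beyond the endpoint configuration.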

Next steps

To learn more about the improvements we’ve made to our IBM Storage portfolio, contact our storage experts by filling out this form or calling sales at +1 877-426-4264 (Priority code: Storage).

 
