
Did you know you can cut costs by better managing your data transfers?


Are you transferring data efficiently? Or is valuable computing power sitting idle in your clustered environment, waiting for files to be moved back and forth—and costing you money?

In many organizations, critical workloads take longer and consume more resources than necessary because the right data is not in the right place at the right time. Sometimes the same transfers occur over and over, wasting time, bandwidth and disk space.

With business users, customers and partners all accessing the same files inefficiently, application performance can suffer as data continually moves from storage resources to compute resources and back again. And there is often little visibility into, or control over, these transfers, making it difficult to prioritize specific jobs and projects.

So, what is the solution? The new IBM® Platform™ Data Manager for LSF® increases efficiency by enabling the intelligent scheduling of data within and between clusters, as well as to and from the cloud, minimizing wasted computing cycles, conserving disk space and ultimately reducing costs.

By allowing transfers to be controlled as jobs in IBM Platform LSF, Platform Data Manager gives you precise command over data movement by workload and project. Platform Data Manager conducts data transfers independently of cluster workloads so you can get the most out of your resources.

A smart, managed cache preserves disk space and bandwidth by eliminating the need to repeatedly move the same data back and forth. Transferred files can be automatically cached on the execution cluster and copies can be leveraged for all workloads that need access, with multiple users having the option of sharing the data if needed. Workloads running on execution clusters can also write intermediate data to the local cache for use by other jobs. You need less disk space, so you save on storage costs.
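The caching idea described above can be sketched in a few lines of Python. This is a deliberately simplified toy model, not Platform Data Manager's actual implementation: each requested file is transferred once, and later requests for the same file are served from the local cached copy.

```python
class ManagedCache:
    """Toy model of a transfer cache: fetch each remote file once,
    then serve later requests from the local copy."""

    def __init__(self):
        self._cache = {}    # source path -> locally cached content
        self.transfers = 0  # count of real transfers performed

    def _transfer(self, src):
        # Stand-in for an actual file transfer mechanism.
        self.transfers += 1
        return f"contents of {src}"

    def stage_in(self, src):
        # Reuse the cached copy if this file was already transferred.
        if src not in self._cache:
            self._cache[src] = self._transfer(src)
        return self._cache[src]

cache = ManagedCache()
cache.stage_in("/remote/input.dat")  # first request: real transfer
cache.stage_in("/remote/input.dat")  # second request: served from cache
print(cache.transfers)               # only one transfer occurred
```

Two jobs asking for the same input trigger a single transfer; that is the source of the bandwidth and disk-space savings the product claims.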

Organizations with multiple Platform LSF clusters benefit from data affinity, with Platform Data Manager deciding where to forward a job by giving preference to clusters that already have the needed files. Even within a single cluster, Platform Data Manager is able to schedule the transfer of files separately from a primary workload, allowing data to be queued up ahead of time for upcoming tasks rather than waiting for the resources to free up.
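Data affinity can be illustrated with a short Python sketch. Again, this is a simplified model rather than the real scheduler: given several clusters and the set of files a job needs, prefer the cluster whose cache already holds the most of them.

```python
def pick_cluster(clusters, needed_files):
    """Toy data-affinity rule: forward a job to the cluster whose
    cache already contains the most of the files the job needs."""
    needed = set(needed_files)
    return max(clusters, key=lambda name: len(needed & clusters[name]))

# Each cluster maps to the set of files already in its local cache
# (hypothetical cluster and file names for illustration).
clusters = {
    "cluster_a": {"ref.dat", "reads.dat"},
    "cluster_b": {"ref.dat"},
    "cluster_c": set(),
}

best = pick_cluster(clusters, ["ref.dat", "reads.dat", "params.cfg"])
print(best)  # cluster_a already holds two of the three needed files
```

A job needing three files lands on the cluster that caches two of them, so only one transfer remains to be scheduled.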

Platform Data Manager provides centralized visibility and management of data transfers, saving administrative time and costs, and allows for precise management by workload and project. Administrators can even choose the mechanism for moving data, enabling organizations to leverage the underlying file transfer infrastructures already in place.

Start cutting costs by better managing your data transfers with IBM Platform Data Manager for LSF, which offers intelligent staging, a managed cache and enhanced administrative options. Improve visibility into data transfers and minimize duplicate requests for the same data to ensure your IT resources are used to their fullest extent.

For more information on IBM Platform Data Manager for LSF, visit:
