IBM ProtecTIER Implementation and Best Practices Guide
Megan_Gilge
Read the latest draft of the second edition of the IBM ProtecTIER Implementation and Best Practices Guide, SG24-8025.
This guide provides best practices and expertise gained from the IBM ProtecTIER Field Technical Sales Support (FTSS/CSS) Group, development, and Quality Assurance teams to boost the performance and effectiveness of data deduplication across the ProtecTIER family of products and application platforms.
In addition to Virtual Tape Library (VTL) and OpenStorage (OST) support, ProtecTIER Version 3.3 provides options to manage backups with the File System Interface (FSI) for the Network File System (NFS) and the Common Internet File System (CIFS). This new FSI support provides the following benefits.
The ProtecTIER system can be integrated into an existing backup solution, and provides deduplication when saving files with backup solutions such as IBM Tivoli Storage Manager, Symantec NetBackup and Backup Exec, EMC NetWorker, and IBM i Backup, Recovery, and Media Services (BRMS). The FSI support helps facilitate rapid data restoration, and the IBM HyperFactor algorithm helps maximize disk usage by eliminating duplicate data from the incoming backup data streams.
The following figure shows a configuration with Tivoli Storage Manager backing up data to, and restoring data from, the self-contained disk storage in an IBM TS7620 ProtecTIER Deduplication Appliance Express. The ProtecTIER Manager graphical user interface (GUI) is used to monitor the TS7620 appliance, and with Version 3.3 it can also be used to upgrade the ProtecTIER system to new code levels. The ProtecTIER management IP interface and the ProtecTIER file system application interfaces are separated onto different network subnets for best performance.
ProtecTIER deduplication can be used to perform faster, more frequent backups and quickly restore backed up data to maintain business continuity. Using data deduplication, you can reduce the amount of space required to store data on disk. With deduplication, repeated instances of identical data are identified and stored in a single instance. This process saves storage capacity and bandwidth. Data deduplication can provide greater data reduction than previous technologies, such as Lempel-Ziv (LZ) compression and differencing, which is used for differential backups.
Data deduplication is performed while the data is being backed up to the ProtecTIER server (inline), in contrast to after the data is written to it (post-processing). The advantage of inline data deduplication is that the data is processed only once, with no additional processing after the backup window. Inline data deduplication also requires less disk storage because the native data is never stored before deduplication.
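The single-instance idea can be illustrated with a minimal Python sketch. This is not HyperFactor (which uses its own pattern-matching algorithm); it is a simple fixed-size-chunk, hash-based deduplicator, with names chosen for illustration. Each incoming chunk is stored only if its digest has not been seen before, which mirrors the inline approach: duplicates are discarded as the stream arrives, so they never consume disk space.

```python
import hashlib

def deduplicate(stream: bytes, chunk_size: int = 4096):
    """Split a backup stream into fixed-size chunks and store each
    unique chunk once, keyed by its SHA-256 digest.  Returns the
    chunk store (physical data) and the ordered digest list needed
    to rebuild the original stream."""
    store = {}    # digest -> chunk bytes (physical storage)
    recipe = []   # ordered digests (logical view of the stream)
    for i in range(0, len(stream), chunk_size):
        chunk = stream[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # write only if unseen (inline)
        recipe.append(digest)
    return store, recipe

# Two "backups" that share all of their data deduplicate to one copy:
backup = b"A" * 8192 + b"B" * 4096
store, recipe = deduplicate(backup + backup)
physical = sum(len(chunk) for chunk in store.values())
print(len(backup + backup), physical)  # 24576 nominal bytes, 8192 physical
```

Restoring is the reverse: concatenating `store[d]` for each digest in `recipe` reproduces the original stream exactly, which is why deduplication saves capacity without losing data.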
Depending on your configuration, this solution can provide the following benefits.
IBM ProtecTIER solutions allow you to take advantage of HyperFactor technology, which uses a pattern-matching algorithm that can reduce the amount of space required to store the backup environment by a factor of up to 25 times, based on evidence from existing implementations. The result of HyperFactor processing is expressed as a factoring ratio. In simple terms, the factoring ratio is the ratio of nominal data (the sum of all user backup data streams) to the physical storage occupied in the ProtecTIER repository. For example, a 10:1 ratio means that 10 times more nominal data is being managed than the physical space required to store it; equivalently, a 10:1 ratio means a 90% storage savings. The effectiveness of data deduplication depends on many variables, including the rate of data change, the number of backups, and the data retention period.
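The relationship between the factoring ratio and storage savings is simple arithmetic, sketched below in Python (the function names are illustrative, not part of any ProtecTIER tooling): savings = (1 - 1/ratio) x 100%.

```python
def factoring_ratio(nominal_tb: float, physical_tb: float) -> float:
    """Ratio of nominal (user) data to physical repository storage."""
    return nominal_tb / physical_tb

def storage_savings(ratio: float) -> float:
    """Percentage of physical storage saved at a given factoring ratio."""
    return (1 - 1 / ratio) * 100

# Example: 100 TB of nominal backup data occupying 10 TB physically.
ratio = factoring_ratio(100, 10)
print(f"{ratio:.0f}:1 ratio -> {storage_savings(ratio):.0f}% savings")
# prints "10:1 ratio -> 90% savings"
```

Note that savings grow non-linearly with the ratio: moving from 10:1 to 25:1 raises savings from 90% to 96%, a much smaller step than moving from 2:1 (50%) to 10:1 (90%).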
Read the IBM ProtecTIER Implementation and Best Practices Guide, SG24-8025, for more information.