Failback with a file migrated on the old primary and no change from the acting primary

This use case describes failing back when the file was migrated on the old primary after the latest RPO snapshot, while the acting primary made no change to the file.

Fileset: primary
State of the file in the latest RPO snapshot at the primary: resident
Action on the file at the primary after the latest RPO snapshot: migrate
Scenario: primary disaster, failover to the secondary, old primary comes back up, failback is started
State of the file on the old primary after failbackToPrimary --start: migrated, with no blocks
Current state of the file on the acting primary: resident, with no mtime change
State of the file on the old primary after applyUpdates: migrated, with no blocks
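The end-to-end sequence exercised below can be sketched as a dry-run script (filesystem and fileset names are taken from this run; `run` only echoes each command, so no live cluster is assumed):

```shell
# Dry-run sketch of the failback sequence in this test case.
# "run" echoes instead of executing, so the flow can be read without a cluster.
run() { echo "+ $*"; }

# 1. On the acting primary (old secondary), roll back to the last common RPO snapshot
run mmafmctl fs1 failoverToSecondary -j drHSM-DRS --restore

# 2. On the old primary, start failback; this also restores to the same RPO snapshot
run mmafmctl fs2 failbackToPrimary -j drHSM-DRP --start

# 3. Pull the changes made on the acting primary since that snapshot
run mmafmctl fs2 applyUpdates -j drHSM-DRP
```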

# Create a new file on the primary and flush it to the secondary

hs22n19; Thu Sep 24 08:08:47; tests# ls -lashi /gpfs/fs2/drHSM-DRP/file10

9439113 1.0M -rw-r--r-- 1 root root 1.0M Sep 24 07:28 /gpfs/fs2/drHSM-DRP/file10
0 ;) hs22n19; Thu Sep 24 08:09:40; tests# mmafmctl fs2 getstate -j drHSM-DRP

Fileset Name Fileset Target Cache State Gateway Node Queue Length Queue numExec 
------------ -------------- ------------- ------------ ------------ ------------- 
drHSM-DRP nfs://hs21n30/gpfs/fs1/drHSM-DRS Active hs22n19 0 103 


hs22n21; Thu Sep 24 08:08:23; scripts# dsmls /gpfs/fs2/drHSM-DRP/file10 | grep file10

1048576 1048576 1024 r file10
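The dsmls columns above are, left to right, file size, resident size, resident blocks, state flag, and name; the flags r (resident), p (premigrated), and m (migrated) are the usual HSM client states (an assumption from standard TSM space management documentation, not from this log). A small helper to pull the state flag out of such a line, assuming that column layout:

```shell
# Classify a dsmls output line by its state column.
# Assumed columns: <file size> <resident size> <resident blocks> <state> <name>
# with r = resident, p = premigrated, m = migrated.
state_of() { echo "$1" | awk '{print $4}'; }

state_of "1048576 1048576 1024 r file10"   # resident: full size on disk
state_of "1048576 0 0 m file10"            # migrated: stub with no blocks
```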

# Migrate the file after the RPO snapshot

hs22n21; Thu Sep 24 08:10:41; scripts# dsmmigrate /gpfs/fs2/drHSM-DRP/file10

ANS0102W Unable to open the message repository /opt/tivoli/tsm/client/ba/bin/EN_IN/dsmclientV3.cat. The American English repository will be used instead.
Tivoli Storage Manager
Command Line Space Management Client Interface
Client Version 7, Release 1, Level 3.0
Client date/time: 09/24/2015 08:11:36 EDT
(c) Copyright by IBM Corporation and other(s) 1990, 2015. All Rights Reserved.

0 ;) hs22n21; Thu Sep 24 08:11:41; scripts# dsmls /gpfs/fs2/drHSM-DRP/file10 | grep file10

1048576 0 0 m file10

# Run failoverToSecondary at the secondary

hs21n30; Thu Sep 24 08:12:15; fs1# mmafmctl fs1 failoverToSecondary -j drHSM-DRS --restore

mmafmctl: failoverToSecondary restoring from psnap failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03
[2015-09-24 08:12:20] Restoring fileset "drHSM-DRS" from snapshot "failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03" of filesystem "/dev/fs1"

[2015-09-24 08:12:25] Scanning inodes, phase 1 ...
[2015-09-24 08:12:28] 5777216 inodes have been scanned, 50% of total.
[2015-09-24 08:12:32] 11554432 inodes have been scanned, 100% of total.
[2015-09-24 08:12:32] Constructing operation list, phase 2 ...
[2015-09-24 08:12:32] 0 operations have been added to list.
[2015-09-24 08:12:32] Deleting the newly created files, phase 3 ...
[2015-09-24 08:12:33] Deleting the newly created hard links, phase 4 ...
[2015-09-24 08:12:34] Splitting clone files, phase 5 ...
[2015-09-24 08:12:34] Deleting the newly created clone files, phase 6 ...
[2015-09-24 08:12:35] Moving files, phase 7 ...
[2015-09-24 08:12:36] Reconstructing directory tree, phase 8 ...
[2015-09-24 08:12:36] Moving files back to their correct positions, phase 9 ...
[2015-09-24 08:12:37] Re-creating the deleted files, phase 10 ...
[2015-09-24 08:12:38] Re-creating the deleted clone parent files, phase 11 ...
[2015-09-24 08:12:38] Re-creating the deleted clone child files, phase 12 ...
[2015-09-24 08:12:39] Re-creating the deleted hard links, phase 13 ...
[2015-09-24 08:12:40] Restoring the deltas of changed files, phase 14 ...
[2015-09-24 08:12:41] Restoring the attributes of files, phase 15 ...
[2015-09-24 08:12:41] Restore completed successfully.
[2015-09-24 08:12:41] Clean up.
Primary Id (afmPrimaryId) 12646758592946767367-C0A874195583B621-19
Fileset drHSM-DRS changed.
hs21n30; Thu Sep 24 08:16:06; drHSM-DRS# dsmls file10 | grep file10

1048576 1048576 1024 r file10

# Start failbackToPrimary at the old primary

hs22n19; Thu Sep 24 08:11:24; tests# mmafmctl fs2 failbackToPrimary -j drHSM-DRP --start

Fileset drHSM-DRP changed.
mmafmctl: failbackToPrimary restoring from psnap failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03
[2015-09-24 08:16:33] Restoring fileset "drHSM-DRP" from snapshot "failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03" of filesystem "/dev/fs2"

[2015-09-24 08:16:36] Scanning inodes, phase 1 ...
[2015-09-24 08:16:38] 9447232 inodes have been scanned, 100% of total.
[2015-09-24 08:16:38] There's no data changes since the restoring snapshot, skipping restore.
[2015-09-24 08:16:38] Restore completed successfully.
[2015-09-24 08:16:38] Clean up.

# Run applyUpdates on the old primary

0 ;) hs22n19; Thu Sep 24 08:16:44; tests# mmafmctl fs2 applyUpdates -j drHSM-DRP

[2015-09-24 08:16:48] Getting the list of updates from the acting Primary...
[2015-09-24 08:17:01] Applying the 12 updates...
[2015-09-24 08:17:02] 12 updates have been applied, 100% of total.
mmafmctl: Creating the failback psnap locally. failback-psnap-rpo-C0A874465450C18C-46-15-09-24-08-16-47
Flushing dirty data for snapshot drHSM-DRP::failback-psnap-rpo-C0A874465450C18C-46-15-09-24-08-16-47...
Quiescing all file system operations.
Snapshot drHSM-DRP::failback-psnap-rpo-C0A874465450C18C-46-15-09-24-08-16-47 created with id 9019.
mmafmctl: Deleting the old failback psnap. failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03
Invalidating snapshot files in drHSM-DRP::failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03...
Deleting files in snapshot drHSM-DRP::failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03...
100.00 % complete on Thu Sep 24 08:17:05 2015 ( 10048 inodes with total 0 MB data processed)
Invalidating snapshot files in drHSM-DRP::failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03/F/...
Delete snapshot drHSM-DRP::failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03 complete, err = 0
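applyUpdates creates a fresh failback psnap locally and deletes the old one. The trailing fields of the psnap name appear to encode its creation time as YY-MM-DD-HH-MM-SS (an inference from the names in this run, e.g. ...-15-09-24-07-37-03 matching the 07:37 RPO snapshot); a quick parse:

```shell
# Extract the embedded timestamp from a failback psnap name.
# Assumption (from the names seen in this run): the last six
# dash-separated fields are YY-MM-DD-HH-MM-SS.
snap="failback-psnap-rpo-C0A874465450C18C-46-15-09-24-07-37-03"
ts=$(echo "$snap" | awk -F- '{print $(NF-5)"-"$(NF-4)"-"$(NF-3)" "$(NF-2)":"$(NF-1)":"$NF}')
echo "$ts"   # 15-09-24 07:37:03
```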

# State of the file on the old primary after applyUpdates

hs22n21; Thu Sep 24 08:11:51; scripts# dsmls /gpfs/fs2/drHSM-DRP/file10 | grep file10

1048576 0 0 m file10

# State of the file on the acting primary after applyUpdates

hs21n30; Thu Sep 24 08:16:26; drHSM-DRS# dsmls file10 | grep file10

1048576 1048576 1024 r file10