IBM Support

Enhanced Customer Data Repository (ECuRep) - z/OS Help


Help page for ECuRep transfer of z/OS debug data

 


APAR supporting the new HTTPS upload to www.ecurep.ibm.com

Follow the recommendations here: OA66961: PDUU SUPPORT OF ECUREP.IBM.COM

IBM PDUU encryption support

Always check the "Send data – Encryption" page for the required root certificate and the supported TLS protocols and cipher suites to use in the client's z/OS setup.

IBM PDUU authentication

To upload case data to ECuRep using IBM PDUU (FTPS or HTTPS), you need to generate an IBM Support File Transfer ID and password. This ID, linked to your IBM ID, remains active until revoked. The password is displayed only once during creation, so keep it safe. If you lose the password, you'll need to delete the ID and create a new one.

PDUU with HTTPS

ECuRep recommends running PDUU with the HTTPS protocol, configuring the encryption settings in the PDUU configuration itself rather than securing the connection with an AT-TLS policy.

If you still prefer to use AT-TLS rules instead, note that SNI must be enabled manually for that connection.
Hints can be found in the technote "AMAPDUPL using HTTPS to connect to www.ecurep.ibm.com with an AT-TLS policy to secure the connection, fails with "checkServerCert: Certificate not valid for DNS name"".
 
If you need additional support, open an IBM MySupport case for component "z/OS->Service Aids->AMATERSE/AMASPZAP/PDUU".
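For orientation, a PDUU job using HTTPS could look roughly like the sketch below. This is not an official sample: the data set names, case identifier, and credentials are placeholders, and the exact SYSIN keywords available for HTTPS depend on your PDUU level, so verify them against APAR OA66961 and the PDUU chapter of the MVS Diagnosis: Tools and Service Aids manual.

```jcl
//PDUUHTTP JOB (ACCT),'ECUREP UPLOAD',CLASS=A,MSGCLASS=H
//* Hypothetical sketch - all names and values are placeholders.
//* Verify the HTTPS-related keywords against APAR OA66961 and
//* the PDUU chapter of MVS Diagnosis: Tools and Service Aids.
//UPLOAD  EXEC PGM=AMAPDUPL
//SYSUT1   DD DISP=SHR,DSN=YOURHLQ.SVCDUMP.DATA
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
 TARGET_SYS=www.ecurep.ibm.com
 TARGET_DSN=TS001234567.mvs.svcdump
 USERID=your-file-transfer-id
 PASSWORD=your-file-transfer-password
 WORK_DSN_PREFIX=YOURHLQ.PDUU
 WORK_DSN_SIZE=500
 CC_FTP=04
 HTTPS_KEYRING=*AUTH*/*
/*
```

HTTPS_KEYRING=*AUTH*/* selects the CERTAUTH virtual keyring; replace it with your own keyring name if you maintain one for the PDUU job.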

IBM PDUU updates

Check for PDUU updates at IBM Support for PDUU.

IBM PDUU and compression

Be aware that IBM PDUU compresses data internally before transferring it to IBM. Additional file compression tools such as AMATERSE are neither necessary nor recommended when using IBM PDUU.

IBM PDUU parallel sessions

Start with three or four parallel FTP sessions. Too many parallel FTP sessions can saturate the network link.
For further hints and tips, consult the MVS Diagnosis: Tools and Service Aids manual (PDF, 3.38 MB), chapter 18.
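In PDUU, the number of parallel sessions is controlled by the CC_FTP keyword in the SYSIN control statements. Following the advice above, a cautious starting value would be (illustrative snippet):

```jcl
//SYSIN    DD *
 CC_FTP=03
/*
```

Increase the value only if the upload is slow and the network link has headroom.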

IBM PDUU amount of work data sets

Use medium-sized work data sets.
If the work data sets are small in relation to the input data set, you can end up with too many files on the IBM FTP sites.
For example, if you are sending a 100 GB z/OS stand-alone dump and make the work data set size 1 MB, you create 100,000 files on the IBM FTP site. This exceeds the IBM limit of 999 files.
Such a setting also adds considerable delay, because an FTP session is started and stopped for each file.
For further hints and tips, consult the MVS Diagnosis: Tools and Service Aids manual (PDF, 3.38 MB), chapter 18.

Additionally, observe the IBM limit of 999 directories, which caps the number of PDUU (mtftp) uploads per case (ticket).
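As a rule of thumb, the number of work files PDUU creates is roughly the input size divided by WORK_DSN_SIZE (specified in megabytes; verify the unit in the PDUU documentation), and that count must stay below the 999-file limit. Two illustrative calculations, using the 100 GB example above:

```text
100 GB input = ~100,000 MB
WORK_DSN_SIZE=500  ->  100,000 / 500 = ~200 work files   (OK, well under 999)
WORK_DSN_SIZE=1    ->  100,000 / 1   = ~100,000 files    (far over the 999 limit)
```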

IBM PDUU encryption password

Because IBM PDUU uploads data to ECuRep over secure FTPS or HTTPS targets only, encryption in transit is always enforced, and it is strongly recommended not to use additional password/cipher data encryption. Providing the password separately to IBM is a manual, error-prone process, both for you and for IBM Support when processing the data.
For further hints and tips, consult the MVS Diagnosis: Tools and Service Aids manual (PDF, 3.38 MB), chapter 18.

More hints and tips for HTTPS upload using IBM PDUU

  • Correct certificates must be in the keyring
    Refer to our Send Data - HTTPS page for the needed certificates and the corresponding download links.

    You need to download one or more certificates and add them to your security product database (for instance, RACF).
    Normally, you would then connect them to the keyring used by the PDUU job.
    Alternatively, you can use the CERTAUTH virtual keyring (that is, HTTPS_KEYRING=*AUTH*/*).
    With this setting, you only need to get the certificate into RACF and make sure it is TRUSTED.
    If the DIGTCERT and DIGTRING classes are RACLISTed on your system, refresh the classes with the command:
    SETROPTS REFRESH RACLIST(DIGTRING,DIGTCERT)
     
  • TLS 1.3 or TLS 1.2 and current ciphers must be configured
    If you get a '402 No SSL cipher specification' error, ensure that a current TLS version and current ciphers are used.

    AMAPDUPL using HTTPS ends up using the default CIPHERS for the secured connection; the default CIPHERS are 3538392F3233 (six 2-character cipher numbers).
    Configurable ciphers are listed in 'z/OS Cryptographic Services System SSL Programming'.
    Because the default ciphers are considered weak, ensure that you use a cipher suite considered strong.
    A list of supported cipher suites for different targets and protocols is available in 'ECuRep encryption information'.
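As an illustration only (the certificate data set, label, user ID, and ring name below are placeholders; adapt them to your security product and naming standards), adding a downloaded CA certificate to RACF, marking it trusted, and connecting it to the keyring used by the PDUU job could look like:

```text
RACDCERT CERTAUTH ADD('YOURHLQ.ECUREP.CACERT') WITHLABEL('ECuRep CA') TRUST
RACDCERT ID(PDUUUSER) CONNECT(CERTAUTH LABEL('ECuRep CA') RING(PDUURING))
SETROPTS RACLIST(DIGTRING,DIGTCERT) REFRESH
```

If you use the CERTAUTH virtual keyring (HTTPS_KEYRING=*AUTH*/*) instead, the CONNECT step is not needed; adding and trusting the certificate is sufficient.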
     

IBM z/OS special Data handling

Notes for sending z/OS-related special dataset types:  
 
  1. SMF data
    To prevent problems with SMF data, which is mostly in SPANNED record format where the logical record length may be larger than the dataset’s blocksize, adhere to the following procedure:  

    • Because the amount of data is usually large, select ONLY the NEEDED and USEFUL
    record types when creating the unloaded SMF data.
     
    • Use the TSO TRANSMIT command to convert your unloaded SMF data into the TSO XMIT format, for example:
    TSO XMIT yournode.youruser DA('yoursmfdsn') OUTDATASET('yoursmfdsn.XMI')
     
    • Now TERSE the output dataset and name it: 'yoursmfdsn.XMI.TRS'
     
    • Transfer this tersed file to the ECuRep upload site and ensure that you use the correct naming conventions.
     
  2. Other data with special record format (for example, a Fault Analyzer history file)
    Handle this data exactly as described under "SMF data".
     
  3. GDGs (Generation Data Groups)
    When you need to transfer one or more GDG-format data sets:
     REMOVE or RENAME the GDG-typical qualifier GnnnnVnn in the data set name, so that the data set name can no longer be identified as a GDG.

    Reason:
    When a tersed GDG data set is restored to its original format, MVS would encounter the GDG-typical qualifier and try to allocate a GDG data set as well, which fails because the GDG base is not available on the ECuRep MVS system.
    This would inhibit automatic processing of that data set.
     
  4. Using DFSMSdss DUMP (ADRDSSU)
    When using DFSMSdss (ADRDSSU) to DUMP one or more data sets, be sure to dump your data sets LOGICALLY; do NOT use a PHYSICAL dump.
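The TERSE step from item 1 and a logical DFSMSdss dump from item 4 might be sketched as below. This is an illustrative job only: the data set names, space values, and INCLUDE filter are placeholders.

```jcl
//* Item 1: terse the XMIT output before upload
//TERSE   EXEC PGM=AMATERSE,PARM='PACK'
//SYSUT1   DD DISP=SHR,DSN=yoursmfdsn.XMI
//SYSUT2   DD DSN=yoursmfdsn.XMI.TRS,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSPRINT DD SYSOUT=*
//*
//* Item 4: LOGICAL dump (DUMP DATASET), not a physical dump
//DSSDUMP EXEC PGM=ADRDSSU
//SYSPRINT DD SYSOUT=*
//OUTDD    DD DSN=YOURHLQ.DSS.DUMP,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
  DUMP DATASET(INCLUDE(YOURHLQ.DATA.**)) -
       OUTDDNAME(OUTDD)
/*
```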


Document Information

Modified date:
24 November 2025

UID

ibm10739575