Limitations and known issues for DataStage

The following known issues and limitations apply to DataStage.

Known issues

Known issues for general areas:

5 or more PXRuntime instances cannot be created without increasing operator memory
To create 5 or more PXRuntime instances, you must update the cluster service version (CSV) to increase the memory limit of the operator pod. Workaround: Get the name of the DataStage CSV in the operator namespace:

oc -n ${PROJECT_CPD_OPS} get csv | grep ibm-cpd-datastage-operator

Then patch the CSV to increase the operator pod memory limit to 2Gi:

oc -n ${PROJECT_CPD_OPS} patch csv <datastage-csv-name> --type='json' -p='[{"op": "replace", "path": "/spec/install/spec/deployments/0/spec/template/spec/containers/0/resources/limits/memory", "value": "2Gi"}]'
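
To confirm the new limit, you can read it back from the patched CSV (a sketch that assumes the same CSV name and patch path as above):

oc -n ${PROJECT_CPD_OPS} get csv <datastage-csv-name> -o jsonpath='{.spec.install.spec.deployments[0].spec.template.spec.containers[0].resources.limits.memory}'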

Applies to: 4.6.5

Custom resource is in a Failed state after operator upgrade

The DataStage custom resource might be in a Failed state after the operator is upgraded from 4.0.x to 4.6.0, before the version in the custom resource is updated. This does not affect the actual upgrade. Workaround: Continue with the upgrade by running the cpd-cli manage apply-cr command to upgrade DataStage to 4.6.0.
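
For example, a minimal sketch of the upgrade command; the component name (datastage_ent) and the namespace variable are assumptions that depend on your deployment:

cpd-cli manage apply-cr --components=datastage_ent --release=4.6.0 --cpd_instance_ns=${PROJECT_CPD_INSTANCE} --license_acceptance=true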

Applies to: 4.6.0 and 4.6.1

Fixed in: 4.6.2

Match designer test results are empty when an incorrect input schema is provided

When you design a match specification, if the input schema does not match the actual schema of the sample data, the test results are empty but no warning is issued.

Applies to: 4.6.0, 4.6.1, and 4.6.2

Fixed in: 4.6.3

Migrating Azure storage connector: Property read_mode for stage has an invalid value

Migrated flows that use the Azure storage connector fail to compile if the usage property read_mode is set to Download, List containers/fileshares, or List files. The selected read mode is unavailable in the migrated flow.

Applies to: 4.6.0 and later

Non-ASCII characters are not recognized in several Watson Pipeline fields
In the following fields in Watson Pipeline, non-ASCII characters cannot be used:
  • Pipeline/job parameter name
  • User variable name
  • Environment variable name
  • Output variable name
  • Email address

Applies to: 4.6.0 and later

Inaccessible runtime instances can be selected
When you configure the environment, you might be allowed to select a runtime instance that you do not have permission to access.

Applies to: 4.6.0

Fixed in: 4.6.1

Known issues for stages:

The Change Apply stage adds a case-sensitive option to non-string fields
The Change Apply stage adds a case-sensitive option to non-string fields, causing unrecognized format and unrecognized parameter warnings. Workaround: Double-click the Change Apply stage to open its details card. Select the input link that has a non-string key column with the case-sensitive option (ci-cs) in its pipeline JSON file. Click Save, then compile and run the flow.

Applies to: 4.6.0-4.6.3

Fixed in: 4.6.4

Adding a stage immediately prior to a Hierarchical stage breaks it
When a stage is added immediately before a Hierarchical stage, the input link is not renamed automatically. You must rename the input link manually for the stage to work.

Applies to: 4.6.0

Fixed in: 4.6.1

Known issues for connectors:

Some connections do not support flow connections

The following connections do not support the Flow connection option:

  • Apache HDFS. Applies to: 4.6.1 - 4.6.5. Fixed in: 4.6.6.
  • Exasol. Applies to: 4.6.5. Fixed in: 4.6.6.
  • Generic JDBC. Applies to: 4.6.1 - 4.6.2. Fixed in: 4.6.3.
  • IBM® Cloud Object Storage. Applies to: 4.6.1. Fixed in: 4.6.2.
  • IBM Cognos® Analytics. Applies to: 4.6.1 and later.
  • IBM Data Virtualization Manager for z/OS®. Applies to: 4.6.1 and later.
  • IBM Db2® Event Store. Applies to: 4.6.1 and later.
  • IBM Db2 for i. Applies to: 4.6.1. Fixed in: 4.6.2.
  • IBM Db2 for z/OS. Applies to: 4.6.1 - 4.6.2. Fixed in: 4.6.3.
  • IBM Db2 on Cloud. Applies to: 4.6.1 and later.
  • IBM Match 360. Applies to: 4.6.1 and later.
  • IBM Watson® Query. Applies to: 4.6.1 and later.
  • Microsoft Azure Data Lake Store. Applies to: 4.6.1 and later.
  • SAP Bulk Extract. Applies to: 4.6.1 - 4.6.2. Fixed in: 4.6.3.
  • SAP Delta Extract. Applies to: 4.6.1 - 4.6.2. Fixed in: 4.6.3.
  • SAP HANA. Applies to: 4.6.1 - 4.6.2. Fixed in: 4.6.3.
  • SAP IDoc. Applies to: 4.6.1 - 4.6.3. Fixed in: 4.6.4.
  • SAP OData. Applies to: 4.6.1 - 4.6.2. Fixed in: 4.6.3.
  • Storage volume. Applies to: 4.6.1 and later.

Jobs with an Apache Kafka connection fail with "Received fatal alert: bad_certificate" message

A job might fail if it uses data from an Apache Kafka connection that is configured to use SSL with a DSA certificate. Certificate-key pairs that are generated with the DSA algorithm are no longer supported with the TLS v1.3 protocol, and both the client and server must use the same TLS version. Use one of the following workarounds to fix the problem.

Workaround 1: Add the CC_JVM_OPTIONS environment variable to the job to force the TLS v1.2 protocol:

CC_JVM_OPTIONS=-Djdk.tls.client.protocols=TLSv1.2 -Dhttps.protocols=TLSv1.2

Workaround 2: Regenerate the TLS certificate with the RSA algorithm: -keyalg RSA
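
For example, a minimal sketch using the Java keytool; the alias, keystore name, and validity period are placeholders:

keytool -genkeypair -alias kafka-server -keyalg RSA -keysize 2048 -validity 365 -keystore kafka.server.keystore.jks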

Applies to: 4.6.0 - 4.6.2

Fixed in: 4.6.3

Salesforce.com (optimized) connection does not support vaults

The Input method Use secrets from a vault is not supported for the Salesforce.com (optimized) connection.

Applies to: 4.6.0 and later

FTP connector does not support multiple URIs

Multiple user information value sets are not supported on the FTP connector. When you migrate jobs that contain an FTP Enterprise stage with multiple URI/username/password sets, the FTP connector accepts only the first value set; all other value sets are dropped. You must design or edit the job with one FTP connector for each URI, and either funnel the links (if used as a source) or copy the data (if used as a target).

Applies to: 4.6.0 - 4.6.3

Fixed in: 4.6.4

FTP connector does not support multiple input/output links

Multiple input/output links are not supported on the FTP connector. You can import jobs that contain the FTP Plugin stage with multiple input/output links, but they will fail to compile. You must design or edit the jobs with one FTP connector for each input or output link, and either funnel the links (if used as a source) or copy the data (if used as a target).

Applies to: 4.6.0 - 4.6.3

Fixed in: 4.6.4

SCRAM-SHA-256 authentication method is not supported for the ODBC MongoDB data source

If you create an ODBC connection for a MongoDB data source that uses the SCRAM-SHA-256 authentication method, the job fails.

Workaround: Change the server-side authentication to SCRAM-SHA-1. Alternatively, use the MongoDB connection or the Generic JDBC connection.
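
For example, a minimal sketch using mongosh to restrict an existing user to SCRAM-SHA-1; the user name and connection string are placeholders, and omitting the password works only when the mechanisms list is a subset of the user's current mechanisms:

mongosh "mongodb://admin@localhost:27017/admin" --eval 'db.updateUser("dsuser", { mechanisms: ["SCRAM-SHA-1"] })'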

Applies to: 4.6.0 and later

Limitations

Limitations for general areas:

Reading FIFOs on persistent volumes across pods causes stages to hang
Reading FIFOs on persistent volumes across pods is not supported and causes the stage reading the FIFO to hang. Workaround: Constrain the job to a single pod by setting APT_WLM_COMPUTE_PODS=1.
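
For example, set the environment variable for the job:

APT_WLM_COMPUTE_PODS=1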

Applies to: 4.6.6 and later

Unassigned environment variables and parameter sets are not migrated
Environment variables and parameter sets that have not been assigned a value are skipped during export. Migrated jobs contain only the environment variables and parameter sets that have been assigned a value for that job.

Applies to: 4.6.2 and later

No more than 120 nodes can be used in an orchestration flow
An orchestration flow that contains more than 120 nodes will not work.

Applies to: 4.6.0 and later

Limitations for connectors:

Error parameterizing the credential field for a flow connection in IBM Cloud Object Storage
When the Authentication method property is set to Service credentials (full JSON snippet), do not parameterize the Service credentials field. If a parameter is provided for that field, the flow will not compile.

Applies to: 4.6.0 and later

Previewing data and using the asset browser to browse metadata do not work for these connections:
  • Apache Cassandra (optimized)
  • Apache HBase
  • IBM MQ
"Test connection" does not work for these connections:
  • Apache Cassandra (optimized)
  • Apache HBase