Known issues and limitations for Watson Pipelines

These known issues and limitations apply to Watson Pipelines.

Known issues

Run again fails from the Run tracker UI

If you run a pipeline, either from a trial run or a job, and you choose Run again from the Run tracker page, value sets and parameter set default values are not propagated properly and the run can produce unexpected results. To avoid or resolve this issue, start a new trial run or create and run a new job from the Run menu.

Applies to: 4.6.0 - 4.6.3
Fixed in: 4.6.4

Adding parameters can break a pipeline job

If you create a job for a pipeline with no parameters, and then add parameters to that pipeline afterwards, any attempt to edit the job results in a 400 error. To avoid this error, use at least one parameter when you create the job.

Applies to: 4.6.0 - 4.6.3
Fixed in: 4.6.4

Nesting loops more than 2 levels can result in pipeline error

Nesting loops more than 2 levels deep can result in an error when you run the pipeline, such as "Error retrieving the run". Reviewing the logs can show an error such as "text in text not resolved: neither pipeline_input nor node_output". If you are looping with output from a Bash script, the log might list an error like "PipelineLoop can't be run; it has an invalid spec: non-existent variable in $(params.run-bash-script-standard-output)". To resolve the problem, do not nest loops more than 2 levels deep.
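
You could check for over-nesting before running a pipeline by walking its node tree and measuring loop depth. The following sketch assumes a hypothetical dictionary representation of the pipeline; it is illustrative only and not the actual Pipelines schema:

```python
def max_loop_depth(node):
    """Return the deepest loop nesting in a hypothetical pipeline node tree."""
    depth = 1 if node.get("type") == "loop" else 0
    children = node.get("nodes", [])
    return depth + (max(max_loop_depth(c) for c in children) if children else 0)

# A loop nested inside another loop: depth 2 is the supported maximum.
pipeline = {
    "type": "loop",
    "nodes": [{"type": "loop", "nodes": [{"type": "bash-script", "nodes": []}]}],
}
```

A depth of 3 or more indicates a pipeline that risks the errors described above.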

Applies to: 4.6.0 and higher

Asset browser does not always show the total count for an asset type

When you select an asset from the asset browser, such as when choosing a source for a Copy node, the browser shows the total number of available assets for some asset types, but not for notebooks. This is a current limitation.

Cannot delete pipeline versions

Currently, you cannot delete saved versions of pipelines that you no longer need.

Deleting an AutoAI experiment fails under some conditions

Using a Delete AutoAI experiment node to delete an AutoAI experiment that was created from the Projects UI does not delete the AutoAI asset. However, the rest of the flow can complete successfully.

Cache appears enabled but is not enabled

If the Copy assets Pipelines node's Copy mode is set to Overwrite, the cache is displayed as enabled but remains disabled.

Applies to: 4.6.4 and higher

Cannot remove the schedule for a pipeline job

If you edit a scheduled Pipelines job to remove the schedule, runs continue to execute according to the previously defined schedule, even though the user interface indicates that the schedule is removed.

To resolve this issue, delete and re-create the job. Deleting the job removes the associated kubecron configuration and stops the scheduled runs from executing.

Applies to: 4.6.x and higher

Variable size limit

User variables and parameter values such as RunJob stage parameters cannot exceed 2 KB.
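
A quick pre-check can catch oversized values before a run fails. This is a minimal sketch: the 2 KB figure comes from the limit above, while the helper name and the string-based size measurement are assumptions for illustration:

```python
MAX_VAR_BYTES = 2048  # 2 KB limit on user variables and stage parameter values

def check_param_size(name, value):
    """Raise ValueError if a parameter value, serialized as UTF-8, exceeds 2 KB."""
    size = len(str(value).encode("utf-8"))
    if size > MAX_VAR_BYTES:
        raise ValueError(f"Parameter '{name}' is {size} bytes; limit is {MAX_VAR_BYTES}")
    return size
```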

To work around this issue, see Configuring the size limit for a user variable.

Applies to: 4.6.4 and higher

Limitations

These limitations apply to Watson Pipelines.

Single pipeline limits

These limitations apply to a single pipeline, regardless of configuration.

  • Any single pipeline cannot contain more than 120 standard nodes
  • Any pipeline with a loop cannot contain more than 600 nodes across all iterations (for example, 60 iterations with 10 nodes each)

Limitations by configuration size

Small configuration

A SMALL configuration supports 600 standard nodes (across all active pipelines) or 300 nodes run in a loop. For example:

  • 30 standard pipelines with 20 nodes run in parallel = 600 standard nodes
  • A single pipeline containing a loop with 30 iterations and 10 nodes in each iteration = 300 nodes in a loop

Medium configuration

A MEDIUM configuration supports 1200 standard nodes (across all active pipelines) or 600 nodes run in a loop. For example:

  • 30 standard pipelines with 40 nodes run in parallel = 1200 standard nodes
  • A single pipeline containing a loop with 60 iterations and 10 nodes in each iteration = 600 nodes in a loop

Large configuration

A LARGE configuration supports 4800 standard nodes (across all active pipelines) or 2400 nodes run in a loop. For example:

  • 80 standard pipelines with 60 nodes run in parallel = 4800 standard nodes
  • 4 pipelines containing a loop with 60 iterations and 10 nodes in each iteration = 2400 nodes in a loop
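
The arithmetic in the examples above can be captured in a small capacity check. The limits table encodes the figures from this section; the function name and interface are illustrative, not part of any Pipelines API:

```python
# Node capacity limits per Watson Pipelines configuration size (from the section above)
LIMITS = {
    "small": {"standard": 600, "loop": 300},
    "medium": {"standard": 1200, "loop": 600},
    "large": {"standard": 4800, "loop": 2400},
}

def fits(config, standard_nodes=0, loop_nodes=0):
    """Return True if the workload fits within the configuration's node limits."""
    limit = LIMITS[config]
    return standard_nodes <= limit["standard"] and loop_nodes <= limit["loop"]
```

For example, 30 pipelines of 20 nodes each (600 standard nodes) fit a SMALL configuration, while a loop totaling 601 nodes does not.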

Input and output size limits

Input and output values, which include pipeline parameters, user variables, and generic node inputs and outputs, cannot exceed 10 KB of data. Environment variables cannot exceed 128 KB.

Batch input limited to data assets

Currently, input for batch deployment jobs is limited to data assets. This means that deployments that require JSON input or multiple files as input are not supported. For example, SPSS models and Decision Optimization solutions that require multiple files as input are not supported.

Node caches cannot be deleted

If you enable node caching, cache data cannot be deleted. However, changing a node's parameters or other configuration invalidates the previous cache, and Pipelines does not use the stale cache data.
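
Conceptually, a node's cache entry is keyed by its configuration, so any parameter change produces a new key and the old entry is simply never looked up again. This simplified sketch illustrates the idea; the actual keying is internal to Pipelines:

```python
import hashlib
import json

def cache_key(node_params):
    """Derive a deterministic cache key from a node's configuration dictionary."""
    canonical = json.dumps(node_params, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()
```

Two identical configurations always hash to the same key, while changing any parameter yields a different key, which is why editing a node effectively bypasses its old cache.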

Applies to: 4.6.3 or earlier

Parent topic: Limitations and known issues