environment list

List all environments in a specified project or space. Environments can be filtered by type, and the results include the default environments, which are available in every project.

Syntax

cpd-cli environment list \
[--context=<catalog-project-or-space-id>] \
[--cpd-config=<cpd-config-location>] \
[--cpd-scope=<cpd-scope>] \
[--exclude-languages=<comma-separated-kernel-name-list>] \
[--exclude-types=<comma-separated-environment-type-list>] \
[--jmes-query=<jmespath-query>] \
[--output=json|yaml|table] \
[--output-file=<output-file-location>] \
--profile=<cpd-configuration-profile-name> \
[--project-id=<cpd-project-id>] \
[--quiet] \
[--raw-output=true|false] \
[--space-id=<space-identifier>] \
[--spark-versions=<comma-separated-list-of-spark-versions>] \
[--types=<comma-separated-list-of-environment-types>] \
[--verbose]
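
For example, the following command is a minimal sketch that lists only the notebook environments in a project in JSON format. The profile name and project ID are placeholders, and the notebook type value is an assumed example; substitute the environment types that exist in your deployment.

cpd-cli environment list \
--output=json \
--profile=<cpd-configuration-profile-name> \
--project-id=<cpd-project-id> \
--types=notebook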

Arguments

The environment list command has no arguments.

Options

Option Description
--context Specify the configuration context name.
Status
Optional.
Syntax
--context=<catalog-project-or-space-id>
Default value
Not applicable.
Valid values
A valid configuration context name.
--cpd-config The Cloud Pak for Data configuration location (for example, $HOME/.cpd-cli/config).
Status
Required.
Syntax
--cpd-config=<cpd-config-location>
Default value
No default.
Valid values
A valid Cloud Pak for Data configuration location.
--cpd-scope The Cloud Pak for Data space, project, or catalog scope (for example, cpd://default-context/spaces/7bccdda4-9752-4f37-868e-891de6c48135).
Status
Optional.
Syntax
--cpd-scope=<cpd-scope>
Default value
No default.
Valid values
A valid Cloud Pak for Data space, project, or catalog scope.
--exclude-languages Specify a comma-separated list of kernel names to exclude.
Status
Optional.
Syntax
--exclude-languages=<comma-separated-kernel-name-list>
Default value
Empty list.
Valid values
A valid comma-separated list of kernel names.
--exclude-types Specify a comma-separated list of environment types to exclude.
Status
Optional.
Syntax
--exclude-types=<comma-separated-environment-type-list>
Default value
Empty list.
Valid values
A valid comma-separated list of environment types.

--help | -h Display command help.
Status
Optional.
Syntax
--help
Default value
No default.
Valid values
Not applicable.
--jmes-query The JMESPath query that filters the command output.
Status
Optional.
Syntax
--jmes-query=<jmespath-query>
Default value
No default.
Valid values
A valid JMESPath query.
--output Specify an output format. Valid formats include json, yaml, or table (the default format).
Status
Optional.
Syntax
--output=json|yaml|table
Default value
table
Valid values
json|yaml|table
--output-file Specify a file path where all output is redirected.
Status
Optional.
Syntax
--output-file=<output-file-location>
Default value
No default.
Valid values
A valid output file path location.
--profile The profile name from the Cloud Pak for Data configuration.
Status
Required.
Syntax
--profile=<cpd-configuration-profile-name>
Default value
default
Valid values
Any valid profile name from the Cloud Pak for Data configuration.
--project-id Specify a Cloud Pak for Data project. Specify either --project-id or --space-id.
Status
Optional.
Syntax
--project-id=<cpd-project-id>
Default value
No default.
Valid values
A valid project identifier.
--quiet Suppress verbose messages.
Status
Optional.
Syntax
--quiet
Default value
No default.
Valid values
Not applicable.
--raw-output When set to true, single values in JSON output mode are not surrounded by quotes.
Status
Optional.
Syntax
--raw-output=true|false
Default value
false
Valid values
false
Single values in JSON output mode are surrounded by quotes.
true
Single values in JSON output mode are not surrounded by quotes.
--space-id Specify a space identifier. Specify either --project-id or --space-id.
Status
Optional.
Syntax
--space-id=<space-identifier>
Default value
No default.
Valid values
A valid space identifier.
--spark-versions Specify a comma-separated list of Spark versions. When a list is provided, the response contains only the default Spark environments for the specified versions.
Status
Optional.
Syntax
--spark-versions=<comma-separated-list-of-spark-versions>
Default value
All Spark versions.
Valid values
A valid comma-separated list of Spark versions.
--types Specify a comma-separated list of environment types.
Status
Optional.
Syntax
--types=<comma-separated-list-of-environment-types>
Default value
All environment types.
Valid values
A valid comma-separated list of environment types.
--verbose Logs include more detailed messages.
Status
Optional.
Syntax
--verbose
Default value
No default.
Valid values
Not applicable.

Examples

Note: The following examples use environment variables. Use a script to create environment variables with the correct values for your environment. You can optionally add the listed environment variables to the install variables script. For more information, see Best practice: Setting up install variables.
Specify the environment variables in a script file.
environment_name="Default Python 3.7"
query_string="(resources[?entity.environment.display_name == '${environment_name}'].metadata.asset_id)[0]"
List the project environments and use the JMESPath query to return the asset ID of the environment with the matching display name.
cpd-cli environment list \
--jmes-query="${query_string}" \
--output=json \
--project-id=${PROJECT_CPD_INSTANCE} \
--raw-output
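Because --raw-output is specified, the returned asset ID is printed without surrounding quotes and can be captured directly in a shell variable. The following continuation is a sketch that assumes the script variables from the previous step are already set; the ENV_ASSET_ID variable name is illustrative only.

ENV_ASSET_ID=$(cpd-cli environment list \
--jmes-query="${query_string}" \
--output=json \
--project-id=${PROJECT_CPD_INSTANCE} \
--raw-output)
echo "The environment asset ID is ${ENV_ASSET_ID}"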