environment list

List the environments in a specified project. The results can be filtered by type and include the default environments, which are available in every project.

Syntax

cpd-cli environment list \
[--context=<catalog-project-or-space-id>] \
[--cpd-config=<cpd-config-location>] \
[--cpd-scope=<cpd-scope>] \
[--exclude-languages=<comma-separated-kernel-name-list>] \
[--exclude-types=<comma-separated-environment-type-list>] \
[--jmes-query=<jmespath-query>] \
[--output=json|yaml|table] \
[--output-file=<output-file-location>] \
--profile=<cpd-profile-name> \
[--project-id=<cpd-project-id>] \
[--quiet] \
[--raw-output=true|false] \
[--space-id=<space-identifier>] \
[--spark-versions=<comma-separated-list-of-spark-versions>] \
[--types=<comma-separated-list-of-environment-types>] \
[--verbose]

Arguments

The environment list command has no arguments.

Options

Table 1: Command options
Option Description
--context Specify the configuration context name.
Status
Optional.
Syntax
--context=<catalog-project-or-space-id>
Default value
Not applicable.
Valid values
A valid configuration context name.
--cpd-config The Cloud Pak for Data configuration location. For example, $HOME/.cpd-cli/config.
Status
Required.
Syntax
--cpd-config=<cpd-config-location>
Default value
$HOME/.cpd-cli/config
Valid values
A valid Cloud Pak for Data configuration location.
--cpd-scope The Cloud Pak for Data space, project, or catalog scope. For example, cpd://default-context/spaces/7bccdda4-9752-4f37-868e-891de6c48135.
Status
Optional.
Syntax
--cpd-scope=<cpd-scope>
Default value
No default.
Valid values
A valid Cloud Pak for Data space, project, or catalog scope.
--exclude-languages Specify a comma-separated list of kernel names to exclude.
Status
Optional.
Syntax
--exclude-languages=<comma-separated-kernel-name-list>
Default value
Empty list.
Valid values
A valid comma-separated list of kernel names.
--exclude-types Specify a comma-separated list of environment types to exclude.
Status
Optional.
Syntax
--exclude-types=<comma-separated-environment-type-list>
Default value
Empty list.
Valid values
A valid comma-separated list of environment types.

--help, -h Display command help.
Status
Optional.
Syntax
--help
Default value
No default.
Valid values
Not applicable.
--jmes-query Specify a JMESPath query that filters the command output.
Status
Optional.
Syntax
--jmes-query=<jmespath-query>
Default value
No default.
Valid values
A valid JMESPath query.
--output Specify an output format.
Status
Optional.
Syntax
--output=json|yaml|table
Default value
table
Valid values
Valid formats include JSON, YAML, or table (the default format).
--output-file Specify a file path where all output is redirected.
Status
Optional.
Syntax
--output-file=<output-file-location>
Default value
No default.
Valid values
A valid output file path location.
--profile The name of the profile that you created to store information about an instance of Cloud Pak for Data and your credentials for the instance.
Status
Required.
Syntax
--profile=<cpd-profile-name>
Default value
No default.
Valid values
The name of the profile that you created.
--project-id Specify a Cloud Pak for Data project instance.
Status
Required.
Syntax
--project-id=<cpd-project-id>
Default value
No default.
Valid values
A valid project identifier.
--quiet Suppress verbose messages.
Status
Optional.
Syntax
--quiet
Default value
No default.
Valid values
Not applicable.
--raw-output When set to true, single values are not surrounded by quotation marks in JSON output mode.
Status
Optional.
Syntax
--raw-output=true|false
Default value
false
Valid values
false
Single values in JSON output mode are surrounded by quotation marks.
true
Single values in JSON output mode are not surrounded by quotation marks.
--space-id Specify a space identifier.
Status
Optional.
Syntax
--space-id=<space-identifier>
Default value
No default.
Valid values
A valid space identifier.
--spark-versions Specify a comma-separated list of Spark versions. When a list is provided, the response contains only the default Spark environments for the specified versions.
Status
Optional.
Syntax
--spark-versions=<comma-separated-list-of-spark-versions>
Default value
All Spark versions.
Valid values
A valid comma-separated list of Spark versions.
--types Specify a comma-separated list of environment types.
Status
Optional.
Syntax
--types=<comma-separated-list-of-environment-types>
Default value
All environment types.
Valid values
A valid comma-separated list of environment types.
--verbose Logs include more detailed messages.
Status
Optional.
Syntax
--verbose
Default value
No default.
Valid values
Not applicable.

Examples

Note: The following examples use environment variables. Use a script to create environment variables with the correct values for your environment. You can add the listed environment variables to the installation variables script. For more information, see Setting up installation environment variables.
Specify the environment variables in a script file.
environment_name = "Default Python 3.7"
query_string = "(resources[?entity.environment.display_name == '{}'].metadata.asset_id)[0]".format(environment_name)
List all the project environments and filter them by their display name.
cpd-cli environment list \
--jmes-query="${query_string}" \
--output=json \
--project-id=${PROJECT_CPD_INST_OPERANDS} \
--raw-output
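As a minimal follow-on sketch (the environment_asset_id variable name is an assumption, not part of the documented example), the single value that --raw-output prints without quotation marks can be captured directly in a shell variable for later use.
# Capture the asset ID of the matching environment (reuses the variables
# that are defined in the previous example).
environment_asset_id=$(cpd-cli environment list \
--jmes-query="${query_string}" \
--output=json \
--project-id=${PROJECT_CPD_INST_OPERANDS} \
--raw-output)
echo "Matched environment asset ID: ${environment_asset_id}"
The captured identifier can then be passed to other commands or scripts that expect an environment asset ID.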