Managing compute resources
If you have the Admin or Editor role in a project, you can perform management tasks for environments.
Stop active runtimes
Stop all active runtimes when you no longer need them to prevent consuming extra capacity unit hours (CUHs).
Jupyter notebook runtimes are started per user and not per notebook. Stopping a notebook kernel doesn't stop the environment runtime in which the kernel is started because you could have started other notebooks in the same environment. You should only stop a notebook runtime if you are sure that no other notebook kernels are active.
Only runtimes that are started for jobs are automatically shut down after the scheduled job has completed. For example, if you schedule a notebook to run once a day for 2 months, the runtime instance is activated every day for the duration of the scheduled job and deactivated again after the job has finished.
Project users with the Admin role can stop all runtimes in the project. Users who are added to the project with the Editor role can stop the runtimes that they started, but can't stop other project users' runtimes. Users who are added to the project with the Viewer role can't see the runtimes in the project.
You can stop runtimes from:
- The Environment Runtimes page, which lists all active runtimes across all projects for your account, by clicking Administration > Environment runtimes from the main navigation menu.
- Under Tool runtimes on the Environments page on the Manage tab of your project, which lists the active runtimes for a specific project.
- The Environments page, which opens when you click the Notebook Info icon from the notebook toolbar in the notebook editor. You can stop the runtime under Runtime status.
Idle timeouts for:
- Jupyter notebook runtimes
- Spark runtimes for notebooks and Data Refinery
- Notebooks with GPU runtimes
- RStudio runtimes
Jupyter notebook idle timeout
Runtime idle times differ for the Jupyter notebook runtimes depending on your watsonx.ai Studio plan.
| Plan | Idle timeout |
|---|---|
| Lite | - Idle stop time: 1 hour - CUH limit: 10 CUHs |
| Professional | - Idle stop time: 1 hour - CUH limit: no limit |
| Standard (Legacy) | - Idle stop time: 1 hour - CUH limit: no limit |
| Enterprise (Legacy) | - Idle stop time: 3 hours - CUH limit: no limit |
| All plans (free runtime) | - Idle stop time: 1 hour - Maximum lifetime: 12 hours |
Spark idle timeout
All Spark runtimes, for example for notebooks and Data Refinery, are stopped after 3 hours of inactivity. The Default Data Refinery XS runtime that is used when you refine data in Data Refinery is stopped after an idle time of 1 hour.
Spark runtimes that are started when a job is started are stopped when the job finishes.
GPU idle timeout
All GPU runtimes are automatically stopped after 3 hours of inactivity for Enterprise plan users and after 1 hour of inactivity for other paid plan users.
RStudio idle timeout
An RStudio runtime is stopped after an idle time of 2 hours. During this idle time, you continue to consume CUHs, for which you are billed. Long compute-intensive jobs are hard stopped after 24 hours.
Migrating custom environment templates from Runtime 24.1 to Runtime 25.1
Runtime 25.1 uses a combination of Python virtual environments and the pip package manager instead of the combination of conda virtual environments and the conda package manager used in Runtime 24.1.
If you are using only the default packages of a runtime, you will not notice the change. However, if you have performed any kind of package customization (within a notebook or through a custom environment) in an environment template that is based on Runtime 24.1 and uses conda, you must migrate it before 16 April 2026.
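If you are not sure whether a given runtime still provides conda, you can check directly from a notebook cell. The following is a minimal sketch that assumes only a standard Python kernel; it reports whether conda and pip are available on the PATH of the current runtime.

```python
import shutil
import sys

# Report the Python version and whether conda or pip is available in this runtime.
# Runtime 24.1 images provide conda; Runtime 25.1 images rely on pip and a
# plain Python virtual environment.
print("Python version:", sys.version.split()[0])
print("conda on PATH:", shutil.which("conda") is not None)
print("pip on PATH:", shutil.which("pip") is not None)
```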
Additional packages that are installed within a notebook
| Before | After | Notes |
|---|---|---|
| You are using pip to install additional packages | No changes required. You can keep doing so. | |
| You are using conda to install additional Python packages | Find the equivalent package on PyPI and use pip to install it. | If the pip package requires additional system libraries that are not included in the package itself and are not in the default images, we recommend first checking whether there is a viable alternative that has no additional requirements for system libraries. If that is not possible, reach out to IBM Support. |
| You are using conda to install additional R packages | Installing additional R packages is no longer supported. | Reach out to IBM Support if your business case requires additional R packages. |
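For example, a notebook cell that previously installed a package from a conda channel can usually be replaced with a pip install of the equivalent PyPI package. The following sketch is only an illustration; the package name (pyarrow) is an assumption, so verify the exact name on PyPI before you switch, because conda and PyPI names occasionally differ.

```python
# Before (Runtime 24.1, conda-based): the package came from a conda channel
# !conda install -y -c conda-forge pyarrow

# After (Runtime 25.1, pip-based): install the equivalent package from PyPI
%pip install pyarrow
```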
Additional packages that are installed by using custom environments
| Before | After | Notes |
|---|---|---|
| You only used the pip section of the conda.yaml | Convert it to requirements.txt format. | No changes required other than the format change. See the example below. |
| You have Python package entries outside of the pip section | Find PyPI replacements for the conda packages and add them to requirements.txt. | If the pip package requires additional system libraries that are not included in the package itself and are not in the default images, we recommend first checking whether there is a viable alternative that has no additional requirements for system libraries. If that is not possible, reach out to IBM Support. |
| You have R package entries outside of the pip section | Installing additional R packages is no longer supported. | Reach out to IBM Support if your business case requires additional R packages. |
Moving requirements from conda.yaml to requirements.txt
To move requirements from conda.yaml to requirements.txt, extract all packages from the pip section and list them in the requirements.txt file, one package per line:
Old: conda.yaml
dependencies:
  - pip:
    - pandas==1.5.0
    - numpy==1.23.0
    - scikit-learn==1.1.0
New: requirements.txt
pandas==1.5.0
numpy==1.23.0
scikit-learn==1.1.0
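If your conda.yaml also lists packages outside of the pip section, the same approach applies: find the matching packages on PyPI and add them to requirements.txt as well. The following sketch uses illustrative package names; most conda package names match their PyPI counterparts, but some differ, so verify each name on PyPI. Also note that conda typically pins versions with a single =, whereas pip requires ==.
Old: conda.yaml
dependencies:
  - pandas=1.5.0
  - numpy=1.23.0
  - pip:
    - scikit-learn==1.1.0
New: requirements.txt
pandas==1.5.0
numpy==1.23.0
scikit-learn==1.1.0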