Patch 09
This topic describes the enhancements and fixes in patch 09 and provides the patch installation procedure.
Patch details for wsl-v1231-ppcle-patch-09
This patch includes the following fixes:
Fixed defects
- TS002631109 / TS002611062 - Fixes an issue where the port number in the JDBC connection string was ignored.
- TS002457861 - Fixes an issue where resetting a project displays the message "The project was not committed due to an unexpected error" even though no changes were committed.
- TS002132206 - Fixes a failure when integrating Watson Studio Local 1.2.3 as a service provider with SAML 2.0 and SITEMINDER as the identity provider, caused by the Watson Studio Local 1.2.3 SAML implementation of Contextual.
- TS002434546 - Fixes the inability to create a new Watson Studio Local model from a PMML XML file.
- TS002471398 - When .Renviron or .Rprofile files are placed in the user home directory to set local environment variables, those variables are now loaded into RStudio (see the example after this list).
- TS002679807 - An "Undefined" error no longer occurs when creating a project takes a long time.
- TS002457799 - Projects created from your enterprise Git repository no longer show "Created by Unknown" instead of the actual person's name.
- TS002462770 - Running a script as a job now shows the running user in the run history of the script for projects created from your enterprise Git repository.
- The Watson Studio Local installation now protects the private TLS/SSL key against unauthorized access.
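As an illustration of the TS002471398 fix, a .Renviron file in the user home directory might look like the following sketch. The variable names and values are hypothetical; they only show the name=value format that RStudio now loads at startup.
# Hypothetical example of ~/.Renviron; the variables are placeholders for your own settings.
HTTP_PROXY=http://proxy.example.com:8080
MY_DATA_DIR=/user-home/data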
Prerequisites
IBM Watson Studio Local V1.2.3.1 pLinuxLE English Only Multiplatform (CC1GFEN) must be installed. To download patch 09, go to Fix Central and select wsl-v1231-ppcle-patch-09.
Patch files
The patch contains the following file: wsl_app_patch_v1231_09_v1.0.0.tar.
Pre-installation
No one can use the Watson Studio Local cluster while the patch is being installed. The kubelet and docker services will be restarted.
- Download the patch tar file wsl_app_patch_v1231_09_v1.0.0.tar. The preferred location is the install path name from /wdp/config, such as /ibm.
- Ensure that all the nodes of the cluster are running before you install this patch. The kubelet and docker services must also be running on all nodes (a pre-flight check is sketched after this list).
- If the cluster is using Gluster, ensure that the Gluster file system is clean before installing this patch by running the following command:
gluster volume status sysibm-adm-ibmdp-assistant-pv detail | grep Online | grep ": N" | wc -l
If the resulting count is larger than 0, then one or more bricks for the volume are not healthy and must be fixed before continuing to install the patch.
- Log in as root or as a system administrator who has read/write permissions in the install directory. The patch script runs the remote scripts by using SSH.
- To get a list of all the available options and examples of usage, run: ./patch_master.sh --help
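The following pre-flight check is a minimal sketch of the node and Gluster checks described in this list. It assumes passwordless SSH as root to each node and a hypothetical nodes.txt file that lists one cluster host name per line; adjust it to your environment.
# Hypothetical pre-flight check; nodes.txt is an assumed file listing the cluster host names.
while read -r node; do
  ssh "$node" "systemctl is-active kubelet docker" \
    || echo "WARNING: kubelet or docker is not active on $node"
done < nodes.txt
# Only if the cluster uses Gluster: count any bricks that are not online.
offline=$(gluster volume status sysibm-adm-ibmdp-assistant-pv detail | grep Online | grep ": N" | wc -l)
if [ "$offline" -gt 0 ]; then
  echo "ERROR: $offline brick(s) are not healthy; fix them before installing the patch."
fi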
Installing the patch
To install the patch:
- Move the patch file to the install directory on a Watson Studio Local master node.
- Use tar to extract the patch scripts. This creates a new directory in the install directory and places the patch files there:
tar xvf wsl_app_patch_v1231_09_v1.0.0.tar
- Change to the patch directory and run the patch_master.sh script by using the following command:
./patch_master.sh
If you have sudo privileges for installing the patch, log in as the sudo user, change to the wsl_app_patch_v1231 directory, and use the following command:
sudo ./patch_master.sh
Ensure that the sudo user is created on all nodes. Optionally, you can create a private key for this user in the ~/.ssh directory to use instead of a user password. A consolidated command sequence is sketched after this list.
To get a list of all available options and examples of usage, run: ./patch_master.sh --help
- Monitor the progress of the installation. If any issues are encountered, check the log files. The remote nodes keep log files in the <install_dir> directory.
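For reference, the sequence above can be run as the following sketch on the master node. The /ibm install directory and the current download location are assumptions; substitute the install path recorded in /wdp/config.
# Assumes the patch tar was downloaded to the current directory and /ibm is the install directory.
mv wsl_app_patch_v1231_09_v1.0.0.tar /ibm
cd /ibm
tar xvf wsl_app_patch_v1231_09_v1.0.0.tar
cd wsl_app_patch_v1231
./patch_master.sh    # or: sudo ./patch_master.sh when installing as a sudo user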
Post-installation
To verify that the install is successful, run:
cat /wdp/patch/current_patch_level
A successful install should display:
patch_number=09
patch_version=1.0.0
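To confirm the patch level on every node rather than only on the node where you ran the installer, a sketch like the following can be used. It assumes passwordless SSH, a hypothetical nodes.txt file listing the cluster hosts, and that the current_patch_level file is updated on each node.
# Hypothetical cross-node check; expects patch_number=09 in the patch level file on each node.
while read -r node; do
  if ssh "$node" "grep -q 'patch_number=09' /wdp/patch/current_patch_level"; then
    echo "$node: patch 09 present"
  else
    echo "$node: patch 09 NOT found"
  fi
done < nodes.txt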
Restart all Jupyter with GPU user environments
- Run the following command to view deployments with the old GPU image:
kubectl get deployment -n dsx -l type=jupyter-gpu-py35
- Run the following command to delete deployments running with the old GPU image:
kubectl delete deployment -n dsx -l type=jupyter-gpu-py35
- Rebuild all custom images that were built with the GPU image.
Restart all Jupyter 2.7 user environments
- Run the following command to view any deployments with the old Jupyter 2.7 image:
kubectl get deployment -n dsx -l type=jupyter
- Run the following command to delete any deployments running with the old Jupyter 2.7 image:
kubectl delete deployment -n dsx -l type=jupyter
- Rebuild all custom images that were built with the Jupyter 2.7 image.
Restart all Jupyter 3.5 user environments
- Run the following command to view any deployments with the old Jupyter 3.5 image:
kubectl get deployment -n dsx -l type=jupyter-py35
- Run the following command to delete any deployments running with the old Jupyter 3.5 image:
kubectl delete deployment -n dsx -l type=jupyter-py35
- Rebuild all custom images that were built with the Jupyter 3.5 image.
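The three restart procedures above follow the same view, delete, rebuild pattern, so they can be scripted in one pass. The following is a minimal sketch that loops over the three label selectors shown above; review the output of the get command before deleting anything in your environment.
# Restart the GPU, Jupyter 2.7, and Jupyter 3.5 user environments by deleting their old deployments.
for selector in jupyter-gpu-py35 jupyter jupyter-py35; do
  kubectl get deployment -n dsx -l type="$selector"       # review the deployments that will be removed
  kubectl delete deployment -n dsx -l type="$selector"
done
# Afterward, rebuild any custom images that were built from these base images.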
Rolling back the patch
To roll back the patch:
- Run the following command:
./patch_master.sh --rollback
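After the rollback completes, you can confirm the reported patch level the same way as after installation. A minimal sketch, assuming the rollback restores the previous contents of the patch level file:
./patch_master.sh --rollback
cat /wdp/patch/current_patch_level    # should no longer report patch_number=09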