Running an AutoAI generated notebook on IBM Z and IBM LinuxONE
You can use a notebook that is generated by an AutoAI experiment outside of Cloud Pak for Data, on IBM Z and IBM LinuxONE hardware. Before you run the notebook, you must use conda to set up a Python environment with the packages that the generated notebook requires.
Setting up the Python environment for the zLinux operating system
To run an AutoAI experiment that uses packages that are not available in the default zLinux conda channel, you must set up your Python environment by building the required packages and configuring the environment.
Follow these steps to customize your environment by adding the pyarrow and tensorflow packages, which are not available in the default zLinux conda channel, and by installing the AutoAI packages (autoai-libs and autoai-ts-libs) so that you can run notebooks generated by AutoAI outside of Cloud Pak for Data on IBM Z and LinuxONE platforms:
- Install Conda on the zLinux operating system.
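  The documentation does not prescribe a specific Conda distribution for s390x. One common option is the community Miniforge distribution, which publishes a Linux s390x installer; the commands below are a minimal sketch under that assumption, so verify the installer URL and choose an install prefix that suits your system.

  ```shell
  # Sketch: install a conda distribution on zLinux (s390x) using Miniforge.
  # The installer name and URL are assumptions; confirm them on the Miniforge release page.
  wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-s390x.sh
  bash Miniforge3-Linux-s390x.sh -b -p "$HOME/miniforge3"   # -b: batch mode, -p: install prefix

  # Make the conda command available in the current shell session
  source "$HOME/miniforge3/etc/profile.d/conda.sh"
  conda --version
  ```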
- Prepare the pyarrow 11.0.0 Python package installation dependencies with Open-CE.
  Note: This package is not available on the zLinux conda channel. This example uses /opt as the working directory for the local conda channel.
  - Create a conda environment. In the base conda environment, install conda-build:
        conda install -y conda-build
    Then create and activate the build environment and install the Open-CE builder:
        cd /opt
        conda create -y -n pyarrow-env python=3.10
        conda activate pyarrow-env
        conda install -y -c open-ce open-ce-builder
  - Create a Git clone of the Open-CE project and check out the open-ce-r1.9 release branch.
        git clone -b open-ce-r1.9 https://github.com/open-ce/open-ce.git
  - Install the patch utility and the C and C++ compilers with the following command:
        yum install patch gcc gcc-c++
  - Navigate to the Open-CE project and build the pyarrow dependencies.
        cd open-ce
        open-ce build env --python_versions 3.10 --build_types cpu envs/arrow-env.yaml
  - Check the condabuild folder to find the pyarrow dependencies.
        ls -1 condabuild/
        channeldata.json
        index.html
        linux-s390x
        noarch
        opence-conda-env-py3.10-cpu-openmpi.yaml

        ls -1 condabuild/linux-s390x/
        arrow-cpp-11.0.0-py310hb252c34_2_cpu.conda
        arrow-cpp-proc-11.0.0-cpu.conda
        boost-cpp-1.65.1-hebff1d6_4.conda
        cryptography-41.0.4-opence_py310_he9533b4_0.conda
        cryptography-vectors-41.0.4-py310h77c88a5_0.conda
        current_repodata.json
        current_repodata.json.bz2
        ffmpeg-4.2.2-opence_0.conda
        gflags-2.2.2-heb72281_0.conda
        glog-0.5.0-he499f12_0.conda
        grpc-cpp-1.41.0-hf453556_pb4.21.12_6.conda
        index.html
        libabseil-20230125.0-cxx17_he499f12_1.conda
        libboost-1.65.1-h896dd0f_4.conda
        libevent-2.1.10-h8df5d65_2.conda
        libprotobuf-3.21.12-h6cabbc9_0.conda
        libprotobuf-static-3.21.12-he0b681d_0.conda
        libthrift-0.13.0-h333d347_6.conda
        libvpx-1.11.0-h3de3984_0.conda
        orc-1.8.2-h9978810_3.conda
        protobuf-4.21.12-py310h3de3984_1.conda
        pyarrow-11.0.0-py310h7cdfc66_2_cpu.conda
        rapidjson-1.1.0-h4cc523a_0.conda
        repodata_from_packages.json
        repodata_from_packages.json.bz2
        repodata.json
        repodata.json.bz2
        rust-1.71.1-ha869a9c_0.conda
        rust_linux-s390x-1.71.1-hf505785_1.conda
        thrift-compiler-0.13.0-h333d347_6.conda
        thrift-cpp-0.13.0-6.conda
        tokenizers-0.13.3-opence_py310_heabbc7f_0.conda
        xsimd-9.0.1-h4cc523a_0.conda
        yasm-1.3.0-h532a228_2.conda
  - Deactivate the pyarrow-env conda virtual environment and navigate back to the /opt directory.
        conda deactivate
        cd /opt
- Prepare the TensorFlow Python package installation dependencies with Open-CE.
  Note: This package is not available on the zLinux conda channel.
  - Create a conda environment.
        cd /opt
        conda create -y -n tensorflow-env python=3.10
        conda activate tensorflow-env
        conda install -y -c open-ce open-ce-builder
  - Navigate to the Open-CE project and build the TensorFlow dependencies.
        cd /opt/open-ce
        open-ce build env --python_versions 3.10 --build_types cpu envs/tensorflow-env.yaml
  - Check the condabuild folder to find the tensorflow-cpu dependencies.
        ls -1 condabuild/
        channeldata.json
        index.html
        linux-s390x
        noarch
        opence-conda-env-py3.10-cpu-openmpi.yaml

        ls -1 condabuild/linux-s390x/
        absl-py-1.0.0-py310h77c88a5_0.conda
        array-record-0.2.0-py310he127c3e_1.conda
        bazel-5.3.0-h447df78_1.conda
        bazel-toolchain-0.1.5-h1589012_0.conda
        black-22.12.0-py310h6d39d64_0.conda
        clang-14.0.6-0.conda
        clang-14-14.0.6-default_hc034eec_0.conda
        clangdev-14.0.6-default_hc034eec_0.conda
        clang-format-14.0.6-default_hc034eec_0.conda
        clang-format-14-14.0.6-default_hc034eec_0.conda
        clang-tools-14.0.6-default_hc034eec_0.conda
        clangxx-14.0.6-default_h050e89a_0.conda
        cryptography-41.0.4-opence_py310_he9533b4_0.conda
        cryptography-vectors-41.0.4-py310h77c88a5_0.conda
        current_repodata.json
        current_repodata.json.bz2
        dm-tree-0.1.7-py310h93d806b_1.conda
        ffmpeg-4.2.2-opence_0.conda
        flatbuffers-2.0.0-he499f12_0.conda
        grpc-cpp-1.41.0-hf453556_pb4.21.12_6.conda
        grpcio-1.53.0-py310h40f4e1e_0.conda
        index.html
        jax-0.4.7-cpu_py310_2.conda
        jaxlib-0.4.7-cpu_py310_pb4.21.12_4.conda
        keras-2.12.0-py310hc450ce1_3.conda
        libabseil-20230125.0-cxx17_he499f12_1.conda
        libclang13-14.0.6-default_h99f1993_0.conda
        libclang-14.0.6-default_hc034eec_0.conda
        libclang-cpp-14.0.6-default_hc034eec_0.conda
        libclang-cpp14-14.0.6-default_hc034eec_0.conda
        libprotobuf-3.21.12-h6cabbc9_0.conda
        libprotobuf-static-3.21.12-he0b681d_0.conda
        libtensorflow-2.12.0-he2d1015_cpu_pb4.21.12_4.conda
        libvpx-1.11.0-h3de3984_0.conda
        ml_dtypes-0.1.0-py310h9bf4de2_0.conda
        promise-2.3-py310h6d39d64_0.conda
        protobuf-4.21.12-py310h3de3984_1.conda
        repodata_from_packages.json
        repodata_from_packages.json.bz2
        repodata.json
        repodata.json.bz2
        rust-1.71.1-ha869a9c_0.conda
        rust_linux-s390x-1.71.1-hf505785_1.conda
        tensorflow-addons-0.19.0-py310h7f2d79a_1_cpu.conda
        tensorflow-addons-proc-0.19.0-cpu.conda
        tensorflow-base-2.12.0-cpu_py310_pb4.21.12_4.conda
        tensorflow-cpu-2.12.0-py310_1.conda
        tensorflow-model-optimization-0.7.4-py310_2.conda
        tensorflow-probability-0.19.0-py310_1.conda
        _tensorflow_select-1.0-cpu_2.conda
        tensorflow-text-2.12.0-h14c02a0_py310_pb4.21.12_2.conda
        tokenizers-0.13.3-opence_py310_heabbc7f_0.conda
        yasm-1.3.0-h532a228_2.conda
- Run the conda index command on the condabuild folder.
      conda index condabuild
- Add the following lines to the ~/.condarc file to use the condabuild folder as a local conda channel:
      channels:
        - /opt/open-ce/condabuild
        - defaults
- Check the conda channel to find the pyarrow and tensorflow-cpu installation packages.
      conda search pyarrow
      Loading channels: done
      # Name      Version   Build                 Channel
      pyarrow     11.0.0    py310h7cdfc66_2_cpu   condabuild

      conda search tensorflow-cpu
      Loading channels: done
      # Name           Version   Build     Channel
      tensorflow-cpu   2.12.0    py310_1   condabuild
- Deactivate the tensorflow-env conda virtual environment and navigate back to the /opt directory.
      conda deactivate
      cd /opt
- Create a conda virtual environment for the AutoAI installation and activate it.
      conda create -y -n autoai python=3.10
      conda activate autoai
- Create a requirements.linux-s390x.txt file and add these packages to it:
      cython<3
      matplotlib
      pandas>=0.24.2,<1.6
      numpy>=1.20.3,<1.24
      packaging
      psutil
      importlib-metadata
      coverage
      urllib3<2
      xgboost>=1.6.1
      typing_extensions
      pyarrow
      scikit-learn>=1.0.2,<1.2
      tensorflow-cpu>=2.7.0,<2.13
      joblib>=0.11
      statsmodels<0.14
      dill>=0.3.1.1
      networkx>=2.5.1
      py4j>=0.10.9,<0.10.10
      jpype1>=1.3.0
      simplejson==3.17.6
      PyWavelets
      pytz
      traitlets<6
      boto3==1.24.28
      deprecated==1.2.13
      pyyaml==6.0
      semver==2.13.0
      ijson==3.1.4
      munch==2.5.0
- Install the conda packages from the requirements.linux-s390x.txt file.
      conda install -y --file requirements.linux-s390x.txt
- Create a requirements.linux-s390x.pypi.txt file and add these packages to it:
      gensim==4.1.2
      pytest
      pytest-cov
      pep8
      pytest-pep8
      ibm-watson-machine-learning
      mlxtend>=0.17.0
      lale>=0.6.8,<0.8
      alchemy-config==1.1.2
      alchemy-logging==1.2.1
      anytree==2.9.0
      jsons==1.3.1
      import-tracker==3.1.5
- Install the packages from the requirements.linux-s390x.pypi.txt file.
      pip install -r requirements.linux-s390x.pypi.txt
- Install autoai-libs and autoai-ts-libs:
      pip install autoai-libs==1.16.2 --no-deps
      pip install autoai-ts-libs==3.0.23 --no-deps
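After the final install step, you can sanity-check the result before running a generated notebook. The loop below is a minimal sketch that assumes the autoai environment is still active; note that autoai_libs and autoai_ts_libs are the import names of the autoai-libs and autoai-ts-libs distributions.

```shell
# Sketch: confirm that each key package imports cleanly in the active autoai environment.
for pkg in pyarrow tensorflow autoai_libs autoai_ts_libs; do
  python -c "import $pkg" >/dev/null 2>&1 && echo "$pkg OK" || echo "$pkg MISSING"
done
```

Any package reported as MISSING indicates that the corresponding build or install step needs to be repeated before the notebook will run.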
Learn more
Planning for Cloud Pak for Data on IBM Z and LinuxONE
Parent topic: AutoAI overview