Known issues and limitations for Db2 Data Management Console
The following known issues and limitations apply to Db2 Data Management Console.
Known issues
- Generated custom alerts do not appear in the Alerts notification center
Applies to: 5.2 and 5.1
Custom alerts that are generated do not appear in the notification center for Alerts.
- Upgrading from 4.8.x to 5.1.0 on a FIPS-enabled cluster fails to upgrade the Db2 Data Management Console instance
Applies to: 5.1
Upgrading the Db2 Data Management Console service from version 4.8.x to version 5.1.0 on a FIPS-enabled cluster fails to upgrade the Db2 Data Management Console instance.
- The Db2 Data Management Console fails to load data into Db2
Applies to: 5.1
When a user attempts to load data into Db2, the operation fails with access denied and file I/O error messages.
As a workaround, complete the following steps:
1. Connect to the db2u pod. For example, run the following command:
   oc rsh c-db2oltp-1703681564383095-db2u-0
2. Clear the user folder. For example, to clear the cpdadmin folder, run the following command:
   rm -rf /mnt/blumeta0/db2/load/cpadmin/
3. Change the group ownership of the files and directories. For example, run the following command:
   sudo chown -R db2uadm:db2iadm1 /mnt/blumeta0/db2/load/
Limitations
- The Database availability widget displays incorrect values for availability percentage
Applies to: 5.1
The Database availability widget on the Summary page might not display the correct availability percentage. The availability percentage is calculated from historical data; when the repository database is unavailable for a period, the historical data for that period is lost, causing the availability percentage to deviate from the true value.
As a workaround, view the database availability alert to determine whether the database is available.
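Because the percentage is computed from whatever historical samples survive in the repository, a gap in that history skews the result. The following minimal sketch illustrates the effect; the sample format and the `availability_pct` function are hypothetical and are not the console's actual algorithm:

```python
# Hedged sketch: True = database up, False = database down,
# None = repository data lost for that monitoring interval.
def availability_pct(samples):
    """Availability over the recorded intervals only; gaps (None) are dropped."""
    recorded = [s for s in samples if s is not None]
    if not recorded:
        return 0.0
    return 100.0 * sum(recorded) / len(recorded)

# 24 hourly samples: 6 hours of repository history lost, one hour down.
samples = [True] * 17 + [None] * 6 + [False]
print(round(availability_pct(samples), 1))  # 94.4, computed over only
# the 18 recorded hours; the true 24-hour figure depends on what
# happened during the 6-hour gap, so the widget's value deviates.
```
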
- Db2 Data Management Console fails to import large CSV files
Applies to: 5.1
The Db2 Data Management Console fails to import large CSV files (300 MB or larger) from your local system.
As a workaround, use Cloud Object Storage or Amazon S3 to import large CSV files.
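Since local imports fail at roughly 300 MB, a client-side size check can route large files to object storage before a local upload is attempted. A minimal sketch; the helper names and the threshold handling are illustrative and are not part of the console API:

```python
import os

# Threshold taken from the documented limitation (files of 300 MB or
# larger fail to import from the local system).
LOCAL_IMPORT_LIMIT = 300 * 1024 * 1024

def import_route(size_bytes):
    """Pick a 'local' upload for small files, object storage otherwise."""
    return "local" if size_bytes < LOCAL_IMPORT_LIMIT else "object-storage"

def route_file(path):
    """Route a CSV file by its on-disk size before importing it."""
    return import_route(os.path.getsize(path))
```
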