Data movement troubleshooting

Refer to the following sections if you experience problems with the data movement feature.

Log files

When data export or import fails, you can refer to the following mapper log files:
  • on Hadoop: /fastDataMovement/var/log/mapper

Errors: Exception: hive-site.xml cannot be found in classpath and java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/...

Solution: Add directories that contain hive-site.xml and Hive jars to HADOOP_CLASSPATH. For example:
export HADOOP_CLASSPATH="/etc/hive/conf/:/usr/share/cmf/cloudera-navigator-server/libs/cdh4/*"

Error: com.ibm.nz.fq.common.exception.QArchException: Fetching DB DDL failed: Routine "*RATE_DDL" (specific name "") has returned an error SQLSTATE with diagnostic text "SQL0443 Reason code or token: *AN_TABLE,,SQL0551 Reason cod".. SQLCODE=-443, SQLSTATE=38553, DRIVER=4.24.81.

This error occurs when the Db2® Warehouse user does not have the required privileges. For more information about these privileges and how to grant them, see the related topic in the documentation.

Tables imported to BigInsights are not visible in BigSQLv3

When the configuration XML property fq.sql.type is set to hive2, the tables that were created in Hive might not be visible in BigSQLv3. To make the tables visible in BigSQLv3, run the following catalog synchronization commands:
  1. CALL SYSHADOOP.HCAT_SYNC_OBJECTS('schema_name', '.*', 'a', 'REPLACE', 'CONTINUE');

    For the schema_name value, use the name of your database on Hive.

  2. To synchronize the cache for a specific table, run:
    CALL SYSHADOOP.HCAT_CACHE_SYNC('schema_name','table_name');
    or, to synchronize the cache for all tables in the schema, run the following command:

    CALL SYSHADOOP.HCAT_CACHE_SYNC('schema_name');

Alternatively, you can set the configuration XML property fq.sql.type to bigsql so that BigSQL is used for the import, which solves the problem permanently.
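
For example, the property can be added to the XML configuration file in the same format as the other properties shown in this topic:
<property>
<name>fq.sql.type</name>
<value>bigsql</value>
</property>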

Special characters support on Hadoop

Special characters in column names
On Hive: Special characters in column names are supported in Hive version 0.13.0 and higher. In older versions of Hive, when you import a table that has special characters in column names, set the fq.sql.metadata property to false. The table is not created in Hadoop, but the table data is imported. Open the metadata.xml file that holds this data and look for the metadata.sql.create property. The value of this property contains the table definition. Do not edit this definition in the metadata.xml file. Instead, copy it, rename the columns, and run it in Hive to create the table correctly. You can avoid this issue by upgrading Hive to version 0.13.0 or higher.
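
For example, to import the data without creating the table in Hadoop, you can add the following entry to the XML configuration file, using the same property format as the other examples in this topic:
<property>
<name>fq.sql.metadata</name>
<value>false</value>
</property>
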
On BigSQL: Special characters in column names might also cause problems for BigSQL. The following is a sample error when ! is used in a column name:
Caused by: com.ibm.fq.common.exception.hive.HiveException: Unable to execute Hive sql. 
DB2 SQL Error: SQLCODE=-7, SQLSTATE=42601, SQLERRMC=!;all_dt_boundval ( DI, DRIVER=3.70.4
at com.ibm.fq.hive.RemoteHiveQueryRunner.executeUpdate(RemoteHiveQueryRunner.java:104)
at com.ibm.fq.hive.DbEngine.executeUpdate(DbEngine.java:87)
at com.ibm.fq.hive.DbManager.createTableInHive(DbManager.java:99)
at com.ibm.fq.ArchiveManager.importDataToHadoop(ArchiveManager.java:86)
You might also get the following message:
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: 
DB2 SQL Error: SQLCODE=-104, SQLSTATE=42601, SQLERRMC=INT;st;, DRIVER=3.70.4

Defining separate users for Hive and HDFS

If for any reason you require separate users for HDFS and Hive operations, you can define the fq.hadoop.remote.user parameter in the configuration XML file.
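
For example, assuming that the parameter takes a user name as its value, the entry in the configuration XML file might look as follows (the user name shown is only a placeholder):
<property>
<name>fq.hadoop.remote.user</name>
<!-- placeholder user name; replace with the appropriate user for your environment -->
<value>hiveuser</value>
</property>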

Import fails on Hive with the error message: Cannot create table in Hive. Data has been imported properly. Try creating it manually.

Solutions:
  1. Ensure that the Hadoop user that runs the import operation has permission to create tables in Hive.
  2. If solution 1 does not work, verify in the Hive service configuration that the hive.server2.enable.doAs parameter is set to true. If it is set to false, change it to true, as shown in the example after this list.
  3. If solution 2 cannot be implemented, run the import operations with fq.sql.metadata set to false (see the example in Special characters support on Hadoop) and then, after the import completes, manually create the table in Hive.
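
If you edit hive-site.xml directly rather than changing the setting through your cluster management console, the parameter uses the standard property format; the following is a minimal sketch:
<property>
<name>hive.server2.enable.doAs</name>
<value>true</value>
</property>
Typically, HiveServer2 must be restarted for the change to take effect.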

Limitations on decimal values in BigInsights

  1. Import fails on BigInsights 3 if a table has a NUMERIC field. The field is mapped to DECIMAL, which is not supported in BigInsights 3. The following message is displayed:
    Decimal type not supported. If you are using BigInsights 3 and you want to map 
    DECIMAL to STRING, add "fq.sql.bigsql.v3compatibility" property to the xml configuration 
    file and set it to true.
    To solve the problem, add the following property to the XML configuration file:
    <property>
    <name>fq.sql.bigsql.v3compatibility</name>
    <value>true</value>
    </property>
  2. BigInsights version 3 and version 4 do not support imported Db2 tables that contain higher-precision decimal values. Table creation in BigSQL fails with an error similar to:
    Exception in thread "main" com.ibm.fq.common.exception.QArchException: 
    Cannot create table in Hive. Data has been imported properly. Try creating it manually
    ....
    Caused by: com.ibm.fq.common.exception.hive.HiveException: Unable to execute Hive sql. 
    DB2 SQL Error: SQLCODE=-604, SQLSTATE=42611, SQLERRMC=DECIMAL(38, 0), DRIVER=3.70.4
    or
    Failed to instantiate extended type class com.ibm.biginsights.catalog.type.DecimalType, 
    for type "decimal(38,0)

    The maximum precision for a decimal value in BigInsights version 3 or version 4 is 31 digits. Both Db2 and Hive support a maximum precision of 38 digits. For more information about BigInsights data type support, see the IBM Knowledge Center.

    A possible workaround is to import only tables whose decimal values have a precision of 31 digits or less.