IBM Support

PI77683: INSERT INTO A PARTITIONED TABLE IN BIGSQL MAY INTRODUCE INVALID META-INFO THAT CAN CAUSE SQL5105 WHEN READ USING CERTAIN APIS


APAR status

  • Closed as program error.

Error description

  • In Big SQL, this issue can occur only when the C++ writer
    performed the insert (for example, Parquet tables and text
    tables use the C++ writer). The defect is that the formatting
    produced by this writer is not compatible with other APIs that
    may read the data. Specifically, reading the data through the
    Hive APIs may fail with an error such as:
    
    Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row [Error getting row data with exception java.lang.ClassCastException: org.apache.hadoop.io.BytesWritable cannot be cast to org.apache.hadoop.hive.serde2.io.HiveVarcharWritable
    at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveVarcharObjectInspector.getPrimitiveJavaObject(WritableHiveVarcharObjectInspector.java:57)
    at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:258)
    at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:354)
    at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:198)
    at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:184)
    at org.apache.hadoop.hive.ql.exec.MapOperator.toErrorMessage(MapOperator.java:544)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:513)
    at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
    
    The issue may go unnoticed as long as the C++ reader is the one
    accessing the data, so it may not be discovered until much
    later, for example in a future version that reads the data
    using the Hive Java APIs.
    
    A known example is the newer version of the analyze tool in
    4.2, which has been known to expose this issue with the invalid
    data, even though the defect itself has long since been fixed
    in 4.1.
    

Local fix

  • 1. Create hadoop table t2 like t1
    2. insert into t2 select * from t1
    3. drop table t1
    4. rename table t2 to t1
    
    Where t1 is the bad table.
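
    The steps above can be sketched as Big SQL statements. This is a
    minimal illustration, assuming t1 is the affected partitioned
    table; exact DDL (partitioning clauses, storage format) must be
    carried over from the original table definition.
    
    ```sql
    -- Step 1: clone the schema of the bad table.
    CREATE HADOOP TABLE t2 LIKE t1;
    
    -- Step 2: rewrite the data; the copy is written with
    -- correctly formatted meta-info.
    INSERT INTO t2 SELECT * FROM t1;
    
    -- Step 3: remove the table holding the invalid meta-info.
    DROP TABLE t1;
    
    -- Step 4: restore the original table name.
    RENAME TABLE t2 TO t1;
    ```
    
    Verify the copy (for example, compare row counts between t1 and
    t2) before dropping the original table.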
    

Problem summary

  • See error description
    

Problem conclusion

  • The problem is fixed in Version 4.2.0.0 and later fix packs.
    

Temporary fix

Comments

APAR Information

  • APAR number

    PI77683

  • Reported component name

    INFO BIGINSIGHT

  • Reported component ID

    5725C0900

  • Reported release

    420

  • Status

    CLOSED PER

  • PE

    NoPE

  • HIPER

    NoHIPER

  • Special Attention

    NoSpecatt / Xsystem

  • Submitted date

    2017-03-06

  • Closed date

    2018-03-01

  • Last modified date

    2018-03-01

  • APAR is sysrouted FROM one or more of the following:

  • APAR is sysrouted TO one or more of the following:

Modules/Macros

  • Unknown
    

Fix information

  • Fixed component name

    INFO BIGINSIGHT

  • Fixed component ID

    5725C0900

Applicable component levels

  • R420 PSY

       UP

[{"Business Unit":{"code":"BU059","label":"IBM Software w\/o TPS"},"Product":{"code":"SSCRJT","label":"IBM Db2 Big SQL"},"Platform":[{"code":"PF025","label":"Platform Independent"}],"Version":"420","Line of Business":{"code":"LOB10","label":"Data and AI"}}]

Document Information

Modified date:
25 August 2020