Example: Unloading data and generating a transfer command for an HBase destination with Kerberos authentication and a password prompt

Execution report:


[i1055@quickstart ~]$ db2hpu -i i1055 --debug -f sysin
INZM031I Optim High Performance Unload for Db2 06.01.00.001(160126) 
64 bits 01/29/2016 (Linux quickstart.cloudera 2.6.32-358.el6.x86_64 #1 SMP Fri Feb 22 00:31:26 UTC 2013 x86_64)
INZI473I Memory limitations: 'unlimited' for virtual memory and 'unlimited' for data segment
       ----+----1----+----2----+----3----+----4----+----5----+----6----+----7----+----8----+
000001 GLOBAL CONNECT TO SAMPLE;
000002 UNLOAD TABLESPACE
000003 SELECT EMPNO, FIRSTNME, LASTNAME FROM EMPLOYEE;
000004 OUTFILE("outfile")
000005 LOADFILE("loadfile")
000006 LOADDEST (HADOOP HBASE WITH KERBEROS AUTH)
000007 INTO TABLE ("EMPLOYEEDEST" WITH COLUMNS (CF, FIRSTNME:COL_FIRSTNAME:CF2))
000008 FORMAT DEL;

INZU462I HPU control step start: 01/29/2016 11:27:30.200.
INZU463I HPU control step end  : 01/29/2016 11:27:30.272.
INZU464I HPU run step start    : 01/29/2016 11:27:30.317.
INZU410I HPU utility has unloaded 42 rows on quickstart host for I1055.EMPLOYEE in outfile.
INZU684I HPU utility has generated an upload command for the HBASE destination in the loadfile file.
INZU465I HPU run step end      : 01/29/2016 11:27:30.795.
INZI441I HPU successfully ended: Real time -> 0m0.595027s
User time -> 0m0.644901s : Parent -> 0m0.641902s, Children -> 0m0.002999s
Syst time -> 0m0.029995s : Parent -> 0m0.025996s, Children -> 0m0.003999s

Associated HBase section in the db2hpu.dest file:


[HBase]
hdfspath=/tmp
user=foo/quickstart.cloudera@CLOUDERA
#Keytab is commented out for password prompting when executing upload command:
#keytab=/tmp/foo.keytab
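
For a non-interactive run, the keytab entry could instead be left active. A sketch of the same section with that line enabled (not part of the generated output; the keytab path is the one shown above):

[HBase]
hdfspath=/tmp
user=foo/quickstart.cloudera@CLOUDERA
keytab=/tmp/foo.keytab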

Examine the generated output file:


[i1055@quickstart ~]$ cat outfile
"000010","CHRISTINE","HAAS"
"000020","MICHAEL","THOMPSON"
,,,
"200330","HELENA","WONG"
"200340","ROY","ALONZO"
  
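Because FORMAT DEL wraps each column in double quotes and separates columns with commas, the output file can be post-processed with standard tools. A minimal sketch on sample data (the file name outfile.sample is illustrative, not produced by HPU):

```shell
#!/bin/sh
# Sample data mimicking the DEL format shown above (illustrative only).
cat > outfile.sample <<'EOF'
"000010","CHRISTINE","HAAS"
"000020","MICHAEL","THOMPSON"
EOF
# Extract the second column (FIRSTNME) and strip the quotes.
cut -d, -f2 outfile.sample | tr -d '"'
# prints:
# CHRISTINE
# MICHAEL
rm -f outfile.sample
```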

Generated transfer command:


[i1055@quickstart ~]$ cat loadfile
#!/bin/sh
kinit "foo/quickstart.cloudera@CLOUDERA" 2> "EMPLOYEEDEST.msg"
RC=$?
if [ $RC -ne 0 ]
then
    echo "Error while getting Kerberos credential. The 'EMPLOYEEDEST.msg' file contains the error encountered."
else
    rm -f "EMPLOYEEDEST.msg"
    unset HADOOP_CLASSPATH
    export HADOOP_CLASSPATH=`hbase classpath`:/opt/ibm/HPU/V12.1/tools/HBaseMapReduce.jar

    echo Start uploading ...
    hdfs dfs -put -f "outfile" "/tmp/" > "EMPLOYEEDEST.msg" 2>&1
    hadoop jar "/opt/ibm/HPU/V12.1/tools/HBaseMapReduce.jar" --tablename "EMPLOYEEDEST" \
        --separator "," --inputtype "DEL" --columns "CF:EMPNO,CF2:COL_FIRSTNAME,CF:LASTNAME" \
        --inputfile "/tmp/outfile" >> "EMPLOYEEDEST.msg" 2>&1
    RC=$?
    hdfs dfs -rm "/tmp/outfile" >> "EMPLOYEEDEST.msg" 2>&1
    if [ $RC -ne 0 ]
    then
        echo "An error occurred while processing the 'outfile' file. The 'EMPLOYEEDEST.msg' file contains the associated execution report."
    else
        echo "The 'outfile' file has been processed successfully. The 'EMPLOYEEDEST.msg' file contains the associated execution report."
    fi
fi
exit $RC
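
The generated script follows a common return-code pattern: capture the exit status immediately after the critical step, branch on it for a user-facing message, and propagate it with exit. A minimal, self-contained sketch of that pattern (the true command stands in for the hadoop jar step; demo.sh is an illustrative name):

```shell
#!/bin/sh
# Write and run a tiny script that mirrors the generated loadfile:
# capture the exit status right away, branch on it, and propagate it.
cat > demo.sh <<'EOF'
#!/bin/sh
true                       # stands in for the hadoop jar step
RC=$?
if [ $RC -ne 0 ]
then
    echo "An error occurred (RC=$RC)."
else
    echo "Processed successfully (RC=$RC)."
fi
exit $RC
EOF
sh demo.sh                 # prints: Processed successfully (RC=0)
rm -f demo.sh
```

Capturing $? into RC right after the command matters: any later command (such as the hdfs dfs -rm cleanup in the generated script) overwrites $?, so only the saved copy reflects the upload step's outcome.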