Example: unloading data and generating an upload command based on a MapReduce program, with Kerberos authentication
Execution report:
[i1055@quickstart ~]$ db2hpu -i i1055 --debug -f sysin
INZM031I Optim High Performance Unload for Db2 06.01.00.001(160126)
64 bits 01/29/2016 (Linux quickstart.cloudera 2.6.32-358.el6.x86_64 #1 SMP Fri Feb 22 00:31:26 UTC 2013 x86_64)
INZI473I Memory limitations: 'unlimited' for virtual memory and 'unlimited' for data segment
----+----1----+----2----+----3----+----4----+----5----+----6----+----7----+----8----+
000001 GLOBAL CONNECT TO SAMPLE;
000002 UNLOAD TABLESPACE
000003 SELECT * FROM EMPLOYEE;
000004 OUTFILE("outfile")
000005 LOADFILE("loadfile")
000006 LOADDEST (HADOOP MAPREDUCE WITH KERBEROS AUTH)
000007 FORMAT DEL;
INZU462I HPU control step start: 01/29/2016 11:33:59.318.
INZU463I HPU control step end : 01/29/2016 11:33:59.385.
INZU464I HPU run step start : 01/29/2016 11:33:59.471.
INZU410I HPU utility has unloaded 42 rows on quickstart host for I1055.EMPLOYEE in
outfile.
INZU684I HPU utility has generated an upload command for the MAPREDUCE destination
in the loadfile file.
INZU465I HPU run step end : 01/29/2016 11:33:59.893.
INZI441I HPU successfully ended: Real time -> 0m0.575369s
User time -> 0m0.620905s : Parent -> 0m0.615906s, Children -> 0m0.004999s
Syst time -> 0m0.030995s : Parent -> 0m0.025996s, Children -> 0m0.004999s
Associated MapReduce section in the db2hpu.dest file:
[MapReduce]
hdfspath=/tmp
command="/tmp/CustomMapReduce.jar" --inputfile
user=foo/quickstart.cloudera@CLOUDERA
keytab=/tmp/foo.keytab
Extract from the generated output file:
[i1055@quickstart ~]$ cat outfile
"000010","CHRISTINE","I","HAAS","A00","3978",19950101,"PRES    ",18,"F",19630824,+0152750.00,+0001000.00,+0004220.00
...
"200340","ROY","R","ALONZO","E21","5698",19970705,"FIELDREP",16,"M",19560517,+0031840.00,+0000500.00,+0001907.00
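FORMAT DEL produces standard comma-delimited records (character fields quoted, numeric fields bare), so the extract above can be read back with ordinary CSV tooling. A quick Python check using the first record shown (a sketch for illustration, not part of HPU):

```python
import csv
import io

# First record from the DEL-format outfile shown above; "PRES    " is the
# CHAR(8) job title padded with blanks.
record = ('"000010","CHRISTINE","I","HAAS","A00","3978",19950101,"PRES    ",'
          '18,"F",19630824,+0152750.00,+0001000.00,+0004220.00')

# csv handles the quoted character fields and the bare numeric fields alike.
row = next(csv.reader(io.StringIO(record)))
print(row[1], row[3])  # CHRISTINE HAAS
```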
Generated upload command:
[i1055@quickstart ~]$ cat loadfile
#!/bin/sh
kinit -k -t "/tmp/foo.keytab" "foo/quickstart.cloudera@CLOUDERA" > "EMPLOYEE.msg" 2>&1
RC=$?
if [ $RC -ne 0 ]
then
echo "Error while getting Kerberos credential. The 'EMPLOYEE.msg' file contains the error encountered."
else
rm -f "EMPLOYEE.msg"
unset HADOOP_CLASSPATH
export HADOOP_CLASSPATH=`hbase classpath`:/tmp/CustomMapReduce.jar
echo Start uploading ...
hdfs dfs -put -f "outfile" "/tmp/" > "EMPLOYEE.msg" 2>&1
hadoop jar "/tmp/CustomMapReduce.jar" --inputfile "/tmp/outfile" >> "EMPLOYEE.msg" 2>&1
RC=$?
hdfs dfs -rm "/tmp/outfile" >> "EMPLOYEE.msg" 2>&1
if [ $RC -ne 0 ]
then
echo "An error occurred while processing the 'outfile' file. The 'EMPLOYEE.msg' file contains the associated execution report."
else
echo "The 'outfile' file has been processed successfully. The 'EMPLOYEE.msg' file contains the associated execution report."
fi
fi
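The generated script follows a simple pattern: acquire the Kerberos credential, upload and process the file, then branch on the exit status and report through a message file. A sketch of that control flow with the real commands (kinit, hdfs dfs -put, hadoop jar) replaced by placeholder functions, so the branching can be followed in isolation; this is not the generated script itself:

```shell
#!/bin/sh
# Placeholders for the real steps of the generated loadfile script,
# so the control flow can be run without a Hadoop/Kerberos environment.
acquire_credentials() { true; }  # stands in for: kinit -k -t keytab principal
upload_and_run()      { true; }  # stands in for: hdfs dfs -put + hadoop jar

if ! acquire_credentials > job.msg 2>&1; then
    # Credential failure: keep job.msg, which holds the kinit error.
    echo "credential error (see job.msg)"
else
    rm -f job.msg
    if upload_and_run > job.msg 2>&1; then
        echo "processed successfully"
    else
        echo "processing failed (see job.msg)"
    fi
fi
```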