Topic
  • 17 replies
  • Latest Post - ‏2013-10-18T14:48:30Z by aaronglg
SystemAdmin
SystemAdmin
603 Posts

Pinned topic BigInsight Basic edition installation error: Hadoop, Hive failed

‏2013-02-04T06:22:15Z |
Hi,

I was trying to install IBM BigInsights Basic Edition on a Linux box, but I am getting the following errors. The installer reports that installation is complete, but validation failed for Hadoop, Hive, and the BigInsights Console.

ERROR DeployManager - hadoop failed
com.ibm.xap.mgmt.DeployException: -- localhost.localdomain=255 (java.io.IOException: exit code: 1 -- "ssh" "-o StrictHostKeyChecking=no" "localhost.localdomain" "/opt/ibm/biginsights/.hdm-stub/bin/managed-format.sh"
/home/biadmin/.bashrc: line 9: /biginsight/opt/ibm/biginsights/conf/biginsights-env.sh: No such file or directory
JVMSHRC559E Failed to create a directory "/tmp/javasharedresources" for the shared class cache
JVMJ9VM015W Initialization error for library j9shr24(11): JVMJ9VM009E J9VMDllMain failed
Could not create the Java virtual machine.

at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:96)
at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:48)
at com.ibm.xap.mgmt.hdm.ExecTask.doTask(ExecTask.java:77)
at com.ibm.xap.mgmt.util.Task.run(Task.java:77)
at com.ibm.xap.mgmt.util.TaskRunner$1.run(TaskRunner.java:52)
)
at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.calculateResult(HadoopMgmtFacade.java:2072)
at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.calculateResult(HadoopMgmtFacade.java:2021)
at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.format(HadoopMgmtFacade.java:463)
at com.ibm.xap.mgmt.hdm.HadoopDeployer.installConfig(HadoopDeployer.java:284)
at com.ibm.xap.mgmt.DeployManager$InstallThread.doInstall(DeployManager.java:2380)
at com.ibm.xap.mgmt.DeployManager$InstallThread.work(DeployManager.java:2415)
at com.ibm.xap.mgmt.DeployManager$WorkerThread.run(DeployManager.java:2332)

It is looking for the file in the wrong directory: /biginsight/opt/ibm/biginsights/conf/biginsights-env.sh: No such file or directory

The file actually exists at /opt/ibm/biginsights/conf/biginsights-env.sh.

For the whole installation it looks under /opt/ibm/biginsights, but for this step alone it looks under /biginsight/opt/ibm/biginsights, which doesn't exist.

After the Hadoop failure, Hive and the console also failed. The log is attached. Please help!
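If line 9 of biadmin's ~/.bashrc is what injects the stray /biginsight prefix, one way to inspect and correct it (a sketch based only on the paths shown in the error message; back up the file first and adjust paths to match your system) is:

```shell
# Show the offending line, then strip the bogus /biginsight prefix
# so the sourced path matches the real install location.
sed -n '9p' /home/biadmin/.bashrc
cp /home/biadmin/.bashrc /home/biadmin/.bashrc.bak
sed -i 's|/biginsight/opt/ibm/biginsights|/opt/ibm/biginsights|g' /home/biadmin/.bashrc
```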
Updated on 2013-02-04T08:48:31Z at 2013-02-04T08:48:31Z by YangWeiWei
  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-02-04T08:48:31Z  
    Can you check whether the server is using the bash shell? The BigInsights installer only supports bash. Try grep "$biginsights_admin_username" /etc/passwd
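    For example (a read-only sketch; "biadmin" is the usual BigInsights admin user name, substitute yours):

```shell
# Look up the login shell recorded for the admin user in /etc/passwd
# (field 7 of the passwd entry).
user=biadmin
shell=$(getent passwd "$user" | cut -d: -f7)
echo "login shell for $user: ${shell:-<no such user>}"
# If it is not /bin/bash, change it as root:
#   chsh -s /bin/bash "$user"
```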
  • JRV
    JRV
    4 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-18T16:16:45Z  
    Can you check whether the server is using the bash shell? The BigInsights installer only supports bash. Try grep "$biginsights_admin_username" /etc/passwd

    I am also getting a similar error, but I checked the shell and it is bash, using the following command:

    bash-3.2# grep -i "biadmin" /etc/passwd
    biadmin:x:501:501::/home/biadmin:/bin/bash
     

    I am using BigInsights Basic Edition 2.0. Could you please help me here? Thanks in advance.

    Log

    [INFO] Launching installer back end
    [INFO] Running as root, /root/biginsights-basic-linux64_b20121203_1915/installer/bin/install.sh simple-fullinstall.xml
    [INFO] Distribution Vendor : ibm
    [INFO] Progress - Initializing install properties
    [INFO] Progress - 0%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] MgmtConfInitializer - Biginsights nodes [localhost.localdomain]
    [INFO] MgmtConfInitializer - install mode : install
    [INFO] MgmtConfInitializer - distro vendor : ibm
    [INFO] MgmtConfInitializer - dfs.name.dir=/hadoop/hdfs/name
    [INFO] MgmtConfInitializer - fs.checkpoint.dir=/hadoop/hdfs/namesecondary
    [INFO] MgmtConfInitializer - default dfs.data.dir=/hadoop/hdfs/data
    [INFO] MgmtConfInitializer - mapred.system.dir=/hadoop/mapred/system
    [INFO] MgmtConfInitializer - mapred.local.dir=/hadoop/mapred/local
    [INFO] MgmtConfInitializer - hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.hosts=*
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.groups=*
    [INFO] MgmtConfInitializer - datanode is not set
    [INFO] MgmtConfInitializer - localhost.localdomain is NameNode
    [INFO] MgmtConfInitializer - MgmtConfInitializer: localhost.localdomain is Secondary NameNode
    [INFO] MgmtConfInitializer - localhost.localdomain is JobTracker
    [INFO] MgmtConfInitializer - localhost.localdomain is DataNode
    [INFO] MgmtConfInitializer - biginsights.home=/opt/ibm/biginsights
    [INFO] MgmtConfInitializer - biginsights.var=/var/ibm/biginsights
    [INFO] MgmtConfInitializer - mgmt.ssh.config=by_root_ssh
    [INFO] MgmtConfInitializer - mgmt.user=biadmin
    [INFO] MgmtConfInitializer - mgmt.group=biadmin
    [INFO] MgmtConfInitializer - biginsights.virtualnodes=null
    [INFO] HadoopConf - Hadoop conf saved to /root/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] Progress - Check cluster environment
    [INFO] Progress - 2%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] Deployer - scan all datanodes ... caculate free space of all attached disks
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sdb1
    [INFO] @localhost.localdomain - Free space of /dev/sdb1 is 47G
    [INFO] Deployer - (Avg dfs.data.dir capacity) * 0.1 = 5046586573
    [INFO] Deployer - reset dfs.datanode.du.reserved=5046586573
    [INFO] HadoopConf - Hadoop conf saved to /root/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] @localhost.localdomain - Check directories succeed.
    [INFO] Progress - 2%
    [INFO] PriorChecker - Directories Check - succeed
    [INFO] @localhost.localdomain - localhost.localdomain->127.0.0.1 : valid
    [INFO] PriorChecker - Hostname/ip check - succeed
    [INFO] @localhost.localdomain - 61616 : available
    [INFO] @localhost.localdomain - 9999 : available
    [INFO] @localhost.localdomain - 8009 : available
    [INFO] @localhost.localdomain - 50090 : available
    [INFO] @localhost.localdomain - 50070 : available
    [INFO] @localhost.localdomain - 54198 : available
    [INFO] @localhost.localdomain - 50030 : available
    [INFO] @localhost.localdomain - 60030 : available
    [INFO] @localhost.localdomain - 50075 : available
    [INFO] @localhost.localdomain - 50010 : available
    [INFO] @localhost.localdomain - 60010 : available
    [INFO] @localhost.localdomain - 9000 : available
    [INFO] @localhost.localdomain - 9001 : available
    [INFO] @localhost.localdomain - 10080 : available
    [INFO] @localhost.localdomain - 1527 : available
    [INFO] @localhost.localdomain - 1528 : available
    [INFO] @localhost.localdomain - 8080 : available
    [INFO] @localhost.localdomain - 10000 : available
    [INFO] @localhost.localdomain - 8280 : available
    [INFO] @localhost.localdomain - 50020 : available
    [INFO] @localhost.localdomain - 60020 : available
    [INFO] @localhost.localdomain - 60000 : available
    [INFO] @localhost.localdomain - 2181 : available
    [INFO] @localhost.localdomain - 1050 : available
    [INFO] @localhost.localdomain - 8200 : available
    [INFO] @localhost.localdomain - 6882 : available
    [INFO] @localhost.localdomain - 61613 : available
    [INFO] @localhost.localdomain - 9997 : available
    [INFO] @localhost.localdomain - 9998 : available
    [INFO] @localhost.localdomain - 1099 : available
    [INFO] @localhost.localdomain - 2001 : available
    [INFO] @localhost.localdomain - 4201 : available
    [INFO] PriorChecker - Ports check - succeed
    [INFO] @localhost.localdomain - SELINUX - disabled : ok
    [INFO] @localhost.localdomain - OS - Red Hat Enterprise Linux Server release 5.5 (Tikanga) Kernel on an m : supported
    [INFO] Progress - 3%
    [INFO] PriorChecker - Server configuration check - succeed
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sdb1
    [INFO] @localhost.localdomain - Free space of /dev/sdb1 is 47G
    [INFO] @localhost.localdomain - Check minimal disk space requirement for biginsights installation
    [INFO] @localhost.localdomain - Check disk usage of BIGINSIGHTS_HOME : /opt/ibm/biginsights
    [INFO] @localhost.localdomain - Free space of /dev/sda1 is 36G
    [INFO] @localhost.localdomain - Free space - 38654705664(B) > biginsights.minimal.install.size + totalfree * 0.1 - 9234179686(B) : ok
    [INFO] PriorChecker - Disk space check - succeed
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sdb1
    [INFO] @localhost.localdomain - Free space of /dev/sdb1 is 47G
    [INFO] @localhost.localdomain - Check datanode disk space requirement
    [INFO] @localhost.localdomain - Free space - 50465865728(B) > dfs.datanode.du.reserved - 5046586573(B) : ok
    [INFO] PriorChecker - Datanode disk space check - succeed
    [INFO] @localhost.localdomain - Program - scp,zip,bash,tar,ssh,unzip : installed
    [INFO] PriorChecker - Requreid software/libraries Check - succeed
    [INFO] NCDaemon - Send message .... 10
    [INFO] NCDaemon - Send message .... 10
    [INFO] NCDaemon - Send message .... 10
    [INFO] @localhost.localdomain - localhost.localdomain >> 9000 : Good
    [INFO] @localhost.localdomain - localhost.localdomain >> 9001 : Good
    [INFO] @localhost.localdomain - localhost.localdomain >> 50010 : Good
    [INFO] PriorChecker - Internal connectivity Check - succeed
    [INFO] Check the current user
    biadmin
    RHEL
    Added existing user biadmin to group biadmin
    [INFO] Running as biadmin, /home/biadmin/__biginsights_install/installer/bin/install.sh
    [INFO] Distribution Vendor : ibm
    [INFO] Extract Java for biadmin...
    [INFO] Check the current user
    biadmin
    [INFO] User login shell : BASH
    [INFO] Using... BIGINSIGHTS_HOME: /opt/ibm/biginsights
    [INFO] Using... BIGINSIGHTS_VAR: /var/ibm/biginsights
    [INFO] Using... SSH CONFIG MODE: by_root_ssh
    [INFO] Using... Biginsights administrator: biadmin
    [INFO] Progress - BigInsights installation response file type: install
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] Progress - Installing HDM
    [INFO] Progress - 3%
    [INFO] Progress - Preparing JDK package
    [INFO] Progress - 4%
    [INFO] JDK at /opt/ibm/biginsights/hdm/jdk
    [INFO] Progress - Preparing Hadoop package
    [INFO] Progress - 6%
    [INFO] Hadoop at /opt/ibm/biginsights/hdm/IHC
    [INFO] Progress - Configuring password-less SSH
    [INFO] Progress - 8%
    [INFO] HadoopMgmtCmdline - Running configAccountAndSsh /home/biadmin/__biginsights_install/installer/bin/../../artifacts
    [INFO] Cluster - Setup biginsights admin user/group, setup passwordless SSH
    [INFO] @localhost.localdomain - OK, password-less SSH has setup.
    [INFO] Cluster - Skip configure password-less as already setup
    [INFO] DupHostDefender - Add other known names to ~/.ssh/known_hosts.
    [WARN] DupHostDefender - Failed to add localhost.localdomain to ~/.ssh/known_hosts
    [WARN] DupHostDefender - Failed to add 127.0.0.1 to ~/.ssh/known_hosts
    [INFO] Progress - 14%
    [INFO] Install as HADOOP_USER biadmin
    [INFO] Progress - Checking directories permission
    [INFO] Progress - 17%
    [INFO] HadoopMgmtCmdline - Running configDirs
    [INFO] @localhost.localdomain -
    [INFO] Progress - 18%
    [INFO] HadoopMgmtCmdline - Running check32or64
    [INFO] Progress - Deploying IBM Hadoop Cluster
    [INFO] Progress - 18%
    [INFO] HadoopMgmtCmdline - Running deployForceAll
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Reset includes (dfs.hosts)
    [INFO] HadoopConf - Reset includes (mapred.hosts)
    [INFO] HadoopConf - Auto set mapred.fairscheduler.allocation.file=/opt/ibm/biginsights/hadoop-conf/fair-scheduler.xml
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hdm, force
    [INFO] Progress - 25%
    [ERROR] @localhost.localdomain - java.lang.Exception
    at com.ibm.xap.mgmt.util.TaskRunner.run(TaskRunner.java:65)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.deploy(HadoopMgmtFacade.java:373)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.deploy(HadoopMgmtFacade.java:398)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.deployForceAll(HadoopMgmtFacade.java:348)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtCmdline.run(HadoopMgmtCmdline.java:108)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtCmdline.main(HadoopMgmtCmdline.java:36)

    java.io.IOException: Failed to deploy hdm to localhost.localdomain:/opt/ibm/biginsights/.hdm-stub, please check login shell on this node, if it s not bash, try edit /etc/passwd or use chsh to change to bash Detail error: null
    at com.ibm.xap.mgmt.hdm.DeployTask.deployPackage(DeployTask.java:200)
    at com.ibm.xap.mgmt.hdm.DeployTask.doTask(DeployTask.java:80)
    at com.ibm.xap.mgmt.util.Task.run(Task.java:77)
    at com.ibm.xap.mgmt.util.TaskRunner$1.run(TaskRunner.java:52)
    com.ibm.xap.mgmt.DeployException: -- localhost.localdomain=255 (java.io.IOException: Failed to deploy hdm to localhost.localdomain:/opt/ibm/biginsights/.hdm-stub, please check login shell on this node, if it s not bash, try edit /etc/passwd or use chsh to change to bash Detail error: null
    at com.ibm.xap.mgmt.hdm.DeployTask.deployPackage(DeployTask.java:200)
    at com.ibm.xap.mgmt.hdm.DeployTask.doTask(DeployTask.java:80)
    at com.ibm.xap.mgmt.util.Task.run(Task.java:77)
    at com.ibm.xap.mgmt.util.TaskRunner$1.run(TaskRunner.java:52)
    )
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.calculateResult(HadoopMgmtFacade.java:2072)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.calculateResult(HadoopMgmtFacade.java:2021)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.deploy(HadoopMgmtFacade.java:374)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.deploy(HadoopMgmtFacade.java:398)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtFacade.deployForceAll(HadoopMgmtFacade.java:348)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtCmdline.run(HadoopMgmtCmdline.java:108)
    at com.ibm.xap.mgmt.hdm.HadoopMgmtCmdline.main(HadoopMgmtCmdline.java:36)
    [FATAL] deploy failed

  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-19T02:02:18Z  

    Hi

    JVMSHRC559E Failed to create a directory "/tmp/javasharedresources" for the shared class cache
    JVMJ9VM015W Initialization error for library j9shr24(11): JVMJ9VM009E J9VMDllMain failed
    Could not create the Java virtual machine.

    It looks like the JVM failed to initialize the shared class cache in the /tmp/javasharedresources directory, most likely because of a permission issue. Please remove the /tmp/javasharedresources directory completely from your server, and make sure the /tmp directory has 777 permissions.
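    For example (run as root; a sketch of the steps above, where 1777 keeps the sticky bit that /tmp normally carries in addition to being world-writable):

```shell
# Remove the stale shared-class-cache directory so the JVM can
# recreate it, then restore open permissions on /tmp.
rm -rf /tmp/javasharedresources
chmod 1777 /tmp        # 1777 = world-writable plus sticky bit
ls -ld /tmp            # expect permissions like drwxrwxrwt
```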

    Thanks

  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-19T02:16:08Z  
    • JRV
    • ‏2013-05-18T16:16:45Z

    I am also getting a similar error, but I checked the shell and it is bash, using the following command:

    bash-3.2# grep -i "biadmin" /etc/passwd
    biadmin:x:501:501::/home/biadmin:/bin/bash
     

    I am using BigInsights Basic Edition 2.0. Could you please help me here? Thanks in advance.


    Hi JRV

    I need more information:

    1. What is the bash version? bash --version

    2. What is the OS version? cat /etc/issue

    3. Can you paste your /etc/hosts file here?

    4. Can you run ts=`date +%Y%m%d%H%M.%S` && echo $ts on the server and check that you get output like 201305181908.59?
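    The four checks above can be run in one shot (read-only; nothing is modified):

```shell
bash --version | head -n 1                 # 1. bash version
cat /etc/issue                             # 2. OS version
cat /etc/hosts                             # 3. hosts file
ts=$(date +%Y%m%d%H%M.%S) && echo "$ts"    # 4. timestamp, e.g. 201305181908.59
```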

  • JRV
    JRV
    4 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-19T12:47:04Z  

    Hi JRV

    I need more information:

    1. What is the bash version? bash --version

    2. What is the OS version? cat /etc/issue

    3. Can you paste your /etc/hosts file here?

    4. Can you run ts=`date +%Y%m%d%H%M.%S` && echo $ts on the server and check that you get output like 201305181908.59?

    Thanks YangWeiWei, please find the requested info below.

    1. What is the bash version? bash --version

    GNU bash, version 3.2.25(1)-release (x86_64-redhat-linux-gnu)
    Copyright (C) 2005 Free Software Foundation, Inc.
     

    2. What is the OS version? cat /etc/issue

    Red Hat Enterprise Linux Server release 5.5 (Tikanga)
    Kernel \r on an \m
     

    3. Can you paste your /etc/hosts file here?

    127.0.0.1 localhost.localdomain localhost
    127.0.0.1    localhost.localdomain    localhost
    192.168.59.174    localhost.localdomain    localhost

    4. Can you run ts=`date +%Y%m%d%H%M.%S` && echo $ts on the server and check that you get output like 201305181908.59?

    201305190840.27
     

     

    Please let me know if you need anything else.

  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-20T05:22:02Z  

    Hi JRV

    I need more information:

    1. What is the bash version? bash --version

    2. What is the OS version? cat /etc/issue

    3. Can you paste your /etc/hosts file here?

    4. Can you run ts=`date +%Y%m%d%H%M.%S` && echo $ts on the server and check that you get output like 201305181908.59?

    Thanks for the info. This is a lower version of bash than we normally use; if possible, please upgrade bash to version 4+. Before that, can you also check whether cksum is installed on this machine? Try cksum --version
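    For example (read-only checks):

```shell
# Report cksum availability and the installed bash major version;
# a major version below 4 matches the concern above.
command -v cksum >/dev/null && cksum --version | head -n 1 || echo "cksum not found"
bash -c 'echo "bash major version: ${BASH_VERSINFO[0]}"'
```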

  • JRV
    JRV
    4 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-20T17:09:41Z  

    Thanks for the info. This is a lower version of bash than we normally use; if possible, please upgrade bash to version 4+. Before that, can you also check whether cksum is installed on this machine? Try cksum --version

    Thanks YangWeiWei,

    I am not sure how to update bash, as I downloaded this Linux VM from the BigDataUniversity link. When I tried to use the VM's software updater, it asked me to enter an installation code, which I don't have. How can I update bash directly? I am not a Linux expert, please guide me.

     

    Thanks in advance.

    Here is the cksum info:

     

    bash-3.2# cksum --version
    cksum (coreutils) 5.97
    Copyright (C) 2006 Free Software Foundation, Inc.
    This is free software.  You may redistribute copies of it under the terms of
    the GNU General Public License <http://www.gnu.org/licenses/gpl.html>.
    There is NO WARRANTY, to the extent permitted by law.

    Written by Q. Frank Xia.
     

  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-21T02:42:31Z  
    • JRV
    • ‏2013-05-20T17:09:41Z

    Thanks YangWeiWei,

    I am not sure how to update bash, as I downloaded this Linux VM from the BigDataUniversity link. When I tried to use the VM's software updater, it asked me to enter an installation code, which I don't have. How can I update bash directly? I am not a Linux expert, please guide me.

     

    Thanks in advance.

    Here is the cksum info:

     

    bash-3.2# cksum --version
    cksum (coreutils) 5.97
    Copyright (C) 2006 Free Software Foundation, Inc.
    This is free software.  You may redistribute copies of it under the terms of
    the GNU General Public License <http://www.gnu.org/licenses/gpl.html>.
    There is NO WARRANTY, to the extent permitted by law.

    Written by Q. Frank Xia.
     

    Interesting ... if you downloaded the VM from BigDataUniversity, it should already have BigInsights installed. Why are you still trying to install it?

  • JRV
    JRV
    4 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-22T18:45:18Z  


    I am following their tutorial and I think it is not installed; I will check one more time. Otherwise I will have to try another flavor of Linux.

     

  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-05-24T02:21:52Z  
    • JRV
    • ‏2013-05-22T18:45:18Z


     

    OK, let me know if there are still any issues.

  • aaronglg
    aaronglg
    14 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-10-17T16:18:45Z  
    Can you check whether the server is using the bash shell? The BigInsights installer only supports bash. Try cat /etc/passwd | grep "$biginsights_admin_username"

    Hi YangWeiWei,

     

    I am installing the Quick Start Edition, but I still face a problem. The log shows 100% progress; however, there are still some problems at the end of it. You can see the details in the log.

    Problems coming with it:

    1. I can open the console at localhost:8080, but it shows that Hive is not running and is unavailable.

    2. When I try to enable the Eclipse IDE, it does not work. The Eclipse version is Eclipse IDE 3.6.1 for Java.

     

    I installed with the silent installation. Is my problem fatal? How can I fix it?

     

    Thank you; I hope to hear from you soon. My email address is aaronglg@gmail.com

  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-10-17T16:32:13Z  
    • aaronglg
    • ‏2013-10-17T16:18:45Z


    I saw the following error in the log:

    Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 500

    That most likely means the gid of your BigInsights administrator is misconfigured. Could you run:

    id bhall
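    A sketch of how that check can be reproduced on any node (shown with the current user as a stand-in for the BigInsights admin account; substitute your own admin user):

```shell
# Verify that the user's primary gid resolves to a named group entry;
# "Can't get group information for 500" means this lookup fails.
user=$(id -un)            # stand-in for the BigInsights admin user
gid=$(id -g "$user")
if getent group "$gid" >/dev/null; then
    echo "gid $gid resolves to group: $(getent group "$gid" | cut -d: -f1)"
else
    echo "gid $gid has no group entry on this host"
fi
```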

  • aaronglg
    aaronglg
    14 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-10-17T17:13:46Z  


    Hello, sir,

    I have also asked DiLi; below are his questions and the output I got. I hope you can help.

    Q: Could you log on to the BI Console node, su to the BigInsights administrator user (the Linux user that owns $BIGINSIGHTS_HOME), and then manually run the following from the CLI to see if they finish successfully?
    healthcheck.sh hbase
    healthcheck.sh hadoop

    Also, please attach the install xml file used for your silent install, and the hadoop namenode and job tracker logs.

     

    A:

    Thank you for your reply. I logged on to the console, ran start.sh, and then ran healthcheck.sh hadoop; it shows the following:

    [bhall@w520-cardwell bin]$ healthcheck.sh hadoop
    [INFO] DeployCmdline - [ IBM InfoSphere BigInsights QuickStart Edition ]
    [INFO] Progress - Health check hadoop
    [INFO] Deployer - Running Hadoop terasort example
    [INFO] Progress - 100%
    [ERROR] DeployManager -
    [ERROR] DeployManager - OPERATION ERROR -- Health check [hadoop]:
    [ERROR] DeployManager - -------------------------------------------------------
    [ERROR] DeployManager - hadoop failed
    java.io.IOException: exit code: 2 -- "/var/opt/ibm/builds/myBIInstall/biginsights/hdm/bin/hdm" "checkdeploy"
    [INFO] Progress - Checking Hadoop cluster started
    [INFO] DeployCmdline - [ IBM InfoSphere BigInsights QuickStart Edition ]
    [INFO] Progress - Start hadoop
    [INFO] @localhost - namenode already running, pid 3982
    [INFO] @localhost - secondarynamenode already running, pid 4231
    [INFO] @localhost - datanode already running, pid 4384
    [INFO] Progress - 50%
    [INFO] Deployer - Waiting for Namenode to exit safe mode...
    [INFO] Deployer - HDFS cluster started successfully
    [INFO] @localhost - jobtracker already running, pid 5000
    [INFO] @localhost - tasktracker already running, pid 5210
    [INFO] Progress - 100%
    [INFO] Deployer - MapReduce cluster started successfully
    [INFO] DeployManager - Start; SUCCEEDED components: [hadoop]; Consumes : 2657ms
    [INFO] Progress - Waiting for exit of safe mode
    [INFO] HadoopMgmtCmdline - Running safemode wait
    [INFO] Progress - Running terasort example
    >> /var/opt/ibm/builds/myBIInstall/biginsights/IHC/bin/hadoop dfs -rmr /hdm-tera-input /hdm-tera-output /hdm-tera-report
    rmr: cannot remove /hdm-tera-input: No such file or directory.
    rmr: cannot remove /hdm-tera-output: No such file or directory.
    rmr: cannot remove /hdm-tera-report: No such file or directory.
    >> /var/opt/ibm/builds/myBIInstall/biginsights/IHC/bin/hadoop jar /var/opt/ibm/builds/myBIInstall/biginsights/IHC/hadoop-examples-1.1.1.jar teragen -Dmapred.map.tasks=1 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m 10000 /hdm-tera-input
    Generating 10000 using 1 maps with step of 10000
    13/10/17 12:49:08 INFO mapred.JobClient: Running job: job_201310171115_0001
    13/10/17 12:49:09 INFO mapred.JobClient:  map 0% reduce 0%
    13/10/17 12:49:09 INFO mapred.JobClient: Task Id : attempt_201310171115_0001_m_000002_0, Status : FAILED
    Error initializing attempt_201310171115_0001_m_000002_0:
    java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 500 - Success.

        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
        at java.lang.Thread.run(Thread.java:738)
    Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
        at org.apache.hadoop.util.Shell.run(Shell.java:182)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
        ... 8 more

    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000002_0&filter=stdout
    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000002_0&filter=stderr
    13/10/17 12:49:09 INFO mapred.JobClient: Task Id : attempt_201310171115_0001_m_000002_1, Status : FAILED
    Error initializing attempt_201310171115_0001_m_000002_1:
    java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 500 - Success.

        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
        at java.lang.Thread.run(Thread.java:738)
    Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
        at org.apache.hadoop.util.Shell.run(Shell.java:182)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
        ... 8 more

    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000002_1&filter=stdout
    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000002_1&filter=stderr
    13/10/17 12:49:09 INFO mapred.JobClient: Task Id : attempt_201310171115_0001_m_000002_2, Status : FAILED
    Error initializing attempt_201310171115_0001_m_000002_2:
    java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 500 - Success.

        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
        at java.lang.Thread.run(Thread.java:738)
    Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
        at org.apache.hadoop.util.Shell.run(Shell.java:182)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
        ... 8 more

    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000002_2&filter=stdout
    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000002_2&filter=stderr
    13/10/17 12:49:09 INFO mapred.JobClient: Task Id : attempt_201310171115_0001_m_000001_0, Status : FAILED
    Error initializing attempt_201310171115_0001_m_000001_0:
    java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 500 - Success.

        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
        at java.lang.Thread.run(Thread.java:738)
    Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
        at org.apache.hadoop.util.Shell.run(Shell.java:182)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
        ... 8 more

    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000001_0&filter=stdout
    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000001_0&filter=stderr
    13/10/17 12:49:09 INFO mapred.JobClient: Task Id : attempt_201310171115_0001_m_000001_1, Status : FAILED
    Error initializing attempt_201310171115_0001_m_000001_1:
    java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 500 - Success.

        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
        at java.lang.Thread.run(Thread.java:738)
    Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
        at org.apache.hadoop.util.Shell.run(Shell.java:182)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
        ... 8 more

    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000001_1&filter=stdout
    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000001_1&filter=stderr
    13/10/17 12:49:09 INFO mapred.JobClient: Task Id : attempt_201310171115_0001_m_000001_2, Status : FAILED
    Error initializing attempt_201310171115_0001_m_000001_2:
    java.io.IOException: Job initialization failed (24) with output: Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 500 - Success.

        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:193)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1331)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1306)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1221)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2581)
        at java.lang.Thread.run(Thread.java:738)
    Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
        at org.apache.hadoop.util.Shell.run(Shell.java:182)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
        at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(LinuxTaskController.java:186)
        ... 8 more

    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000001_2&filter=stdout
    13/10/17 12:49:09 WARN mapred.JobClient: Error reading task outputhttp://localhost:50060/tasklog?plaintext=true&attemptid=attempt_201310171115_0001_m_000001_2&filter=stderr
    13/10/17 12:49:09 INFO mapred.JobClient: Job complete: job_201310171115_0001
    13/10/17 12:49:09 INFO mapred.JobClient: Counters: 4
    13/10/17 12:49:09 INFO mapred.JobClient:   Job Counters
    13/10/17 12:49:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=0
    13/10/17 12:49:09 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
    13/10/17 12:49:09 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
    13/10/17 12:49:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
    13/10/17 12:49:09 INFO mapred.JobClient: Job Failed: JobCleanup Task Failure, Task: task_201310171115_0001_m_000001
    java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1327)
        at org.apache.hadoop.examples.terasort.TeraGen.run(TeraGen.java:352)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.terasort.TeraGen.main(TeraGen.java:357)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
    >> /var/opt/ibm/builds/myBIInstall/biginsights/IHC/bin/hadoop jar /var/opt/ibm/builds/myBIInstall/biginsights/IHC/hadoop-examples-1.1.1.jar terasort -Dmapred.reduce.tasks=4 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m -Dio.sort.record.percent=0.17 /hdm-tera-input /hdm-tera-output
    13/10/17 12:49:10 INFO terasort.TeraSort: starting
    13/10/17 12:49:10 INFO mapred.FileInputFormat: Total input paths to process : 0
    java.lang.ArithmeticException: divide by zero
        at org.apache.hadoop.examples.terasort.TeraInputFormat.writePartitionFile(TeraInputFormat.java:118)
        at org.apache.hadoop.examples.terasort.TeraSort.run(TeraSort.java:243)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.terasort.TeraSort.main(TeraSort.java:257)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
    >> /var/opt/ibm/builds/myBIInstall/biginsights/IHC/bin/hadoop jar /var/opt/ibm/builds/myBIInstall/biginsights/IHC/hadoop-examples-1.1.1.jar teravalidate -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m /hdm-tera-output /hdm-tera-report
    13/10/17 12:49:12 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/user/bhall/.staging/job_201310171115_0002
    13/10/17 12:49:12 ERROR security.UserGroupInformation: PriviledgedActionException as:bhall cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost:9000/hdm-tera-output
    org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost:9000/hdm-tera-output
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
        at org.apache.hadoop.examples.terasort.TeraInputFormat.getSplits(TeraInputFormat.java:209)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
        at org.apache.hadoop.examples.terasort.TeraValidate.run(TeraValidate.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.terasort.TeraValidate.main(TeraValidate.java:153)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
        at java.lang.reflect.Method.invoke(Method.java:611)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
    >> /var/opt/ibm/builds/myBIInstall/biginsights/IHC/bin/hadoop dfs -ls /hdm-tera-report
    ls: Cannot access /hdm-tera-report: No such file or directory.
    >> /var/opt/ibm/builds/myBIInstall/biginsights/IHC/bin/hadoop dfs -rmr /hdm-tera-input /hdm-tera-output /hdm-tera-report
    Deleted hdfs://localhost:9000/hdm-tera-input
    rmr: cannot remove /hdm-tera-output: No such file or directory.
    rmr: cannot remove /hdm-tera-report: No such file or directory.
    [INFO] =============== Summary of Hadoop Installation ===============
    [INFO] TeraSort ..................................Failed

        at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:101)
        at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:41)
        at com.ibm.xap.mgmt.hdm.HadoopDeployer.healthCheck(HadoopDeployer.java:816)
        at com.ibm.xap.mgmt.DeployManager$HealthCheckThread.work(DeployManager.java:2657)
        at com.ibm.xap.mgmt.DeployManager$WorkerThread.run(DeployManager.java:2387)
    [ERROR] DeployManager - Health check; SUCCEEDED components: []; FAILED components: [hadoop]; Consumes : 17541ms
    Error exit.

     

    Then I ran healthcheck.sh hbase; this one seems OK. It shows the following:

    [bhall@w520-cardwell bin]$  healthcheck.sh hbase
    [INFO] DeployCmdline - [ IBM InfoSphere BigInsights QuickStart Edition ]
    [INFO] Progress - Health check hbase
    [INFO] Deployer - Try to start hbase if hbase service is stopped...
    [INFO] Deployer - Double check whether hbase is started successfully...
    [INFO] @localhost - hbase-master(active) started, pid 6135
    [INFO] @localhost - hbase-regionserver started, pid 6277
    [INFO] Deployer - hbase service started
    [INFO] Deployer - hbase service is healthy
    [INFO] Progress - 100%
    [INFO] DeployManager - Health check; SUCCEEDED components: [hbase]; Consumes : 25317ms

     

    You can check my XML file in the attachment.

  • aaronglg
    aaronglg
    14 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-10-17T19:30:19Z  


    Hi Weiwei,

    I have tried it; here is the info:

    uid=500(bhall) gid=500(bhall) groups=500(bhall),10(wheel),503(shadow)

     

    Thank you!

  • aaronglg
    aaronglg
    14 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-10-18T01:57:07Z  


    Hello, I checked the gid and it really is 500 for bhall.

     

  • aaronglg
    aaronglg
    14 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-10-18T13:36:58Z  


    Hi sir, I created a user biadmin with group id 168, but I still cannot install Hive.

    I installed with the installation panel this time. The progress reaches 100%, but it ends with these errors:

    [ERROR] DeployManager -
    [ERROR] DeployManager - hadoop failed

     

    The log is in the attachment; it shows the same problem:

    Reading task controller config from /var/bi-task-controller-conf/taskcontroller.cfg
    Can't get group information for 168 - Success.

    I opened that file and see this:

    mapred.tasktracker.tasks.sleeptime-before-sigkill=#sleep time before sig kill is to be sent to process group after sigterm is sent. Should be in seconds
    hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
    mapred.local.dir=/hadoop/mapred/local
    mapreduce.tasktracker.group=168
    min.user.id=100

    And I checked the group ID for biadmin; it looks like this:

    uid=503(biadmin) gid=168(biadmin) groups=168(biadmin),503(shadow)


     

    What can I do? I am installing a single node, and the server is localhost; could that be the cause?

    Do I need to change the group ID to 503 (shadow)?

    Thank you.
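    The failing lookup above can be checked directly against the config keys shown; here is a hedged sketch that uses a temporary sample file (not the real /var/bi-task-controller-conf/taskcontroller.cfg) so it is safe to run anywhere:

```shell
# Parse mapreduce.tasktracker.group from a sample taskcontroller.cfg
# and check whether that numeric gid resolves via getent.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
mapreduce.tasktracker.group=168
min.user.id=100
EOF
tg=$(grep '^mapreduce.tasktracker.group=' "$cfg" | cut -d= -f2)
echo "configured tasktracker group: $tg"
if getent group "$tg" >/dev/null; then
    echo "group $tg resolves on this host"
else
    echo "group $tg has no entry -- matches the installer error"
fi
rm -f "$cfg"
```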

    Attachments

  • aaronglg
    aaronglg
    14 Posts

    Re: BigInsight Basic edition installation error: Hadoop, Hive failed

    ‏2013-10-18T14:48:30Z  


    Hi sir, thank you, I have solved that problem. But I still have an issue after the successful installation.

    In the console, Hive still does not work:

    Hive Node:
    localhost:10000
    Hive Node Status:
    Unavailable
    Hive Node Process ID:
    88486
    Hive Web Interface:
    localhost:9999/hwi
    Hive Web Interface Status:
    Unavailable
    Hive Web Interface Process ID:
    88182
    JDBC URL:  

    Can you help me with it?
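    For the "Unavailable" statuses above, one quick check is whether anything is actually listening on the two Hive ports (10000 for the Hive server, 9999 for the web interface); this sketch uses bash's /dev/tcp redirection:

```shell
# Probe the Hive server and Hive Web Interface ports on localhost.
# A closed port here would explain an "Unavailable" status in the console.
for port in 10000 9999; do
    if (exec 3<>"/dev/tcp/localhost/$port") 2>/dev/null; then
        echo "port $port: open"
    else
        echo "port $port: closed -- nothing is listening"
    fi
done
```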