Topic
  • 11 replies
  • Latest Post - ‏2014-01-03T22:51:00Z by 6V3M_sab_chandra
H02F_Mandy_Ng
1 Post

Pinned topic Problem installing BigInsight basic edition

‏2012-07-17T10:17:01Z |
Hi,

I followed the Big Data University video to install BigInsights Basic Edition on my MacBook. However, I got this error message, which I need help resolving:

Extracting Java ....
Java extraction complete, using JAVA_HOME=/Users/Desktop/biginsights-basic-linux64_b20120604_2018/_jvm/
sed: 1: "./installer-console/var ...": invalid command code .
Verifying port 8300 availability
port 8300 available
-n Starting BigInsights Installer
-n .
-n .
-n .
-n .
-n .
-n .
tail: /Users/Desktop/biginsights-basic-linux64_b20120604_2018/installer-console/var/log/server.out: No such file or directory

Can anyone advise on this?

Thanks very much.

Best regards,
Mandy
Updated on 2013-01-05T04:22:34Z at 2013-01-05T04:22:34Z by YangWeiWei
  • SystemAdmin
    603 Posts

    Re: Problem installing BigInsight basic edition

    ‏2012-07-20T16:56:00Z  
    Hi Mandy,

    We only support the following operating systems for installing BigInsights:
    - Red Hat Enterprise Linux
    - SUSE Linux Enterprise Server

    Thank you,

    Zach
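
    As an aside on the original macOS output: the `sed: 1: "...": invalid command code .` error is the typical symptom of a GNU-sed-style in-place edit running under BSD sed, which macOS ships, so the installer's shell scripts fail before the console ever starts. A minimal sketch of the difference (the /tmp path is illustrative, not from the installer):

    ```shell
    # BSD sed (macOS) treats the text after -i as a backup suffix, so the
    # GNU-style invocation below fails there with "invalid command code":
    #   sed -i 's/foo/bar/' /tmp/sedtest.txt
    # Supplying an explicit suffix works with both implementations.
    printf 'foo\n' > /tmp/sedtest.txt
    sed -i.bak 's/foo/bar/' /tmp/sedtest.txt
    cat /tmp/sedtest.txt
    ```

    This is why the install only proceeds on the supported Linux distributions, whose /bin/sed is GNU sed.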
  • SystemAdmin
    603 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-01-03T08:32:08Z  
    I'm trying this on RHEL 6 and hit the same problem:

    Extracting Java ....
    Java extraction complete, using JAVA_HOME=/root/biginsights-basic-linux64_b20121203_1915/_jvm/
    Verifying port 8300 availability
    port 8300 available
    Starting BigInsights Installer ......tail: cannot open `/root/biginsights-basic-linux64_b20121203_1915/installer-console/var/log/server.out' for reading: No such file or directory

    Please advise.
  • YangWeiWei
    72 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-01-05T04:22:34Z  
    This looks like a permission issue. Since you extracted the installer as root, are you running it as the root user?
  • MMakati
    4 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-05-19T06:49:39Z  

    Same problem here. When I go to that directory, there are only three files in the .../log folder: client-log4j.properties, deployer-log4j.properties, and server-log4j.properties.

    I cannot see the server.out file. Thanks.

  • YangWeiWei
    72 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-05-19T13:50:56Z  

    Which user did you use to extract the BigInsights package? Please run start.sh as that same user.
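
    A sketch of that check, comparing the owner of the extracted tree against the current user before launching start.sh (/tmp/bi-extract-demo stands in for the real biginsights-basic-linux64_* directory):

    ```shell
    # Compare the owner of the extracted installer directory with the
    # user about to run start.sh; a mismatch explains the missing
    # installer-console/var/log/server.out file.
    dir=/tmp/bi-extract-demo
    mkdir -p "$dir"
    owner=$(stat -c '%U' "$dir")   # GNU coreutils stat; on macOS: stat -f '%Su'
    me=$(id -un)
    if [ "$owner" = "$me" ]; then
        echo "ok: run start.sh as $me"
    else
        echo "mismatch: extracted by $owner, but running as $me"
    fi
    ```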

  • MMakati
    4 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-05-20T09:00:01Z  

     

    IUI0005E: The installation failed with fatal error: System pre-check fails, some prerequisite is not fulfilled. Check log for detail..

    and the installation log file:

    [INFO] Launching installer back end
    [INFO] Running as biadmin, /home/biadmin/Desktop/sharing/biginsights-basic-linux64_b20121203_1915/installer/bin/install.sh simple-fullinstall.xml
    [INFO] Distribution Vendor : ibm
    [INFO] Progress - Initializing install properties
    [INFO] Progress - 0%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] MgmtConfInitializer - Biginsights nodes [localhost.localdomain]
    [INFO] MgmtConfInitializer - install mode : install
    [INFO] MgmtConfInitializer - distro vendor : ibm
    [INFO] MgmtConfInitializer - dfs.name.dir=/hadoop/hdfs/name
    [INFO] MgmtConfInitializer - fs.checkpoint.dir=/hadoop/hdfs/namesecondary
    [INFO] MgmtConfInitializer - default dfs.data.dir=/hadoop/hdfs/data
    [INFO] MgmtConfInitializer - mapred.system.dir=/hadoop/mapred/system
    [INFO] MgmtConfInitializer - mapred.local.dir=/hadoop/mapred/local
    [INFO] MgmtConfInitializer - hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
    [INFO] MgmtConfInitializer - datanode is not set
    [INFO] MgmtConfInitializer - localhost.localdomain is NameNode
    [INFO] MgmtConfInitializer - MgmtConfInitializer: localhost.localdomain is Secondary NameNode
    [INFO] MgmtConfInitializer - localhost.localdomain is JobTracker
    [INFO] MgmtConfInitializer - localhost.localdomain is DataNode
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.hosts=*
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.groups=*
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] MgmtConfInitializer - Detect user group: biadmin
    [INFO] MgmtConfInitializer - biginsights.home=/opt/ibm/biginsights
    [INFO] MgmtConfInitializer - biginsights.var=/var/ibm/biginsights
    [INFO] MgmtConfInitializer - mgmt.ssh.config=assumed
    [INFO] MgmtConfInitializer - mgmt.user=biadmin
    [INFO] MgmtConfInitializer - mgmt.group=biadmin
    [INFO] MgmtConfInitializer - biginsights.virtualnodes=null
    [INFO] HadoopConf - Hadoop conf saved to /home/biadmin/Desktop/sharing/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] Progress - Check cluster environment
    [INFO] Progress - 2%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] Deployer - scan all datanodes ... caculate free space of all attached disks
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 15G
    [INFO] Deployer - (Avg dfs.data.dir capacity) * 0.1 = 1610612736
    [INFO] Deployer - reset dfs.datanode.du.reserved=1610612736
    [INFO] HadoopConf - Hadoop conf saved to /home/biadmin/Desktop/sharing/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] @localhost.localdomain - Check directories succeed.
    [INFO] Progress - 2%
    [INFO] PriorChecker - Directories Check - succeed
    [INFO] @localhost.localdomain - localhost.localdomain->127.0.0.1 : valid
    [INFO] PriorChecker - Hostname/ip check - succeed
    [INFO] @localhost.localdomain - 61616 : available
    [INFO] @localhost.localdomain - 9999 : available
    [INFO] @localhost.localdomain - 8009 : available
    [INFO] @localhost.localdomain - 50090 : available
    [INFO] @localhost.localdomain - 50070 : available
    [INFO] @localhost.localdomain - 54198 : available
    [INFO] @localhost.localdomain - 50030 : available
    [INFO] @localhost.localdomain - 60030 : available
    [INFO] @localhost.localdomain - 50075 : available
    [INFO] @localhost.localdomain - 50010 : available
    [INFO] @localhost.localdomain - 60010 : available
    [INFO] @localhost.localdomain - 9000 : available
    [INFO] @localhost.localdomain - 9001 : available
    [INFO] @localhost.localdomain - 10080 : available
    [INFO] @localhost.localdomain - 1527 : available
    [INFO] @localhost.localdomain - 1528 : available
    [INFO] @localhost.localdomain - 8080 : available
    [INFO] @localhost.localdomain - 10000 : available
    [INFO] @localhost.localdomain - 8280 : available
    [INFO] @localhost.localdomain - 50020 : available
    [INFO] @localhost.localdomain - 60020 : available
    [INFO] @localhost.localdomain - 60000 : available
    [INFO] @localhost.localdomain - 2181 : available
    [INFO] @localhost.localdomain - 1050 : available
    [INFO] @localhost.localdomain - 8200 : available
    [INFO] @localhost.localdomain - 6882 : available
    [INFO] @localhost.localdomain - 61613 : available
    [INFO] @localhost.localdomain - 9997 : available
    [INFO] @localhost.localdomain - 9998 : available
    [INFO] @localhost.localdomain - 1099 : available
    [INFO] @localhost.localdomain - 2001 : available
    [INFO] @localhost.localdomain - 4201 : available
    [INFO] PriorChecker - Ports check - succeed
    [INFO] Progress - 3%
    [ERROR] PriorChecker - Server configuration check - failed on localhost.localdomain
    [ERROR] PriorChecker - 255 -- com.ibm.xap.mgmt.ConfigurationException: selinux configuration not compatible - SELINUX=enforcing
    at com.ibm.xap.mgmt.hdm.ServerConfigurationTask.checkSELinuxIsDisabled(ServerConfigurationTask.java:251)
    at com.ibm.xap.mgmt.hdm.ServerConfigurationTask.doTask(ServerConfigurationTask.java:134)
    at com.ibm.xap.mgmt.util.Task.run(Task.java:77)
    at com.ibm.xap.mgmt.util.TaskRunner$1.run(TaskRunner.java:52)
     
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 15G
    [INFO] @localhost.localdomain - Check minimal disk space requirement for biginsights installation
    [INFO] @localhost.localdomain - Check disk usage of BIGINSIGHTS_HOME : /opt/ibm/biginsights
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 15G
    [INFO] @localhost.localdomain - Free space - 16106127360(B) > biginsights.minimal.install.size + totalfree * 0.1 - 6979321856(B) : ok
    [INFO] PriorChecker - Disk space check - succeed
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 15G
    [INFO] @localhost.localdomain - Check datanode disk space requirement
    [INFO] @localhost.localdomain - Free space - 16106127360(B) > dfs.datanode.du.reserved - 1610612736(B) : ok
    [INFO] PriorChecker - Datanode disk space check - succeed
    [INFO] @localhost.localdomain - Program - scp,zip,bash,tar,ssh,unzip : installed
    [INFO] PriorChecker - Requreid software/libraries Check - succeed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [INFO] PriorChecker - Internal connectivity Check - succeed
    [FATAL] System pre-check fails, some prerequisite is not fulfilled. Check log for detail.

     

    and it says localhost.localdomain => status = error.

    This is the problem that I got :( I hope you can help me to solve this one.

  • YangWeiWei
    72 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-05-20T09:47:38Z  

    [ERROR] PriorChecker - 255 -- com.ibm.xap.mgmt.ConfigurationException: selinux configuration not compatible - SELINUX=enforcing

    Please disable SELinux on this server. The procedure is:

    Edit the /etc/selinux/config file; you will see content like:


    # This file controls the state of SELinux on the system.
    # SELINUX= can take one of these three values:
    #       enforcing - SELinux security policy is enforced.
    #       permissive - SELinux prints warnings instead of enforcing.
    #       disabled - SELinux is fully disabled.
    SELINUX=enforcing  --------> replace enforcing with disabled, then reboot the machine
    # SELINUXTYPE= type of policy in use. Possible values are:
    #       targeted - Only targeted network daemons are protected.
    #       strict - Full SELinux protection.
    SELINUXTYPE=targeted
     
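
    The edit above can be reproduced non-interactively; the sketch below applies it to a scratch copy of the file (on the real server, run the sed line against /etc/selinux/config as root and then reboot; `setenforce 0` switches to permissive immediately without a reboot):

    ```shell
    # Flip SELINUX=enforcing to SELINUX=disabled on a scratch copy of
    # /etc/selinux/config, as the installer pre-check requires.
    cfg=/tmp/selinux-config-demo
    printf 'SELINUX=enforcing\nSELINUXTYPE=targeted\n' > "$cfg"
    sed -i.bak 's/^SELINUX=enforcing$/SELINUX=disabled/' "$cfg"
    grep '^SELINUX=' "$cfg"
    ```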
  • MMakati
    4 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-05-21T09:16:09Z  


    Thanks! That worked, but there is another problem.

    This is the log file:

    [INFO] Launching installer back end
    [INFO] Running as root, /home/gir/biginsights-basic-linux64_b20121203_1915/installer/bin/install.sh simple-fullinstall.xml
    [INFO] Distribution Vendor : ibm
    [INFO] Progress - Initializing install properties
    [INFO] Progress - 0%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] MgmtConfInitializer - Biginsights nodes [localhost.localdomain]
    [INFO] MgmtConfInitializer - install mode : install
    [INFO] MgmtConfInitializer - distro vendor : ibm
    [INFO] MgmtConfInitializer - dfs.name.dir=/hadoop/hdfs/name
    [INFO] MgmtConfInitializer - fs.checkpoint.dir=/hadoop/hdfs/namesecondary
    [INFO] MgmtConfInitializer - default dfs.data.dir=/hadoop/hdfs/data
    [INFO] MgmtConfInitializer - mapred.system.dir=/hadoop/mapred/system
    [INFO] MgmtConfInitializer - mapred.local.dir=/hadoop/mapred/local
    [INFO] MgmtConfInitializer - hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.hosts=*
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.groups=*
    [INFO] MgmtConfInitializer - datanode is not set
    [INFO] MgmtConfInitializer - localhost.localdomain is NameNode
    [INFO] MgmtConfInitializer - MgmtConfInitializer: localhost.localdomain is Secondary NameNode
    [INFO] MgmtConfInitializer - localhost.localdomain is JobTracker
    [INFO] MgmtConfInitializer - localhost.localdomain is DataNode
    [INFO] MgmtConfInitializer - biginsights.home=/opt/ibm/biginsights
    [INFO] MgmtConfInitializer - biginsights.var=/var/ibm/biginsights
    [INFO] MgmtConfInitializer - mgmt.ssh.config=by_root_ssh
    [INFO] MgmtConfInitializer - mgmt.user=biadmin
    [INFO] MgmtConfInitializer - mgmt.group=biadmin
    [INFO] MgmtConfInitializer - biginsights.virtualnodes=null
    [INFO] HadoopConf - Hadoop conf saved to /home/gir/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] Progress - Check cluster environment
    [INFO] Progress - 2%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] Deployer - scan all datanodes ... caculate free space of all attached disks
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] Deployer - (Avg dfs.data.dir capacity) * 0.1 = 5046586573
    [INFO] Deployer - reset dfs.datanode.du.reserved=5046586573
    [INFO] HadoopConf - Hadoop conf saved to /home/gir/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] @localhost.localdomain - Check directories succeed.
    [INFO] Progress - 2%
    [INFO] PriorChecker - Directories Check - succeed
    [INFO] @localhost.localdomain - localhost.localdomain->127.0.0.1 : valid
    [INFO] PriorChecker - Hostname/ip check - succeed
    [INFO] @localhost.localdomain - 61616 : available
    [INFO] @localhost.localdomain - 9999 : available
    [INFO] @localhost.localdomain - 8009 : available
    [INFO] @localhost.localdomain - 50090 : available
    [INFO] @localhost.localdomain - 50070 : available
    [INFO] @localhost.localdomain - 54198 : available
    [INFO] @localhost.localdomain - 50030 : available
    [INFO] @localhost.localdomain - 60030 : available
    [INFO] @localhost.localdomain - 50075 : available
    [INFO] @localhost.localdomain - 50010 : available
    [INFO] @localhost.localdomain - 60010 : available
    [INFO] @localhost.localdomain - 9000 : available
    [INFO] @localhost.localdomain - 9001 : available
    [INFO] @localhost.localdomain - 10080 : available
    [INFO] @localhost.localdomain - 1527 : available
    [INFO] @localhost.localdomain - 1528 : available
    [INFO] @localhost.localdomain - 8080 : available
    [INFO] @localhost.localdomain - 10000 : available
    [INFO] @localhost.localdomain - 8280 : available
    [INFO] @localhost.localdomain - 50020 : available
    [INFO] @localhost.localdomain - 60020 : available
    [INFO] @localhost.localdomain - 60000 : available
    [INFO] @localhost.localdomain - 2181 : available
    [INFO] @localhost.localdomain - 1050 : available
    [INFO] @localhost.localdomain - 8200 : available
    [INFO] @localhost.localdomain - 6882 : available
    [INFO] @localhost.localdomain - 61613 : available
    [INFO] @localhost.localdomain - 9997 : available
    [INFO] @localhost.localdomain - 9998 : available
    [INFO] @localhost.localdomain - 1099 : available
    [INFO] @localhost.localdomain - 2001 : available
    [INFO] @localhost.localdomain - 4201 : available
    [INFO] PriorChecker - Ports check - succeed
    [INFO] @localhost.localdomain - SELINUX - disabled : ok
    [INFO] @localhost.localdomain - OS - Red Hat Enterprise Linux Server release 6.0 (Santiago) Kernel \r on an \m : supported
    [INFO] Progress - 3%
    [INFO] PriorChecker - Server configuration check - succeed
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] @localhost.localdomain - Check minimal disk space requirement for biginsights installation
    [INFO] @localhost.localdomain - Check disk usage of BIGINSIGHTS_HOME : /opt/ibm/biginsights
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] @localhost.localdomain - Free space - 50465865728(B) > biginsights.minimal.install.size + totalfree * 0.1 - 10415295693(B) : ok
    [INFO] PriorChecker - Disk space check - succeed
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] @localhost.localdomain - Check datanode disk space requirement
    [INFO] @localhost.localdomain - Free space - 50465865728(B) > dfs.datanode.du.reserved - 5046586573(B) : ok
    [INFO] PriorChecker - Datanode disk space check - succeed
    [INFO] @localhost.localdomain - Program - scp,zip,bash,tar,ssh,unzip : installed
    [INFO] PriorChecker - Requreid software/libraries Check - succeed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [INFO] PriorChecker - Internal connectivity Check - succeed
    [INFO] Check the current user
    biadmin
    RHEL
    Added existing user biadmin to group biadmin
    [INFO] Running as biadmin, /home/biadmin/__biginsights_install/installer/bin/install.sh
    [INFO] Distribution Vendor : ibm
    [INFO] Extract Java for biadmin...
    [INFO] Check the current user
    biadmin
    [INFO] User login shell : BASH
    [INFO] Using... BIGINSIGHTS_HOME: /opt/ibm/biginsights
    [INFO] Using... BIGINSIGHTS_VAR: /var/ibm/biginsights
    [INFO] Using... SSH CONFIG MODE: by_root_ssh
    [INFO] Using... Biginsights administrator: biadmin
    [INFO] Progress - BigInsights installation response file type: install
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] Progress - Installing HDM
    [INFO] Progress - 3%
    [INFO] Progress - Preparing JDK package
    [INFO] Progress - 4%
    [INFO] JDK at /opt/ibm/biginsights/hdm/jdk
    [INFO] Progress - Preparing Hadoop package
    [INFO] Progress - 6%
    [INFO] Hadoop at /opt/ibm/biginsights/hdm/IHC
    [INFO] Progress - Configuring password-less SSH
    [INFO] Progress - 8%
    [INFO] HadoopMgmtCmdline - Running configAccountAndSsh /home/biadmin/__biginsights_install/installer/bin/../../artifacts
    [INFO] Cluster - Setup biginsights admin user/group, setup passwordless SSH
    [INFO] Cluster - Biadmin configured locally
    RHEL
    Added existing user biadmin to group biadmin
    Generating public/private rsa key pair.
    Your identification has been saved in /home/biadmin/.ssh/id_rsa.
    Your public key has been saved in /home/biadmin/.ssh/id_rsa.pub.
    The key fingerprint is:
    12:14:0a:f0:a5:51:11:3b:e9:b5:ae:e4:c5:c9:9b:55 biadmin@localhost.localdomain
    The key's randomart image is:
    +--[ RSA 2048]----+
    |..o.=oo.         |
    | . = =           |
    |  o = o          |
    |   . o o         |
    |    . o S E      |
    |     + o .       |
    |    . * .        |
    |   o o +         |
    |    o o          |
    +-----------------+
    SSH ID generated at /home/biadmin/.ssh/id_rsa
     
    [INFO] @localhost.localdomain - RHEL
    Added existing user biadmin to group biadmin
    SSH pub key appended to /home/biadmin/.ssh/authorized_keys
    Skip ID file generation as they exist
    [INFO] @localhost.localdomain - clean up /tmp/_root-setup-biadmin-remote.sh /tmp/id_rsa.pub
    [INFO] Progress - 11%
    [INFO] Cluster - Check biginsights admin passwordless SSH setup
    [INFO] @localhost.localdomain - OK, password-less SSH has setup.
    [INFO] Progress - 13%
    [INFO] DupHostDefender - Add other known names to  ~/.ssh/known_hosts.
    [INFO] Progress - 14%
    [INFO] Install as HADOOP_USER biadmin
    [INFO] Progress - Checking directories permission
    [INFO] Progress - 17%
    [INFO] HadoopMgmtCmdline - Running configDirs 
    [INFO] @localhost.localdomain - 
    [INFO] Progress - 18%
    [INFO] HadoopMgmtCmdline - Running check32or64 
    [INFO] Progress - Deploying IBM Hadoop Cluster
    [INFO] Progress - 18%
    [INFO] HadoopMgmtCmdline - Running deployForceAll 
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Reset includes (dfs.hosts)
    [INFO] HadoopConf - Reset includes (mapred.hosts)
    [INFO] HadoopConf - Auto set mapred.fairscheduler.allocation.file=/opt/ibm/biginsights/hadoop-conf/fair-scheduler.xml
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hdm, force
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain, force
    [INFO] @localhost.localdomain - Deploy ... ihc, force
    [INFO] @localhost.localdomain - Deploy ... jdk, force
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc-conf, ihc, jdk]
    [INFO] Progress - 25%
    [INFO] Progress - Clean up possible leftover process
    [INFO] Progress - 25%
    [INFO] HadoopMgmtCmdline - Running cleanupForInstall 
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-cleanupForInstall.sh
    [INFO] Progress - Upgrading number of file handlers
    [INFO] Progress - 26%
    [INFO] HadoopMgmtCmdline - Running syncnofile 16384
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-root-filehandler.sh biadmin 16384
    Set hard file handler num to 16384
    Set soft file handler num to 16384
    [INFO] Progress - 27%
    [INFO] Progress - Synchronizing system time
    [INFO] Progress - 27%
    [INFO] HadoopMgmtCmdline - Running synctime 
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-root-synctime.sh 05/20/2013 08:45:05
    Mon May 20 00:00:00 PDT 2013
    Mon May 20 08:45:05 PDT 2013
    Time updated 05/20/2013 08:45:05
    [INFO] Progress - 29%
    [INFO] Progress - Installing BigInsights applications
    [INFO] Progress - 35%
    [INFO] Progress - Install hdm
    [INFO] Deployer - Copy HDM essentials to other nodes
    [INFO] Deployer - Updating environment variables
    [INFO] Deployer - export linux task controller envrionemnt var is skipped.
    [INFO] Deployer - Deploying shared lib to all nodes
    [INFO] Deployer - Create default mount directory on local file system
    [INFO] Deployer - Installing jaql-db2 integration
    [INFO] Deployer - _otherdm.sh using BIGINSIGHTS_HOME: /opt/ibm/biginsights
    _otherdm.sh using BIGINSIGHTS_VAR: /var/ibm/biginsights
    [INFO] Install JAQL-DB2 integration on the management node
    [INFO] Untar jaql_db2...
    [INFO] deploy jaql_db2 plugin succeed
     
    [INFO] Progress - 39%
    [INFO] Progress - Install zookeeper
    [INFO] @localhost.localdomain - zookeeper configuration synced
    [INFO] @localhost.localdomain - zookeeper installed
    [INFO] @localhost.localdomain - zookeeper started, pid 18429
    [INFO] Deployer - zookeeper service started
    [INFO] Progress - 43%
    [INFO] Progress - Install data-compression
    [INFO] Deployer - Inject data compression jar and natives in IHC/lib
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Deployer - Re-deploy IHC to deploy data compressor
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hdm
    [INFO] @localhost.localdomain - Deploy ... ihc
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc, ihc-conf]
    [INFO] @localhost.localdomain - data-compression installed
    [INFO] Progress - 47%
    [INFO] Progress - Install hadoop
    [INFO] Deployer - Get memory string: MemTotal: 2047732 kB
    [INFO] Deployer - deploy hadoop to cluster
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hdm, force
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain, force
    [INFO] @localhost.localdomain - Deploy ... ihc, force
    [INFO] @localhost.localdomain - Deploy ... jdk, force
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc-conf, ihc, jdk]
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-format.sh
    13/05/20 08:47:37 INFO namenode.NameNode: STARTUP_MSG: 
    /************************************************************
    STARTUP_MSG: Starting NameNode
    STARTUP_MSG:   host = localhost.localdomain/127.0.0.1
    STARTUP_MSG:   args = [-format]
    STARTUP_MSG:   version = 1.0.3
    STARTUP_MSG:   build = git://dasani.svl.ibm.com/ on branch (no branch) -r af8437f228d2c35f7445843bb5994d658c8a5446; compiled by 'jenkins' on Thu Nov 15 01:58:17 PST 2012
    ************************************************************/
    Re-format filesystem in /hadoop/hdfs/name ? (Y or N) 13/05/20 08:47:37 INFO util.GSet: VM type       = 64-bit
    13/05/20 08:47:37 INFO util.GSet: 2% max memory = 20.0 MB
    13/05/20 08:47:37 INFO util.GSet: capacity      = 2^21 = 2097152 entries
    13/05/20 08:47:37 INFO util.GSet: recommended=2097152, actual=2097152
    13/05/20 08:47:38 INFO namenode.FSNamesystem: fsOwner=biadmin
    13/05/20 08:47:38 INFO namenode.FSNamesystem: supergroup=supergroup
    13/05/20 08:47:38 INFO namenode.FSNamesystem: isPermissionEnabled=true
    13/05/20 08:47:38 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
    13/05/20 08:47:38 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
    13/05/20 08:47:38 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = -1
    13/05/20 08:47:38 INFO namenode.NameNode: Caching file names occuring more than 10 times 
    13/05/20 08:47:39 INFO common.Storage: Image file of size 113 saved in 0 seconds.
    13/05/20 08:47:40 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/hadoop/hdfs/name/current/edits
    13/05/20 08:47:40 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/hdfs/name/current/edits
    13/05/20 08:47:40 INFO common.Storage: Storage directory /hadoop/hdfs/name has been successfully formatted.
    13/05/20 08:47:40 INFO namenode.NameNode: SHUTDOWN_MSG: 
    /************************************************************
    SHUTDOWN_MSG: Shutting down NameNode at localhost.localdomain/127.0.0.1
    ************************************************************/
    [INFO] Deployer - Change the permission of hadoop.tmp.dir /var/ibm/biginsights/hadoop/tmp
    [INFO] Deployer - Update hadoop-env.sh
    [INFO] Deployer - Update mapping variables in hadoop configuration based on ibm-hadoop.properties
    [INFO] Deployer - Get memory string: MemTotal: 2047732 kB
    [INFO] Cluster - Number of slave nodes : 1.0
    [INFO] Cluster - mapred.submit.replication : 1
    [INFO] Cluster - Update hadoop daemon heap size
    [INFO] Deployer - Use default task controller
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] @localhost.localdomain - Deploy ... hdm
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc-conf]
    [INFO] @localhost.localdomain - Packages all up-to-date
    [INFO] @localhost.localdomain - namenode started, pid 37267
    [INFO] @localhost.localdomain - secondarynamenode started, pid 37535
    [INFO] @localhost.localdomain - datanode started, pid 37717
    [INFO] Progress - 49%
    [INFO] Deployer - Waiting for Namenode to exit safe mode...
    [INFO] Deployer - HDFS cluster started successfully
    [INFO] @localhost.localdomain - jobtracker started, pid 38088
    [INFO] @localhost.localdomain - tasktracker started, pid 38265
    [INFO] Progress - 51%
    [INFO] Deployer - MapReduce cluster started successfully
    [INFO] @localhost.localdomain - hadoop installed
    [INFO] Progress - Install derby
    [INFO] @localhost.localdomain - derby installed
    [INFO] @localhost.localdomain - derby started, pid 39176
    [INFO] Progress - 55%
    [INFO] Progress - Install jaql
    [INFO] @localhost.localdomain - jaql installed
    [INFO] Progress - 59%
    [INFO] Progress - Install hive
    [INFO] @localhost.localdomain - hive library deployed
    [INFO] @localhost.localdomain - hive installed
    [WARN] Deployer - Failed to create private credstore file hive_keystore_pwd.prop in HDFS
    [INFO] Progress - 63%
    [INFO] Progress - Install pig
    [INFO] @localhost.localdomain - pig installed
    [INFO] Progress - 68%
    [INFO] Progress - Install lucene
    [INFO] @localhost.localdomain - lucene installed
    [INFO] Progress - 72%
    [INFO] Progress - Install hbase
    [INFO] Deployer - deploying library hbase
    [INFO] Deployer - Found hase jar file : /opt/ibm/biginsights/hbase/hbase-0.94.0-security.jar
    [INFO] Deployer - Found zookeeper jar file : /opt/ibm/biginsights/hbase/lib/zookeeper-3.4.3.jar
    [INFO] Deployer - Symlink hbase.jar to overlay or BI jar files @localhost.localdomain
    [INFO] Deployer - Create symlink for lib/zookeeper.jar file to reference overlay or BI jar files @localhost.localdomain
    [INFO] @localhost.localdomain - hbase installed
    [INFO] Deployer - check zookeeper services, make sure zookeeper service it started before start hbase service
    [INFO] @localhost.localdomain - hbase-master(active) started
    [INFO] @localhost.localdomain - hbase-regionserver started
    [INFO] Deployer - hbase service started
    [INFO] Progress - 76%
    [INFO] Progress - Install flume
    [INFO] @localhost.localdomain - flume installed
    [INFO] Progress - 80%
    [INFO] Progress - Install hcatalog
    [INFO] @localhost.localdomain - Deploy ... hcatalog
    [INFO] @localhost.localdomain - hcatalog installed
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/components/hcatalog/todeploy/hcatalog-conf.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hcatalog-conf, force
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hcatalog-conf]
    [INFO] Progress - 84%
    [INFO] Progress - Install sqoop
    [INFO] @localhost.localdomain - Deploy ... sqoop
    [INFO] @localhost.localdomain - sqoop installed
    [INFO] Progress - 88%
    [INFO] Progress - Install oozie
    [INFO] Deployer - deploying library oozie
    [INFO] @localhost.localdomain - oozie installed
    [INFO] Deployer - Update biginsights-oozie.properties succeed.
    [INFO] Progress - 92%
    [INFO] Progress - Install jaqlserver
    [INFO] @localhost.localdomain - jaqlserver installed
    [INFO] Progress - 96%
    [INFO] Progress - Install console
    [INFO] Deployer - Initialize biginsights console properties, saved in /opt/ibm/biginsights/hdm/components/console/biginsights-mc.properties
    [INFO] Progress - 97%
    [INFO] Deployer - Create hadoop proxy users and groups. ( flat-file security )
    [INFO] Deployer - Ignored.
    [INFO] Deployer - Unpacking biginsights console war file.
    [INFO] Progress - 98%
    [INFO] Deployer - Updating /opt/ibm/biginsights/console/consolewartmp/WEB-INF/geronimo-web.xml
    [INFO] Progress - 99%
    [INFO] Deployer - Biginsights Enterprise Edition : [ false ] 
    [INFO] Deployer - Configure HTTPS : [ false ] 
    [INFO] Deployer - Secure type : [  ] 
    [INFO] WasceConfiguration - Configure wasce ports
    [INFO] WasceConfiguration - HTTPPort : [ 8080 ]
    [INFO] WasceConfiguration - HTTPSPort : [ 8443 ]
    [INFO] WasceConfiguration - PortOffset : [ 0 ]
    [INFO] Progress - 100%
    [INFO] Deployer - Deploying  web console into wasce server
    [INFO] Deployer - Updating /opt/ibm/biginsights/console/wasce/var/config/config.xml
    [ERROR] DeployManager - 
    [ERROR] DeployManager - OPERATION ERROR -- Install [hdm, zookeeper, data-compression, hadoop, derby, jaql, hive, pig, lucene, hbase, flume, hcatalog, sqoop, oozie, jaqlserver, console]:
    [ERROR] DeployManager - -------------------------------------------------------
    [INFO] DeployManager - hdm succeeded -- localhost.localdomain=0 (scp -r /opt/ibm/biginsights/hdm/components/shared/shared-lib localhost.localdomain:/opt/ibm/biginsights/lib)
    [INFO] DeployManager - zookeeper succeeded -- localhost.localdomain=0
    [INFO] DeployManager - data-compression succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hadoop succeeded -- localhost.localdomain=0
    [INFO] DeployManager - derby succeeded -- localhost.localdomain=0
    [INFO] DeployManager - jaql succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hive succeeded -- localhost.localdomain=0
    [INFO] DeployManager - pig succeeded -- localhost.localdomain=0
    [INFO] DeployManager - lucene succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hbase succeeded -- localhost.localdomain=0
    [INFO] DeployManager - flume succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hcatalog succeeded -- localhost.localdomain=0
    [INFO] DeployManager - sqoop succeeded -- localhost.localdomain=0
    [INFO] DeployManager - oozie succeeded -- localhost.localdomain=0
    [INFO] DeployManager - jaqlserver succeeded -- localhost.localdomain=0
    [ERROR] DeployManager - console failed
    com.ibm.xap.mgmt.DeployException: console install failed -- localhost.localdomain=-1 (com.ibm.xap.mgmt.util.TimeoutException: Timeout when executing process, timeout = 300000
    at com.ibm.xap.mgmt.util.Code.exec(Code.java:707)
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:84)
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:40)
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer.addNode(MgmtConsoleDeployer.java:472)
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer.access$000(MgmtConsoleDeployer.java:75)
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer$1.doTask(MgmtConsoleDeployer.java:217)
    at com.ibm.xap.mgmt.util.Task.run(Task.java:77)
    at com.ibm.xap.mgmt.util.TaskRunner$1.run(TaskRunner.java:52)
    )
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer.installConfig(MgmtConsoleDeployer.java:236)
    at com.ibm.xap.mgmt.DeployManager$InstallThread.doInstall(DeployManager.java:2380)
    at com.ibm.xap.mgmt.DeployManager$InstallThread.work(DeployManager.java:2415)
    at com.ibm.xap.mgmt.DeployManager$WorkerThread.run(DeployManager.java:2332)
    [ERROR] DeployManager - Install; SUCCEEDED components: [hdm, zookeeper, data-compression, hadoop, derby, jaql, hive, pig, lucene, hbase, flume, hcatalog, sqoop, oozie, jaqlserver]; FAILED components: [console]; Consumes : 750285ms
    Error exit.
    [INFO] Progress - Start services
    [INFO] Progress - 0%
    [INFO] Progress - Start zookeeper
    [INFO] @localhost.localdomain - zookeeper already running, pid 18429
    [INFO] Deployer - zookeeper service started
    [INFO] Progress - 6%
    [INFO] Progress - Start hadoop
    [INFO] @localhost.localdomain - namenode already running, pid 37267
    [INFO] @localhost.localdomain - secondarynamenode already running, pid 37535
    [INFO] @localhost.localdomain - datanode already running, pid 37717
    [INFO] Progress - 9%
    [INFO] Deployer - Waiting for Namenode to exit safe mode...
    [INFO] Deployer - HDFS cluster started successfully
    [INFO] @localhost.localdomain - jobtracker already running, pid 38088
    [INFO] @localhost.localdomain - tasktracker already running, pid 38265
    [INFO] Progress - 13%
    [INFO] Deployer - MapReduce cluster started successfully
    [INFO] Progress - Start derby
    [INFO] @localhost.localdomain - derby already running, pid 39176
    [INFO] Progress - 19%
    [INFO] Progress - Start hive
    [INFO] @localhost.localdomain - derby already running, pid 39176
    [INFO] @localhost.localdomain - hive-web-interface started, pid 51987
    [INFO] @localhost.localdomain - hive-server started, pid 52193
    [INFO] Progress - 25%
    [INFO] Progress - Start hbase
    [INFO] Deployer - check zookeeper services, make sure zookeeper service it started before start hbase service
    [INFO] @localhost.localdomain - hbase-master(active) already running, pid 41935
    [INFO] @localhost.localdomain - hbase-regionserver already running, pid 42028
    [INFO] Deployer - hbase service started
    [INFO] Progress - 31%
    [INFO] Progress - Start flume
    [INFO] @localhost.localdomain - flume-master started, pid 53270
    [INFO] @localhost.localdomain - flume-node started, pid 53562
    [INFO] Progress - 38%
    [INFO] Progress - Start oozie
    [INFO] @localhost.localdomain - oozie started
    [INFO] Progress - 44%
    [INFO] Progress - Start jaqlserver
    [INFO] @localhost.localdomain - jaqlserver started
    [INFO] Progress - 50%
    [INFO] DeployManager - Start; SUCCEEDED components: [zookeeper, hadoop, derby, hive, hbase, flume, oozie, jaqlserver]; Consumes : 126811ms
    [INFO] Progress - Verifying BigInsights installation
    [INFO] Progress - 50%
    [INFO] Progress - Validate hadoop
    [INFO] Deployer - Running Hadoop terasort example
    [INFO] Progress - 75%
    [INFO] Progress - Validate hbase
    [INFO] Deployer - hbase service is healthy
    [INFO] Progress - 100%
    [ERROR] DeployManager - 
    [ERROR] DeployManager - OPERATION ERROR -- Validate [hadoop, hbase]:
    [ERROR] DeployManager - -------------------------------------------------------
    [ERROR] DeployManager - hadoop failed
    java.io.IOException: exit code: 2 -- "/opt/ibm/biginsights/hdm/bin/hdm" "checkdeploy"
    [INFO] Progress - Checking Hadoop cluster started
    [INFO] HadoopMgmtCmdline - Running daemon start
    [INFO] @localhost.localdomain - namenode already running, pid 37267
    [INFO] @localhost.localdomain - secondarynamenode already running, pid 37535
    [INFO] @localhost.localdomain - datanode already running, pid 37717
    [INFO] @localhost.localdomain - jobtracker already running, pid 38088
    [INFO] @localhost.localdomain - tasktracker already running, pid 38265
    [INFO] Progress - Waiting for exit of safe mode
    [INFO] HadoopMgmtCmdline - Running safemode wait
    [INFO] Progress - Running terasort example
    >> /opt/ibm/biginsights/IHC/bin/hadoop dfs -rmr /hdm-tera-input /hdm-tera-output /hdm-tera-report
    rmr: cannot remove /hdm-tera-input: No such file or directory.
    rmr: cannot remove /hdm-tera-output: No such file or directory.
    rmr: cannot remove /hdm-tera-report: No such file or directory.
    >> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.0.3.jar teragen -Dmapred.map.tasks=1 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m 10000 /hdm-tera-input
    Generating 10000 using 1 maps with step of 10000
    13/05/20 09:03:11 INFO mapred.JobClient: Running job: job_201305200848_0001
    13/05/20 09:03:12 INFO mapred.JobClient:  map 0% reduce 0%
    13/05/20 09:03:46 INFO mapred.JobClient:  map 100% reduce 0%
    13/05/20 09:04:04 INFO mapred.JobClient: Job complete: job_201305200848_0001
    13/05/20 09:04:05 INFO mapred.JobClient: Counters: 19
    13/05/20 09:04:05 INFO mapred.JobClient:   Job Counters 
    13/05/20 09:04:05 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=33902
    13/05/20 09:04:05 INFO mapred.JobClient:     Launched map tasks=1
    13/05/20 09:04:05 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
    13/05/20 09:04:05 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
    13/05/20 09:04:05 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
    13/05/20 09:04:05 INFO mapred.JobClient:   FileSystemCounters
    13/05/20 09:04:05 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=22406
    13/05/20 09:04:05 INFO mapred.JobClient:     HDFS_BYTES_READ=81
    13/05/20 09:04:05 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=1000000
    13/05/20 09:04:05 INFO mapred.JobClient:   File Output Format Counters 
    13/05/20 09:04:05 INFO mapred.JobClient:     Bytes Written=1000000
    13/05/20 09:04:05 INFO mapred.JobClient:   Map-Reduce Framework
    13/05/20 09:04:05 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=1823858688
    13/05/20 09:04:05 INFO mapred.JobClient:     Map input bytes=10000
    13/05/20 09:04:05 INFO mapred.JobClient:     Physical memory (bytes) snapshot=48128000
    13/05/20 09:04:05 INFO mapred.JobClient:     Map output records=10000
    13/05/20 09:04:05 INFO mapred.JobClient:     CPU time spent (ms)=1420
    13/05/20 09:04:05 INFO mapred.JobClient:     Map input records=10000
    13/05/20 09:04:05 INFO mapred.JobClient:     Total committed heap usage (bytes)=5905408
    13/05/20 09:04:05 INFO mapred.JobClient:     Spilled Records=0
    13/05/20 09:04:05 INFO mapred.JobClient:     SPLIT_RAW_BYTES=81
    13/05/20 09:04:05 INFO mapred.JobClient:   File Input Format Counters 
    13/05/20 09:04:05 INFO mapred.JobClient:     Bytes Read=0
    >> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.0.3.jar terasort -Dmapred.reduce.tasks=0 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m -Dio.sort.record.percent=0.17 /hdm-tera-input /hdm-tera-output
    13/05/20 09:04:16 INFO terasort.TeraSort: starting
    13/05/20 09:04:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    13/05/20 09:04:18 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    13/05/20 09:04:18 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    13/05/20 09:04:18 INFO compress.CodecPool: Got brand-new compressor
    Making 0 from 10000 records
    Step size is Infinity
    java.lang.NegativeArraySizeException
    at org.apache.hadoop.examples.terasort.TeraInputFormat$TextSampler.createPartitions(TeraInputFormat.java:92)
    at org.apache.hadoop.examples.terasort.TeraInputFormat.writePartitionFile(TeraInputFormat.java:141)
    at org.apache.hadoop.examples.terasort.TeraSort.run(TeraSort.java:243)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.terasort.TeraSort.main(TeraSort.java:257)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
    >> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.0.3.jar teravalidate -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m /hdm-tera-output /hdm-tera-report
    13/05/20 09:04:22 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost.localdomain:9000/user/biadmin/.staging/job_201305200848_0002
    13/05/20 09:04:22 ERROR security.UserGroupInformation: PriviledgedActionException as:biadmin cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost.localdomain:9000/hdm-tera-output
    org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost.localdomain:9000/hdm-tera-output
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
    at org.apache.hadoop.examples.terasort.TeraInputFormat.getSplits(TeraInputFormat.java:209)
    at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
    at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(AccessController.java:310)
    at javax.security.auth.Subject.doAs(Subject.java:573)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1144)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
    at org.apache.hadoop.examples.terasort.TeraValidate.run(TeraValidate.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.terasort.TeraValidate.main(TeraValidate.java:153)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
    >> /opt/ibm/biginsights/IHC/bin/hadoop dfs -ls /hdm-tera-report
    ls: Cannot access /hdm-tera-report: No such file or directory.
    [INFO] =============== Summary of Hadoop Installation ===============
    [INFO] TeraSort ..................................Failed
     
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:96)
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:40)
    at com.ibm.xap.mgmt.hdm.HadoopDeployer.healthCheck(HadoopDeployer.java:679)
    at com.ibm.xap.mgmt.DeployManager$HealthCheckThread.work(DeployManager.java:2568)
    at com.ibm.xap.mgmt.DeployManager$WorkerThread.run(DeployManager.java:2332)
    [INFO] DeployManager - hbase succeeded -- 
    [ERROR] DeployManager - Validate; SUCCEEDED components: [hbase]; FAILED components: [hadoop]; Consumes : 186621ms
    Error exit.
    [FATAL] Failed to install BigInsights component(s)
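
Not an official fix, but the stack trace above points at the partition math: terasort was launched with `-Dmapred.reduce.tasks=0`, and TeraSort's sampler sizes its partition-key array from the reducer count. A minimal sketch of that arithmetic (my reconstruction from the log lines, not the actual TeraInputFormat code):

```shell
# Sketch of the failing arithmetic, reconstructed from the log lines
# "Making 0 from 10000 records" / "Step size is Infinity" and the
# NegativeArraySizeException in TextSampler.createPartitions.

num_partitions=0                      # one partition per reducer; -Dmapred.reduce.tasks=0
cut_points=$((num_partitions - 1))    # size of the sampler's partition-key array
echo "partition array size = $cut_points"   # prints -1

# 10000 sampled records / 0 partitions is Infinity in Java float division
# (hence "Step size is Infinity"), and `new Text[-1]` then throws
# NegativeArraySizeException -- exactly the crash above.
```

So re-running the printed terasort command with `-Dmapred.reduce.tasks=1` (or any value >= 1) should get past this step; the teravalidate failure that follows is just fallout from `/hdm-tera-output` never being written.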

     

    Hope you can still help me.
    Thanks in advance.
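
One more note on the console failure earlier in the same log: the DeployException wraps a TimeoutException with `timeout = 300000`, i.e. the installer only waits five minutes for the web-console deploy, which a slow single-node VM can easily exceed. A small sanity check of that number (the file paths in the comments are taken verbatim from the log above; whether re-running install.sh retries just the console component is an assumption on my part):

```shell
#!/bin/sh
# The timeout in the stack trace is in milliseconds:
timeout_ms=300000
echo "console deploy timeout: $((timeout_ms / 60000)) minute(s)"   # prints: console deploy timeout: 5 minute(s)

# If the console deploy is retried, these locations from the log above are
# where I would look first (paths copied from the output, not guessed):
#   /opt/ibm/biginsights/console/wasce/var/config/config.xml
#   /opt/ibm/biginsights/hdm/components/console/biginsights-mc.properties
```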

  • YangWeiWei
    YangWeiWei
    72 Posts

    Re: Problem installing BigInsight basic edition

    ‏2013-05-21T10:31:11Z  
    • MMakati
    • ‏2013-05-21T09:16:09Z

    Thanks! It works, but there is another problem.


    This is the log file:

    [INFO] Launching installer back end
    [INFO] Running as root, /home/gir/biginsights-basic-linux64_b20121203_1915/installer/bin/install.sh simple-fullinstall.xml
    [INFO] Distribution Vendor : ibm
    [INFO] Progress - Initializing install properties
    [INFO] Progress - 0%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] MgmtConfInitializer - Biginsights nodes [localhost.localdomain]
    [INFO] MgmtConfInitializer - install mode : install
    [INFO] MgmtConfInitializer - distro vendor : ibm
    [INFO] MgmtConfInitializer - dfs.name.dir=/hadoop/hdfs/name
    [INFO] MgmtConfInitializer - fs.checkpoint.dir=/hadoop/hdfs/namesecondary
    [INFO] MgmtConfInitializer - default dfs.data.dir=/hadoop/hdfs/data
    [INFO] MgmtConfInitializer - mapred.system.dir=/hadoop/mapred/system
    [INFO] MgmtConfInitializer - mapred.local.dir=/hadoop/mapred/local
    [INFO] MgmtConfInitializer - hadoop.log.dir=/var/ibm/biginsights/hadoop/logs
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.hosts=*
    [INFO] MgmtConfInitializer - hadoop.proxyuser.biadmin.groups=*
    [INFO] MgmtConfInitializer - datanode is not set
    [INFO] MgmtConfInitializer - localhost.localdomain is NameNode
    [INFO] MgmtConfInitializer - MgmtConfInitializer: localhost.localdomain is Secondary NameNode
    [INFO] MgmtConfInitializer - localhost.localdomain is JobTracker
    [INFO] MgmtConfInitializer - localhost.localdomain is DataNode
    [INFO] MgmtConfInitializer - biginsights.home=/opt/ibm/biginsights
    [INFO] MgmtConfInitializer - biginsights.var=/var/ibm/biginsights
    [INFO] MgmtConfInitializer - mgmt.ssh.config=by_root_ssh
    [INFO] MgmtConfInitializer - mgmt.user=biadmin
    [INFO] MgmtConfInitializer - mgmt.group=biadmin
    [INFO] MgmtConfInitializer - biginsights.virtualnodes=null
    [INFO] HadoopConf - Hadoop conf saved to /home/gir/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] Progress - Check cluster environment
    [INFO] Progress - 2%
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] Deployer - scan all datanodes ... caculate free space of all attached disks
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] Deployer - (Avg dfs.data.dir capacity) * 0.1 = 5046586573
    [INFO] Deployer - reset dfs.datanode.du.reserved=5046586573
    [INFO] HadoopConf - Hadoop conf saved to /home/gir/biginsights-basic-linux64_b20121203_1915/installer/hdm/hadoop-conf-staging
    [INFO] @localhost.localdomain - Check directories succeed.
    [INFO] Progress - 2%
    [INFO] PriorChecker - Directories Check - succeed
    [INFO] @localhost.localdomain - localhost.localdomain->127.0.0.1 : valid
    [INFO] PriorChecker - Hostname/ip check - succeed
    [INFO] @localhost.localdomain - 61616 : available
    [INFO] @localhost.localdomain - 9999 : available
    [INFO] @localhost.localdomain - 8009 : available
    [INFO] @localhost.localdomain - 50090 : available
    [INFO] @localhost.localdomain - 50070 : available
    [INFO] @localhost.localdomain - 54198 : available
    [INFO] @localhost.localdomain - 50030 : available
    [INFO] @localhost.localdomain - 60030 : available
    [INFO] @localhost.localdomain - 50075 : available
    [INFO] @localhost.localdomain - 50010 : available
    [INFO] @localhost.localdomain - 60010 : available
    [INFO] @localhost.localdomain - 9000 : available
    [INFO] @localhost.localdomain - 9001 : available
    [INFO] @localhost.localdomain - 10080 : available
    [INFO] @localhost.localdomain - 1527 : available
    [INFO] @localhost.localdomain - 1528 : available
    [INFO] @localhost.localdomain - 8080 : available
    [INFO] @localhost.localdomain - 10000 : available
    [INFO] @localhost.localdomain - 8280 : available
    [INFO] @localhost.localdomain - 50020 : available
    [INFO] @localhost.localdomain - 60020 : available
    [INFO] @localhost.localdomain - 60000 : available
    [INFO] @localhost.localdomain - 2181 : available
    [INFO] @localhost.localdomain - 1050 : available
    [INFO] @localhost.localdomain - 8200 : available
    [INFO] @localhost.localdomain - 6882 : available
    [INFO] @localhost.localdomain - 61613 : available
    [INFO] @localhost.localdomain - 9997 : available
    [INFO] @localhost.localdomain - 9998 : available
    [INFO] @localhost.localdomain - 1099 : available
    [INFO] @localhost.localdomain - 2001 : available
    [INFO] @localhost.localdomain - 4201 : available
    [INFO] PriorChecker - Ports check - succeed
    [INFO] @localhost.localdomain - SELINUX - disabled : ok
    [INFO] @localhost.localdomain - OS - Red Hat Enterprise Linux Server release 6.0 (Santiago) Kernel \r on an \m : supported
    [INFO] Progress - 3%
    [INFO] PriorChecker - Server configuration check - succeed
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] @localhost.localdomain - Check minimal disk space requirement for biginsights installation
    [INFO] @localhost.localdomain - Check disk usage of BIGINSIGHTS_HOME : /opt/ibm/biginsights
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] @localhost.localdomain - Free space - 50465865728(B) > biginsights.minimal.install.size + totalfree * 0.1 - 10415295693(B) : ok
    [INFO] PriorChecker - Disk space check - succeed
    [INFO] @localhost.localdomain - Check directory - /hadoop/hdfs/data
    [INFO] @localhost.localdomain - Check disk - /dev/sda2
    [INFO] @localhost.localdomain - Free space of /dev/sda2 is 47G
    [INFO] @localhost.localdomain - Check datanode disk space requirement
    [INFO] @localhost.localdomain - Free space - 50465865728(B) > dfs.datanode.du.reserved - 5046586573(B) : ok
    [INFO] PriorChecker - Datanode disk space check - succeed
    [INFO] @localhost.localdomain - Program - scp,zip,bash,tar,ssh,unzip : installed
    [INFO] PriorChecker - Requreid software/libraries Check - succeed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [WARN] @localhost.localdomain - skip localhost.localdomain because netcat/nc is not installed
    [INFO] PriorChecker - Internal connectivity Check - succeed
    [INFO] Check the current user
    biadmin
    RHEL
    Added existing user biadmin to group biadmin
    [INFO] Running as biadmin, /home/biadmin/__biginsights_install/installer/bin/install.sh
    [INFO] Distribution Vendor : ibm
    [INFO] Extract Java for biadmin...
    [INFO] Check the current user
    biadmin
    [INFO] User login shell : BASH
    [INFO] Using... BIGINSIGHTS_HOME: /opt/ibm/biginsights
    [INFO] Using... BIGINSIGHTS_VAR: /var/ibm/biginsights
    [INFO] Using... SSH CONFIG MODE: by_root_ssh
    [INFO] Using... Biginsights administrator: biadmin
    [INFO] Progress - BigInsights installation response file type: install
    [INFO] HadoopConf - Hadoop Configuration class is not on classpath
    [INFO] Progress - Installing HDM
    [INFO] Progress - 3%
    [INFO] Progress - Preparing JDK package
    [INFO] Progress - 4%
    [INFO] JDK at /opt/ibm/biginsights/hdm/jdk
    [INFO] Progress - Preparing Hadoop package
    [INFO] Progress - 6%
    [INFO] Hadoop at /opt/ibm/biginsights/hdm/IHC
    [INFO] Progress - Configuring password-less SSH
    [INFO] Progress - 8%
    [INFO] HadoopMgmtCmdline - Running configAccountAndSsh /home/biadmin/__biginsights_install/installer/bin/../../artifacts
    [INFO] Cluster - Setup biginsights admin user/group, setup passwordless SSH
    [INFO] Cluster - Biadmin configured locally
    RHEL
    Added existing user biadmin to group biadmin
    Generating public/private rsa key pair.
    Your identification has been saved in /home/biadmin/.ssh/id_rsa.
    Your public key has been saved in /home/biadmin/.ssh/id_rsa.pub.
    The key fingerprint is:
    12:14:0a:f0:a5:51:11:3b:e9:b5:ae:e4:c5:c9:9b:55 biadmin@localhost.localdomain
    The key's randomart image is:
    +--[ RSA 2048]----+
    |..o.=oo.         |
    | . = =           |
    |  o = o          |
    |   . o o         |
    |    . o S E      |
    |     + o .       |
    |    . * .        |
    |   o o +         |
    |    o o          |
    +-----------------+
    SSH ID generated at /home/biadmin/.ssh/id_rsa
     
    [INFO] @localhost.localdomain - RHEL
    Added existing user biadmin to group biadmin
    SSH pub key appended to /home/biadmin/.ssh/authorized_keys
    Skip ID file generation as they exist
    [INFO] @localhost.localdomain - clean up /tmp/_root-setup-biadmin-remote.sh /tmp/id_rsa.pub
    [INFO] Progress - 11%
    [INFO] Cluster - Check biginsights admin passwordless SSH setup
    [INFO] @localhost.localdomain - OK, password-less SSH has setup.
    [INFO] Progress - 13%
    [INFO] DupHostDefender - Add other known names to  ~/.ssh/known_hosts.
    [INFO] Progress - 14%
    [INFO] Install as HADOOP_USER biadmin
    [INFO] Progress - Checking directories permission
    [INFO] Progress - 17%
    [INFO] HadoopMgmtCmdline - Running configDirs 
    [INFO] @localhost.localdomain - 
    [INFO] Progress - 18%
    [INFO] HadoopMgmtCmdline - Running check32or64 
    [INFO] Progress - Deploying IBM Hadoop Cluster
    [INFO] Progress - 18%
    [INFO] HadoopMgmtCmdline - Running deployForceAll 
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Reset includes (dfs.hosts)
    [INFO] HadoopConf - Reset includes (mapred.hosts)
    [INFO] HadoopConf - Auto set mapred.fairscheduler.allocation.file=/opt/ibm/biginsights/hadoop-conf/fair-scheduler.xml
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hdm, force
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain, force
    [INFO] @localhost.localdomain - Deploy ... ihc, force
    [INFO] @localhost.localdomain - Deploy ... jdk, force
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc-conf, ihc, jdk]
    [INFO] Progress - 25%
    [INFO] Progress - Clean up possible leftover process
    [INFO] Progress - 25%
    [INFO] HadoopMgmtCmdline - Running cleanupForInstall 
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-cleanupForInstall.sh
    [INFO] Progress - Upgrading number of file handlers
    [INFO] Progress - 26%
    [INFO] HadoopMgmtCmdline - Running syncnofile 16384
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-root-filehandler.sh biadmin 16384
    Set hard file handler num to 16384
    Set soft file handler num to 16384
    [INFO] Progress - 27%
    [INFO] Progress - Synchronizing system time
    [INFO] Progress - 27%
    [INFO] HadoopMgmtCmdline - Running synctime 
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-root-synctime.sh 05/20/2013 08:45:05
    Mon May 20 00:00:00 PDT 2013
    Mon May 20 08:45:05 PDT 2013
    Time updated 05/20/2013 08:45:05
    [INFO] Progress - 29%
    [INFO] Progress - Installing BigInsights applications
    [INFO] Progress - 35%
    [INFO] Progress - Install hdm
    [INFO] Deployer - Copy HDM essentials to other nodes
    [INFO] Deployer - Updating environment variables
    [INFO] Deployer - export linux task controller envrionemnt var is skipped.
    [INFO] Deployer - Deploying shared lib to all nodes
    [INFO] Deployer - Create default mount directory on local file system
    [INFO] Deployer - Installing jaql-db2 integration
    [INFO] Deployer - _otherdm.sh using BIGINSIGHTS_HOME: /opt/ibm/biginsights
    _otherdm.sh using BIGINSIGHTS_VAR: /var/ibm/biginsights
    [INFO] Install JAQL-DB2 integration on the management node
    [INFO] Untar jaql_db2...
    [INFO] deploy jaql_db2 plugin succeed
     
    [INFO] Progress - 39%
    [INFO] Progress - Install zookeeper
    [INFO] @localhost.localdomain - zookeeper configuration synced
    [INFO] @localhost.localdomain - zookeeper installed
    [INFO] @localhost.localdomain - zookeeper started, pid 18429
    [INFO] Deployer - zookeeper service started
    [INFO] Progress - 43%
    [INFO] Progress - Install data-compression
    [INFO] Deployer - Inject data compression jar and natives in IHC/lib
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Deployer - Re-deploy IHC to deploy data compressor
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hdm
    [INFO] @localhost.localdomain - Deploy ... ihc
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc, ihc-conf]
    [INFO] @localhost.localdomain - data-compression installed
    [INFO] Progress - 47%
    [INFO] Progress - Install hadoop
    [INFO] Deployer - Get memory string: MemTotal: 2047732 kB
    [INFO] Deployer - deploy hadoop to cluster
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hdm, force
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain, force
    [INFO] @localhost.localdomain - Deploy ... ihc, force
    [INFO] @localhost.localdomain - Deploy ... jdk, force
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc-conf, ihc, jdk]
    [INFO] @localhost.localdomain - /opt/ibm/biginsights/.hdm-stub/bin/managed-format.sh
    13/05/20 08:47:37 INFO namenode.NameNode: STARTUP_MSG: 
    /************************************************************
    STARTUP_MSG: Starting NameNode
    STARTUP_MSG:   host = localhost.localdomain/127.0.0.1
    STARTUP_MSG:   args = [-format]
    STARTUP_MSG:   version = 1.0.3
    STARTUP_MSG:   build = git://dasani.svl.ibm.com/ on branch (no branch) -r af8437f228d2c35f7445843bb5994d658c8a5446; compiled by 'jenkins' on Thu Nov 15 01:58:17 PST 2012
    ************************************************************/
    Re-format filesystem in /hadoop/hdfs/name ? (Y or N) 13/05/20 08:47:37 INFO util.GSet: VM type       = 64-bit
    13/05/20 08:47:37 INFO util.GSet: 2% max memory = 20.0 MB
    13/05/20 08:47:37 INFO util.GSet: capacity      = 2^21 = 2097152 entries
    13/05/20 08:47:37 INFO util.GSet: recommended=2097152, actual=2097152
    13/05/20 08:47:38 INFO namenode.FSNamesystem: fsOwner=biadmin
    13/05/20 08:47:38 INFO namenode.FSNamesystem: supergroup=supergroup
    13/05/20 08:47:38 INFO namenode.FSNamesystem: isPermissionEnabled=true
    13/05/20 08:47:38 INFO namenode.FSNamesystem: dfs.block.invalidate.limit=100
    13/05/20 08:47:38 INFO namenode.FSNamesystem: isAccessTokenEnabled=false accessKeyUpdateInterval=0 min(s), accessTokenLifetime=0 min(s)
    13/05/20 08:47:38 INFO namenode.FSEditLog: dfs.namenode.edits.toleration.length = -1
    13/05/20 08:47:38 INFO namenode.NameNode: Caching file names occuring more than 10 times 
    13/05/20 08:47:39 INFO common.Storage: Image file of size 113 saved in 0 seconds.
    13/05/20 08:47:40 INFO namenode.FSEditLog: closing edit log: position=4, editlog=/hadoop/hdfs/name/current/edits
    13/05/20 08:47:40 INFO namenode.FSEditLog: close success: truncate to 4, editlog=/hadoop/hdfs/name/current/edits
    13/05/20 08:47:40 INFO common.Storage: Storage directory /hadoop/hdfs/name has been successfully formatted.
    13/05/20 08:47:40 INFO namenode.NameNode: SHUTDOWN_MSG: 
    /************************************************************
    SHUTDOWN_MSG: Shutting down NameNode at localhost.localdomain/127.0.0.1
    ************************************************************/
    [INFO] Deployer - Change the permission of hadoop.tmp.dir /var/ibm/biginsights/hadoop/tmp
    [INFO] Deployer - Update hadoop-env.sh
    [INFO] Deployer - Update mapping variables in hadoop configuration based on ibm-hadoop.properties
    [INFO] Deployer - Get memory string: MemTotal: 2047732 kB
    [INFO] Cluster - Number of slave nodes : 1.0
    [INFO] Cluster - mapred.submit.replication : 1
    [INFO] Cluster - Update hadoop daemon heap size
    [INFO] Deployer - Use default task controller
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/hdm.tar.gz
    [INFO] HadoopConf - Hadoop conf saved to /opt/ibm/biginsights/hdm/hadoop-conf-staging
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/todeploy/ihc-conf.tar.gz@localhost.localdomain
    [INFO] @localhost.localdomain - Deploy ... hdm
    [INFO] @localhost.localdomain - Deploy ... ihc-conf@localhost.localdomain
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hdm, ihc-conf]
    [INFO] @localhost.localdomain - Packages all up-to-date
    [INFO] @localhost.localdomain - namenode started, pid 37267
    [INFO] @localhost.localdomain - secondarynamenode started, pid 37535
    [INFO] @localhost.localdomain - datanode started, pid 37717
    [INFO] Progress - 49%
    [INFO] Deployer - Waiting for Namenode to exit safe mode...
    [INFO] Deployer - HDFS cluster started successfully
    [INFO] @localhost.localdomain - jobtracker started, pid 38088
    [INFO] @localhost.localdomain - tasktracker started, pid 38265
    [INFO] Progress - 51%
    [INFO] Deployer - MapReduce cluster started successfully
    [INFO] @localhost.localdomain - hadoop installed
    [INFO] Progress - Install derby
    [INFO] @localhost.localdomain - derby installed
    [INFO] @localhost.localdomain - derby started, pid 39176
    [INFO] Progress - 55%
    [INFO] Progress - Install jaql
    [INFO] @localhost.localdomain - jaql installed
    [INFO] Progress - 59%
    [INFO] Progress - Install hive
    [INFO] @localhost.localdomain - hive library deployed
    [INFO] @localhost.localdomain - hive installed
    [WARN] Deployer - Failed to create private credstore file hive_keystore_pwd.prop in HDFS
    [INFO] Progress - 63%
    [INFO] Progress - Install pig
    [INFO] @localhost.localdomain - pig installed
    [INFO] Progress - 68%
    [INFO] Progress - Install lucene
    [INFO] @localhost.localdomain - lucene installed
    [INFO] Progress - 72%
    [INFO] Progress - Install hbase
    [INFO] Deployer - deploying library hbase
    [INFO] Deployer - Found hase jar file : /opt/ibm/biginsights/hbase/hbase-0.94.0-security.jar
    [INFO] Deployer - Found zookeeper jar file : /opt/ibm/biginsights/hbase/lib/zookeeper-3.4.3.jar
    [INFO] Deployer - Symlink hbase.jar to overlay or BI jar files @localhost.localdomain
    [INFO] Deployer - Create symlink for lib/zookeeper.jar file to reference overlay or BI jar files @localhost.localdomain
    [INFO] @localhost.localdomain - hbase installed
    [INFO] Deployer - check zookeeper services, make sure zookeeper service it started before start hbase service
    [INFO] @localhost.localdomain - hbase-master(active) started
    [INFO] @localhost.localdomain - hbase-regionserver started
    [INFO] Deployer - hbase service started
    [INFO] Progress - 76%
    [INFO] Progress - Install flume
    [INFO] @localhost.localdomain - flume installed
    [INFO] Progress - 80%
    [INFO] Progress - Install hcatalog
    [INFO] @localhost.localdomain - Deploy ... hcatalog
    [INFO] @localhost.localdomain - hcatalog installed
    [INFO] Cluster - Repack ... /opt/ibm/biginsights/hdm/components/hcatalog/todeploy/hcatalog-conf.tar.gz
    [INFO] @localhost.localdomain - Deploy ... hcatalog-conf, force
    [INFO] @localhost.localdomain - Packages all up-to-date, after deploy [hcatalog-conf]
    [INFO] Progress - 84%
    [INFO] Progress - Install sqoop
    [INFO] @localhost.localdomain - Deploy ... sqoop
    [INFO] @localhost.localdomain - sqoop installed
    [INFO] Progress - 88%
    [INFO] Progress - Install oozie
    [INFO] Deployer - deploying library oozie
    [INFO] @localhost.localdomain - oozie installed
    [INFO] Deployer - Update biginsights-oozie.properties succeed.
    [INFO] Progress - 92%
    [INFO] Progress - Install jaqlserver
    [INFO] @localhost.localdomain - jaqlserver installed
    [INFO] Progress - 96%
    [INFO] Progress - Install console
    [INFO] Deployer - Initialize biginsights console properties, saved in /opt/ibm/biginsights/hdm/components/console/biginsights-mc.properties
    [INFO] Progress - 97%
    [INFO] Deployer - Create hadoop proxy users and groups. ( flat-file security )
    [INFO] Deployer - Ignored.
    [INFO] Deployer - Unpacking biginsights console war file.
    [INFO] Progress - 98%
    [INFO] Deployer - Updating /opt/ibm/biginsights/console/consolewartmp/WEB-INF/geronimo-web.xml
    [INFO] Progress - 99%
    [INFO] Deployer - Biginsights Enterprise Edition : [ false ] 
    [INFO] Deployer - Configure HTTPS : [ false ] 
    [INFO] Deployer - Secure type : [  ] 
    [INFO] WasceConfiguration - Configure wasce ports
    [INFO] WasceConfiguration - HTTPPort : [ 8080 ]
    [INFO] WasceConfiguration - HTTPSPort : [ 8443 ]
    [INFO] WasceConfiguration - PortOffset : [ 0 ]
    [INFO] Progress - 100%
    [INFO] Deployer - Deploying  web console into wasce server
    [INFO] Deployer - Updating /opt/ibm/biginsights/console/wasce/var/config/config.xml
    [ERROR] DeployManager - 
    [ERROR] DeployManager - OPERATION ERROR -- Install [hdm, zookeeper, data-compression, hadoop, derby, jaql, hive, pig, lucene, hbase, flume, hcatalog, sqoop, oozie, jaqlserver, console]:
    [ERROR] DeployManager - -------------------------------------------------------
    [INFO] DeployManager - hdm succeeded -- localhost.localdomain=0 (scp -r /opt/ibm/biginsights/hdm/components/shared/shared-lib localhost.localdomain:/opt/ibm/biginsights/lib)
    [INFO] DeployManager - zookeeper succeeded -- localhost.localdomain=0
    [INFO] DeployManager - data-compression succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hadoop succeeded -- localhost.localdomain=0
    [INFO] DeployManager - derby succeeded -- localhost.localdomain=0
    [INFO] DeployManager - jaql succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hive succeeded -- localhost.localdomain=0
    [INFO] DeployManager - pig succeeded -- localhost.localdomain=0
    [INFO] DeployManager - lucene succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hbase succeeded -- localhost.localdomain=0
    [INFO] DeployManager - flume succeeded -- localhost.localdomain=0
    [INFO] DeployManager - hcatalog succeeded -- localhost.localdomain=0
    [INFO] DeployManager - sqoop succeeded -- localhost.localdomain=0
    [INFO] DeployManager - oozie succeeded -- localhost.localdomain=0
    [INFO] DeployManager - jaqlserver succeeded -- localhost.localdomain=0
    [ERROR] DeployManager - console failed
    com.ibm.xap.mgmt.DeployException: console install failed -- localhost.localdomain=-1 (com.ibm.xap.mgmt.util.TimeoutException: Timeout when executing process, timeout = 300000
    at com.ibm.xap.mgmt.util.Code.exec(Code.java:707)
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:84)
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:40)
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer.addNode(MgmtConsoleDeployer.java:472)
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer.access$000(MgmtConsoleDeployer.java:75)
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer$1.doTask(MgmtConsoleDeployer.java:217)
    at com.ibm.xap.mgmt.util.Task.run(Task.java:77)
    at com.ibm.xap.mgmt.util.TaskRunner$1.run(TaskRunner.java:52)
    )
    at com.ibm.xap.mgmt.console.MgmtConsoleDeployer.installConfig(MgmtConsoleDeployer.java:236)
    at com.ibm.xap.mgmt.DeployManager$InstallThread.doInstall(DeployManager.java:2380)
    at com.ibm.xap.mgmt.DeployManager$InstallThread.work(DeployManager.java:2415)
    at com.ibm.xap.mgmt.DeployManager$WorkerThread.run(DeployManager.java:2332)
    [ERROR] DeployManager - Install; SUCCEEDED components: [hdm, zookeeper, data-compression, hadoop, derby, jaql, hive, pig, lucene, hbase, flume, hcatalog, sqoop, oozie, jaqlserver]; FAILED components: [console]; Consumes : 750285ms
    Error exit.
    [INFO] Progress - Start services
    [INFO] Progress - 0%
    [INFO] Progress - Start zookeeper
    [INFO] @localhost.localdomain - zookeeper already running, pid 18429
    [INFO] Deployer - zookeeper service started
    [INFO] Progress - 6%
    [INFO] Progress - Start hadoop
    [INFO] @localhost.localdomain - namenode already running, pid 37267
    [INFO] @localhost.localdomain - secondarynamenode already running, pid 37535
    [INFO] @localhost.localdomain - datanode already running, pid 37717
    [INFO] Progress - 9%
    [INFO] Deployer - Waiting for Namenode to exit safe mode...
    [INFO] Deployer - HDFS cluster started successfully
    [INFO] @localhost.localdomain - jobtracker already running, pid 38088
    [INFO] @localhost.localdomain - tasktracker already running, pid 38265
    [INFO] Progress - 13%
    [INFO] Deployer - MapReduce cluster started successfully
    [INFO] Progress - Start derby
    [INFO] @localhost.localdomain - derby already running, pid 39176
    [INFO] Progress - 19%
    [INFO] Progress - Start hive
    [INFO] @localhost.localdomain - derby already running, pid 39176
    [INFO] @localhost.localdomain - hive-web-interface started, pid 51987
    [INFO] @localhost.localdomain - hive-server started, pid 52193
    [INFO] Progress - 25%
    [INFO] Progress - Start hbase
    [INFO] Deployer - check zookeeper services, make sure zookeeper service it started before start hbase service
    [INFO] @localhost.localdomain - hbase-master(active) already running, pid 41935
    [INFO] @localhost.localdomain - hbase-regionserver already running, pid 42028
    [INFO] Deployer - hbase service started
    [INFO] Progress - 31%
    [INFO] Progress - Start flume
    [INFO] @localhost.localdomain - flume-master started, pid 53270
    [INFO] @localhost.localdomain - flume-node started, pid 53562
    [INFO] Progress - 38%
    [INFO] Progress - Start oozie
    [INFO] @localhost.localdomain - oozie started
    [INFO] Progress - 44%
    [INFO] Progress - Start jaqlserver
    [INFO] @localhost.localdomain - jaqlserver started
    [INFO] Progress - 50%
    [INFO] DeployManager - Start; SUCCEEDED components: [zookeeper, hadoop, derby, hive, hbase, flume, oozie, jaqlserver]; Consumes : 126811ms
    [INFO] Progress - Verifying BigInsights installation
    [INFO] Progress - 50%
    [INFO] Progress - Validate hadoop
    [INFO] Deployer - Running Hadoop terasort example
    [INFO] Progress - 75%
    [INFO] Progress - Validate hbase
    [INFO] Deployer - hbase service is healthy
    [INFO] Progress - 100%
    [ERROR] DeployManager - 
    [ERROR] DeployManager - OPERATION ERROR -- Validate [hadoop, hbase]:
    [ERROR] DeployManager - -------------------------------------------------------
    [ERROR] DeployManager - hadoop failed
    java.io.IOException: exit code: 2 -- "/opt/ibm/biginsights/hdm/bin/hdm" "checkdeploy"
    [INFO] Progress - Checking Hadoop cluster started
    [INFO] HadoopMgmtCmdline - Running daemon start
    [INFO] @localhost.localdomain - namenode already running, pid 37267
    [INFO] @localhost.localdomain - secondarynamenode already running, pid 37535
    [INFO] @localhost.localdomain - datanode already running, pid 37717
    [INFO] @localhost.localdomain - jobtracker already running, pid 38088
    [INFO] @localhost.localdomain - tasktracker already running, pid 38265
    [INFO] Progress - Waiting for exit of safe mode
    [INFO] HadoopMgmtCmdline - Running safemode wait
    [INFO] Progress - Running terasort example
    >> /opt/ibm/biginsights/IHC/bin/hadoop dfs -rmr /hdm-tera-input /hdm-tera-output /hdm-tera-report
    rmr: cannot remove /hdm-tera-input: No such file or directory.
    rmr: cannot remove /hdm-tera-output: No such file or directory.
    rmr: cannot remove /hdm-tera-report: No such file or directory.
    >> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.0.3.jar teragen -Dmapred.map.tasks=1 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m 10000 /hdm-tera-input
    Generating 10000 using 1 maps with step of 10000
    13/05/20 09:03:11 INFO mapred.JobClient: Running job: job_201305200848_0001
    13/05/20 09:03:12 INFO mapred.JobClient:  map 0% reduce 0%
    13/05/20 09:03:46 INFO mapred.JobClient:  map 100% reduce 0%
    13/05/20 09:04:04 INFO mapred.JobClient: Job complete: job_201305200848_0001
    13/05/20 09:04:05 INFO mapred.JobClient: Counters: 19
    13/05/20 09:04:05 INFO mapred.JobClient:   Job Counters 
    13/05/20 09:04:05 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=33902
    13/05/20 09:04:05 INFO mapred.JobClient:     Launched map tasks=1
    13/05/20 09:04:05 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
    13/05/20 09:04:05 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
    13/05/20 09:04:05 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
    13/05/20 09:04:05 INFO mapred.JobClient:   FileSystemCounters
    13/05/20 09:04:05 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=22406
    13/05/20 09:04:05 INFO mapred.JobClient:     HDFS_BYTES_READ=81
    13/05/20 09:04:05 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=1000000
    13/05/20 09:04:05 INFO mapred.JobClient:   File Output Format Counters 
    13/05/20 09:04:05 INFO mapred.JobClient:     Bytes Written=1000000
    13/05/20 09:04:05 INFO mapred.JobClient:   Map-Reduce Framework
    13/05/20 09:04:05 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=1823858688
    13/05/20 09:04:05 INFO mapred.JobClient:     Map input bytes=10000
    13/05/20 09:04:05 INFO mapred.JobClient:     Physical memory (bytes) snapshot=48128000
    13/05/20 09:04:05 INFO mapred.JobClient:     Map output records=10000
    13/05/20 09:04:05 INFO mapred.JobClient:     CPU time spent (ms)=1420
    13/05/20 09:04:05 INFO mapred.JobClient:     Map input records=10000
    13/05/20 09:04:05 INFO mapred.JobClient:     Total committed heap usage (bytes)=5905408
    13/05/20 09:04:05 INFO mapred.JobClient:     Spilled Records=0
    13/05/20 09:04:05 INFO mapred.JobClient:     SPLIT_RAW_BYTES=81
    13/05/20 09:04:05 INFO mapred.JobClient:   File Input Format Counters 
    13/05/20 09:04:05 INFO mapred.JobClient:     Bytes Read=0
    >> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.0.3.jar terasort -Dmapred.reduce.tasks=0 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m -Dio.sort.record.percent=0.17 /hdm-tera-input /hdm-tera-output
    13/05/20 09:04:16 INFO terasort.TeraSort: starting
    13/05/20 09:04:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    13/05/20 09:04:18 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    13/05/20 09:04:18 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    13/05/20 09:04:18 INFO compress.CodecPool: Got brand-new compressor
    Making 0 from 10000 records
    Step size is Infinity
    java.lang.NegativeArraySizeException
    at org.apache.hadoop.examples.terasort.TeraInputFormat$TextSampler.createPartitions(TeraInputFormat.java:92)
    at org.apache.hadoop.examples.terasort.TeraInputFormat.writePartitionFile(TeraInputFormat.java:141)
    at org.apache.hadoop.examples.terasort.TeraSort.run(TeraSort.java:243)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.terasort.TeraSort.main(TeraSort.java:257)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
    >> /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.0.3.jar teravalidate -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m /hdm-tera-output /hdm-tera-report
    13/05/20 09:04:22 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost.localdomain:9000/user/biadmin/.staging/job_201305200848_0002
    13/05/20 09:04:22 ERROR security.UserGroupInformation: PriviledgedActionException as:biadmin cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost.localdomain:9000/hdm-tera-output
    org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://localhost.localdomain:9000/hdm-tera-output
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
    at org.apache.hadoop.examples.terasort.TeraInputFormat.getSplits(TeraInputFormat.java:209)
    at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
    at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(AccessController.java:310)
    at javax.security.auth.Subject.doAs(Subject.java:573)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1144)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1261)
    at org.apache.hadoop.examples.terasort.TeraValidate.run(TeraValidate.java:145)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.examples.terasort.TeraValidate.main(TeraValidate.java:153)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
    at java.lang.reflect.Method.invoke(Method.java:611)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
    >> /opt/ibm/biginsights/IHC/bin/hadoop dfs -ls /hdm-tera-report
    ls: Cannot access /hdm-tera-report: No such file or directory.
    [INFO] =============== Summary of Hadoop Installation ===============
    [INFO] TeraSort ..................................Failed
     
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:96)
    at com.ibm.xap.mgmt.util.ExecUtil.exec(ExecUtil.java:40)
    at com.ibm.xap.mgmt.hdm.HadoopDeployer.healthCheck(HadoopDeployer.java:679)
    at com.ibm.xap.mgmt.DeployManager$HealthCheckThread.work(DeployManager.java:2568)
    at com.ibm.xap.mgmt.DeployManager$WorkerThread.run(DeployManager.java:2332)
    [INFO] DeployManager - hbase succeeded -- 
    [ERROR] DeployManager - Validate; SUCCEEDED components: [hbase]; FAILED components: [hadoop]; Consumes : 186621ms
    Error exit.
    [FATAL] Failed to install BigInsights component(s)

     

    Hope you can still help me. 
    Thanks in advance

How much memory does this machine have? Could it be running out of memory, which would prevent WASCE from starting?

The second issue, with the Hadoop verification step, looks like a bug:

     

    /opt/ibm/biginsights/IHC/bin/hadoop jar /opt/ibm/biginsights/IHC/hadoop-examples-1.0.3.jar terasort -Dmapred.reduce.tasks=0 -Ddfs.block.size=536870912 -Dmapred.map.child.java.opts=-Xmx1024m -Dmapred.reduce.child.java.opts=-Xmx2048m -Dio.sort.record.percent=0.17 /hdm-tera-input /hdm-tera-output

Looks like the number of reduce tasks is wrongly set to zero. The formula used to calculate the number of reduce tasks is:

    average number of cores * number of tasktrackers * 0.5 * 0.9

If you run on a single server with a single core, that equals 1 * 1 * 0.5 * 0.9 = 0.45, which truncates to 0, so I would say it is a bug. Anyway, you can ignore this error and keep using Hadoop; it should be functional. If you really want the check to pass, you can edit $BIGINSIGHTS_HOME/hdm/bin/hdm-terasort.sh and set

    numReduces=1

    before it is first referenced.
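The truncation-to-zero problem can be sketched in a few lines of shell. This is an illustrative snippet, not the actual code from hdm-terasort.sh; the variable names `cores` and `tasktrackers` are assumptions for the example.

```shell
# Illustrative sketch of how the reduce-task count can round down to zero
# on a small cluster (variable names are hypothetical, not from the script).
cores=1          # average cores per tasktracker
tasktrackers=1   # number of tasktracker nodes

# 1 * 1 * 0.5 * 0.9 = 0.45; printf "%d" truncates it to 0.
numReduces=$(awk -v c="$cores" -v t="$tasktrackers" \
    'BEGIN { printf "%d", c * t * 0.5 * 0.9 }')

# Guard so terasort always gets at least one reducer.
if [ "$numReduces" -lt 1 ]; then
  numReduces=1
fi

echo "numReduces=$numReduces"
```

On a single-core, single-node install the computed value is 0 and the guard bumps it to 1, which is exactly the effect of hard-coding numReduces=1 in the script.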

  • ZhangXu
    ZhangXu
    1 Post

    Re: Problem installing BigInsight basic edition

    ‏2013-05-21T11:04:34Z  
I'm trying RHEL6 and faced the same problem:

    Extracting Java ....
    Java extraction complete, using JAVA_HOME=/root/biginsights-basic-linux64_b20121203_1915/_jvm/
    Verifying port 8300 availability
    port 8300 available
    Starting BigInsights Installer ......tail: cannot open `/root/biginsights-basic-linux64_b20121203_1915/installer-console/var/log/server.out' for reading: No such file or directory

Please advise.

I have the same problem. Did you resolve this problem? Please send the answer to my email: neworld_zx@163.com; neworld.zx@gmail.com

Thanks

  • 6V3M_sab_chandra
    6V3M_sab_chandra
    1 Post

    Re: Problem installing BigInsight basic edition

    ‏2014-01-03T22:51:00Z  
I'm trying RHEL6 and faced the same problem:

    Extracting Java ....
    Java extraction complete, using JAVA_HOME=/root/biginsights-basic-linux64_b20121203_1915/_jvm/
    Verifying port 8300 availability
    port 8300 available
    Starting BigInsights Installer ......tail: cannot open `/root/biginsights-basic-linux64_b20121203_1915/installer-console/var/log/server.out' for reading: No such file or directory

Please advise.

Hey,

I am also facing the same issue.

I am trying to install BigInsights on a CentOS (64-bit) virtual box.

After I execute ./start.sh, all I get is this below:

    [root@localhost biginsights]# ./start.sh
    artifacts/ibm-java-sdk-6.0-12.0-linux-ppc64.tgz artifacts/ibm-java-sdk-6.0-12.0-linux-x86_64.tgz
    Extracting Java ....
    /home/centos/Public/biginsights
    Java extraction complete, using JAVA_HOME=/home/centos/Public/biginsights/_jvm/
    Verifying port 8300 availability
    port 8300 available
    Starting BigInsights Installer appserver is  WASCE
    ......tail: cannot open `/home/centos/Public/biginsights/installer-console/var/log/server.out' for reading: No such file or directory
    .tail: cannot open `/home/centos/Public/biginsights/installer-console/var/log/server.out' for reading: No such file or directory
    .tail: cannot open `/home/centos/Public/biginsights/installer-console/var/log/server.out' for reading: No such file or directory
    .tail: cannot open `/home/centos/Public/biginsights/installer-console/var/log/server.out' for reading: No such file or directory
     

     

    shabbu