14 replies - Latest post 2013-02-14T16:46:13Z by armink
KJB123 (26 Posts)

Pinned topic Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

2010-04-15T18:45:44Z
I just upgraded our nmon-to-web server from AIX 5.3 to AIX 6.1. Most other lpars are still running AIX 5.3.
Ever since the upgrade, the nmon2web.pl script has been failing with an "Out of Memory!" error, followed by a "Segmentation fault(coredump)". The core dump is in the config directory of the html_out for the offending lpar. Data for most lpars seems to process fine. But there are a few that consistently receive this error.
Any ideas on what is causing this and how to correct it?
Thank you.
Updated on 2013-02-14T16:46:13Z by armink
  • KJB123 (26 Posts)

    Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

    2010-04-20T15:35:15Z  in response to KJB123
    Through trial and error, I've found that only 2 of the 40+ lpars cause this Out of Memory/Segmentation fault. If I remove those two lpars from the list, nmon2web.pl seems to work fine: it processes the nmon data collected from all the other lpars and creates the html_out data so it can be viewed in my browser. But I still don't know what it is about those 2 particular lpars that causes the nmon2web.pl script to fail (I don't know perl, so that doesn't help matters).
    Anyone have any ideas?
    • BruceSpencer (297 Posts)

      Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

      2010-04-20T16:16:31Z  in response to KJB123
      I haven't seen this problem before. Post one of the nmon.csv files that is causing the problem, and I'll take a look at it.
      • KJB123 (26 Posts)

        Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

        2010-04-27T12:22:04Z  in response to BruceSpencer
        Thanks for the reply, Bruce.
        Turns out I needed to set the LDR_CNTRL environment variable before calling the nmon-to-web script, so that it can use the "large address-space model" and grow beyond 256 MB. Evidently the AIX 6.1 upgrade pushed the memory footprint of nmon-to-web processing over the 256 MB default for two of our lpars.

        http://publib.boulder.ibm.com/infocenter/pseries/v5r3/index.jsp?topic=/com.ibm.aix.genprogc/doc/genprogc/lrg_prg_support.htm
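
        For anyone hitting the same thing, something along these lines works (ksh syntax; the script path is just an example, point it at wherever your copy of nmon2web.pl lives):

        # Let the 32-bit perl process use the large address-space model:
        # MAXDATA=0x80000000 allows up to 8 x 256 MB data segments.
        export LDR_CNTRL=MAXDATA=0x80000000
        perl /path/to/nmon2web.pl
        # Unset it afterwards so other 32-bit programs started from this
        # shell are not affected.
        unset LDR_CNTRL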
        • BruceSpencer (297 Posts)

          Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

          2010-04-27T15:42:48Z  in response to KJB123
          Glad you were able to fix the problem.

          I'm curious why only 2 of the partitions had this problem. What is the sample rate? And how many months did you specify for the rrdtool databases?
  • MumtazShaikh (2 Posts)

    Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

    2010-11-26T06:27:37Z  in response to KJB123
    Hi,

    When I run nmon2web.pl, it processes one file and then fails with the error:

    Out of memory!

    Please help.

    I have to process the nmon files for 150 servers, covering about a month and a half, within 2 days.
    • BruceSpencer (297 Posts)

      Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

      2010-11-26T23:44:41Z  in response to MumtazShaikh
      Attach the file that is causing the problem, and I'll try to duplicate it on my server.

      Are you trying to run all 150 files at the same time? If so, can you try running a few of them one at a time? Any improvement?
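
      A minimal sketch of processing them one at a time, assuming one .nmon file per server under a single directory and that the script reads an nmon file on stdin (the directory, script path, and log file here are only examples; adjust them to how your copy of nmon2web.pl is driven):

      #!/bin/ksh
      # Feed the collected nmon files to nmon2web.pl one at a time
      # instead of all 150 in a single run.
      for f in /nmondata/incoming/*.nmon
      do
          echo "Processing $f"
          perl /nmondata/tools/nmon2web.pl < "$f" >> /tmp/nmon2web_batch.log 2>&1
      done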
      • SystemAdmin (2402 Posts)

        Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

        2012-12-30T07:36:37Z  in response to BruceSpencer
        I have the same problem, but only for 2 servers. I have even tried exporting the LDR_CNTRL parameter and running:

        export LDR_CNTRL=MAXDATA=0x80000000 perl;/nmondata/tools/nmon2web.pl & >> /tmp/nmon2web_tmp.log

        However, after removing those 2 servers it runs fine. Not sure what the issue is.
        • armink (30 Posts)

          Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

          2013-02-04T10:16:34Z  in response to SystemAdmin
          Does your command line really look like this? Remove the ";" after perl and try again!

          Regarding this particular perl problem: I raised a PMR for it, and IBM refused to accept and solve it.
          The reason: perl is only supported "as-is"... and we're not big enough to make IBM change its mind.

          To make this even worse: the problem is also present in AIX 7.1 with perl 5.10.1.

          Feel free to ask IBM about this too... maybe they'll fix it if enough feedback comes in.
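
          For reference, the corrected invocation would look something like this (same paths and log file as in the post above; note that the output redirection also has to come before the "&", otherwise it applies to nothing):

          export LDR_CNTRL=MAXDATA=0x80000000
          perl /nmondata/tools/nmon2web.pl >> /tmp/nmon2web_tmp.log 2>&1 &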
        • SystemAdmin (2402 Posts)

          Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

          2013-02-05T08:29:32Z  in response to SystemAdmin
          Have you tried this GNU perl build?

          http://www.perzl.org/aix/index.php?n=Main.Perl
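
          If you do try it: the perzl.org RPMs normally install under /opt/freeware, so you can point the script at that interpreter explicitly without replacing the system perl (the install prefix and script path below are the usual defaults, not guaranteed for every setup):

          /opt/freeware/bin/perl -v
          /opt/freeware/bin/perl /nmondata/tools/nmon2web.pl >> /tmp/nmon2web_tmp.log 2>&1 &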
          • armink (30 Posts)

            Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

            2013-02-05T10:49:18Z  in response to SystemAdmin
            Nope, but thanks for pointing this out! I only download packages from perzl.org when really necessary. There is always an avalanche of package dependencies... and sometimes they even break installed software (like the openssl package).
            If there were no other way, I'd download perl from there... but before that I'd rather look into compiling it myself :-)
            • SystemAdmin (2402 Posts)

              Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

              2013-02-05T20:19:55Z  in response to armink
              What about trying it out in a WPAR?
              • armink (30 Posts)

                Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

                2013-02-07T14:43:57Z  in response to SystemAdmin
                I even have the luxury of a test LPAR... but that doesn't solve the problem.
                I want IBM to provide a working perl installation instead of having to deal with the dependencies and issues of unsupported 3rd-party packages.
                • SystemAdmin (2402 Posts)

                  Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

                  2013-02-07T20:52:20Z  in response to armink
                  LOL. Not too much to ask... what about asking nicely? ;)

                  Dear Support,
                  our community of heavy AIX users has a problem with Perl: it no longer
                  works as it used to, but core dumps due to memory issues. AME...
                  etc., etc.
                  • armink (30 Posts)

                    Re: Out of Memory! Segmentation fault(coredump) from nmon2web.pl after AIX6.1

                    2013-02-14T16:46:13Z  in response to SystemAdmin
                    I don't think it makes much sense for me to keep asking IBM.
                    I opened a PMR, asked nicely, and the PMR was closed.

                    So, dear AIX community reader, if you read this and have an AIX support contract,
                    please open a PMR asking for a fix for the perl memory problem described here!
                    Maybe if one of the big AIX customers complains, we will all benefit.

                    Thank you in advance! :-)