This is document 1 of a 3-part series: Diagnosing JVM Memory Problems with the Maximo or TPAE Application.

The application fails or slows down due to a lack of memory and an increase in JVM garbage collection activity.
Lack of memory is generally associated with Java coding and a failure to release objects.
All JVM releases
Resolving The Problem
Understanding Java Memory Management
Java memory is primarily controlled by two parameters passed to the JVM when it is started.
Xms is the amount of contiguous memory immediately allocated to the JVM when it is started. If this value is 512m, the JVM will start up and immediately allocate 512 megabytes of memory, whether it needs it or not.
Xmx is the amount of memory the JVM is allowed to grow to. If this value is 1024m, the maximum amount of memory made available to the application running inside the JVM is 1 gigabyte (1024 megabytes).
In the example of minimum memory set to 512 MB and maximum memory set to 1 GB, Java code runs in the originally allocated memory until it nears the top of that space, at which time the JVM allocates more until the maximum has been reached. Once Java has allocated memory to the JVM, it is generally not returned to the operating system even if the application's requirement shrinks (some JVM releases do now return this memory). Instead, the JVM continues to run the application within the new upper limit until a further allocation is required.
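As a minimal illustration (not part of any Maximo configuration), the effect of these two settings can be observed from inside a running JVM through the standard `java.lang.Runtime` API: `totalMemory()` reports the heap currently allocated (which starts near the Xms value) and `maxMemory()` reports the Xmx ceiling the heap may grow to.

```java
public class HeapLimits {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();

        // totalMemory(): heap currently allocated to the JVM (starts near -Xms)
        // maxMemory():   upper limit the heap is allowed to grow to (-Xmx)
        long allocated = rt.totalMemory();
        long maximum = rt.maxMemory();

        System.out.println("Allocated heap: " + (allocated / (1024 * 1024)) + " MB");
        System.out.println("Maximum heap:   " + (maximum / (1024 * 1024)) + " MB");
    }
}
```

Running this with, for example, `java -Xms512m -Xmx1024m HeapLimits` shows the allocated figure near 512 MB and the maximum near 1024 MB.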
Many more parameters can be passed to control debug logging, activity frequency, and other functions, but these two basic parameters (Xms and Xmx) determine the memory space available to the application.
A Java application is designed to use variables and objects (memory) and then release them when the process at hand is complete. Even though the application releases these objects, the JVM does not regain the memory until a “Garbage Collection” process takes place. Java runs garbage collection automatically on a timed or memory-requirement basis. A typical application shows memory availability shrinking over a period of time and then becoming available again when the garbage collection process runs. The memory graph below shows memory usage growing over time as a process uses objects, and then a major garbage collection process executing that frees all unused objects.
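This release-then-collect behavior can be sketched with a small, self-contained example (an illustration only, using a `WeakReference` to observe reachability): while the application still holds a strong reference to an object, no garbage collection can reclaim it; once the reference is dropped, the memory becomes reclaimable, but only when a collection cycle actually runs.

```java
import java.lang.ref.WeakReference;

public class GcDemo {
    public static void main(String[] args) {
        Object work = new byte[1024 * 1024];           // object in use by the application
        WeakReference<Object> tracker = new WeakReference<>(work);

        // While the strong reference exists, garbage collection cannot reclaim it.
        System.gc();
        System.out.println("Still reachable: " + (tracker.get() != null));

        work = null;                                    // the application releases the object
        // The memory is not returned instantly; it becomes reclaimable
        // only when a garbage collection cycle actually runs.
        System.gc();
        System.out.println("Reclaimed after GC: " + (tracker.get() == null));
    }
}
```

Note that `System.gc()` is only a hint to the JVM; in a real application, collection is triggered on the JVM's own schedule, which is why freed memory appears to "come back" in bursts on a memory graph.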
The garbage collection process is very processor intensive. When monitoring processor utilization while a Java program is executing, it is typical to see relatively high spikes when the process runs; however, these spikes are very short in duration, most typically taking less than a second. Users do not generally notice these spikes because they occur on the server; users work on a client computer, and the spikes are so short in duration that they are undetectable.
Each service in Maximo uses memory in the form of variables and objects. The most notable of these is the Maximo Business Object (MBO). These MBOs are loaded into sets, or groups of related objects. Each process that runs may create one or more sets of MBOs (MboSet).
When a Java program does not properly release MBOs, the number of MBOs, and perhaps the number of MboSets, begins to grow in memory. Since the application has not officially released the objects, a garbage collection process cannot free the memory.
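The retention pattern can be sketched generically (this is plain Java for illustration, not actual Maximo MBO code; the class and method names are hypothetical): objects added to a long-lived collection remain strongly reachable, so garbage collection cannot reclaim them no matter how often it runs, and the footprint grows with every call.

```java
import java.util.ArrayList;
import java.util.List;

public class RetentionDemo {
    // A long-lived collection, analogous to a service holding on to MboSets.
    // Anything added here stays strongly reachable, so GC cannot reclaim it.
    static final List<byte[]> retained = new ArrayList<>();

    static void leakyProcess() {
        retained.add(new byte[64 * 1024]);  // object is never released
    }

    static void wellBehavedProcess() {
        byte[] scratch = new byte[64 * 1024];
        // ... use scratch ...
        // scratch goes out of scope here, so a future GC can reclaim it
    }

    public static void main(String[] args) {
        for (int i = 0; i < 100; i++) {
            leakyProcess();        // retained.size() grows without bound
            wellBehavedProcess();  // leaves nothing behind
        }
        System.out.println("Objects still held: " + retained.size());
        retained.clear();          // the fix: explicitly release the references
        System.out.println("After release: " + retained.size());
    }
}
```

In Maximo terms, the fix corresponds to code explicitly closing or releasing its MboSets when a process completes, so the objects become eligible for collection.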
As the memory in use approaches the maximum available memory, the garbage collection process runs more often, and when usage reaches the ceiling, garbage collection runs perpetually. Ultimately, all available memory is in use and the processor is 100% utilized. As a JVM approaches this condition, users experience progressively worse performance until the application stops responding or the JVM finally shuts down abnormally.
17 June 2018