A fix is available
APAR status
Closed as program error.
Error description
When trying to stop WebSphere Application Server gracefully, an application server trace will show various components being quiesced, like this:

[01.10.14 11:19:25:746 MESZ] 00000051 MDBListenerMa > quiesce (com.ibm.ejs.jms.listener.MDBListenerManagerImpl) [5093acc2] Entry

However, the clean shutdown does not progress and eventually the application server terminates. Although there is no indication of OutOfMemory errors occurring in the FFDC before the application server terminates, there are instances of them occurring in the trace during the quiesce. A typical thread stack looks as follows:

========
Caused by: com.ibm.msg.client.commonservices.CSIException: JMSCS0006: An internal problem occurred. Diagnostic information for service was written to 'null'. Please terminate the application as the product is in an inconsistent internal state.
  at com.ibm.msg.client.commonservices.trace.Trace.ffst(Trace.java:1540)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1207)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)
  at com.ibm.msg.client.commonservices.trace.Trace.finallyBlockInternal(Trace.java:1392)
  at com.ibm.msg.client.commonservices.trace.Trace.finallyBlock(Trace.java:1333)
  at com.ibm.msg.client.wmq.common.internal.WMQTraceHandler.finallyBlock(WMQTraceHandler.java:754)
  at com.ibm.mq.jmqi.remote.internal.system.RemoteConnection.sendTSH(RemoteConnection.java:2883)
  at com.ibm.mq.jmqi.remote.internal.system.RemoteConnection.sendHeartbeat(RemoteConnection.java:3766)
  at com.ibm.mq.jmqi.remote.internal.RemoteTCPConnection.receive(RemoteTCPConnection.java:1554)
  at com.ibm.mq.jmqi.remote.internal.RemoteRcvThread.receiveBuffer(RemoteRcvThread.java:804)
  at com.ibm.mq.jmqi.remote.internal.RemoteRcvThread.receiveOneTSH(RemoteRcvThread.java:768)
  at com.ibm.mq.jmqi.remote.internal.RemoteRcvThread.run(RemoteRcvThread.java:158)
  ... 5 more
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
========

The above stack contains the following pair of entries multiple times:

  at com.ibm.msg.client.commonservices.trace.Trace.catchBlockInternal(Trace.java:1202)
  at com.ibm.msg.client.commonservices.trace.Trace.catchBlock(Trace.java:1161)

This indicates that the WebSphere MQ Resource Adapter has gone into a loop, which eventually causes the application server process to run out of memory.
Local fix
Problem summary
****************************************************************
USERS AFFECTED:
This issue affects two categories of users:

1) Users of:
- The WebSphere MQ V7.0.1 classes for JMS.
- The WebSphere MQ V7.0.1 OSGi bundles.
- The WebSphere MQ V7.1 classes for JMS.
- The WebSphere MQ V7.1 OSGi bundles.
- The WebSphere MQ V7.5 classes for JMS.
- The WebSphere MQ V7.5 OSGi bundles.
- The WebSphere MQ V8 classes for Java.
- The WebSphere MQ V8 classes for JMS.
- The WebSphere MQ V8 OSGi bundles.
who enable WebSphere MQ classes for JMS tracing.

2) Users of:
- The WebSphere Application Server V7 WebSphere MQ messaging provider.
- The WebSphere Application Server V8.0 WebSphere MQ messaging provider.
- The WebSphere Application Server V8.5 WebSphere MQ messaging provider.
who enable WebSphere Application Server trace using a trace string that contains the following entries:

JMSApi=all:JMSServer=all:Messaging=all:JMS_WASTraceAdapter=all:com.ibm.mq.*=all:jmsApi=all

Platforms affected: MultiPlatform
****************************************************************
PROBLEM DESCRIPTION:
If the internal trace routines used by the WebSphere MQ classes for JMS and the WebSphere MQ V8 classes for Java encountered a java.lang.Throwable error while trying to write some trace data, they would:

- Attempt to write out a trace entry containing information about the java.lang.Throwable.
- Generate an FDC.

However, if the java.lang.Throwable was generated for a serious error (such as an OutOfMemoryError), then the attempt to write out the trace entry containing information about the java.lang.Throwable would also fail with the same error. This caused the WebSphere MQ classes for JMS or classes for Java to try to write out a new trace entry for that failure, which would fail in the same way and trigger yet another trace entry, and so on.
This behaviour caused the WebSphere MQ classes for JMS and the WebSphere MQ V8 classes for Java to enter an infinite loop, eventually causing the Java Runtime Environment that they were running in to fail with the error:

java.lang.OutOfMemoryError: GC overhead limit exceeded
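The loop described above can be illustrated with a minimal sketch. This is a hypothetical model, not the real com.ibm.msg.client.commonservices.trace.Trace code: the names ToyTrace, writeEntry and catchBlock are illustrative only, and an artificial cap is added so the demonstration terminates instead of exhausting memory.

```java
// Hypothetical model of the trace-recursion loop described above.
// ToyTrace, writeEntry and catchBlock are illustrative names only,
// not the real WebSphere MQ trace API.
public class ToyTrace {
    static int attempts = 0;
    static final int GUARD = 50; // artificial cap so the demo terminates;
                                 // the real loop had no such cap

    // Pretend that every attempt to write a trace entry fails with a
    // serious error, as an OutOfMemoryError would during the quiesce.
    static void writeEntry(String msg) {
        attempts++;
        throw new RuntimeException("simulated OutOfMemoryError");
    }

    // The flawed pattern: the error handler itself tries to trace the
    // error, which fails again and re-enters the handler indefinitely.
    static void catchBlock(Throwable t) {
        if (attempts >= GUARD) return; // demo-only escape hatch
        try {
            writeEntry("caught: " + t);
        } catch (Throwable again) {
            catchBlock(again);
        }
    }

    public static void main(String[] args) {
        try {
            writeEntry("normal trace point");
        } catch (Throwable t) {
            catchBlock(t);
        }
        System.out.println(attempts); // prints 50: only the artificial
                                      // cap stopped the re-entry
    }
}
```

Without the cap, each failed write allocates new exception and trace-entry objects, which is why the process eventually fails with "GC overhead limit exceeded".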
Problem conclusion
The internal trace routines used by the WebSphere MQ classes for JMS and the WebSphere MQ V8 classes for Java have been updated so that, if a java.lang.Throwable occurs while writing out trace data, they no longer write out a new trace entry containing details of the error.
---------------------------------------------------------------
The fix is targeted for delivery in the following PTFs:

Version    Maintenance Level
v7.0       7.0.1.13
v7.1       7.1.0.7
v7.5       7.5.0.5
v8.0       8.0.0.2

The latest available maintenance can be obtained from 'WebSphere MQ Recommended Fixes':
http://www-1.ibm.com/support/docview.wss?rs=171&uid=swg27006037

If the maintenance level is not yet available, information on its planned availability can be found in 'WebSphere MQ Planned Maintenance Release Dates':
http://www-1.ibm.com/support/docview.wss?rs=171&uid=swg27006309
---------------------------------------------------------------
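One common way to implement this kind of change is a re-entrancy guard: while the trace code is already handling an error, further trace writes are suppressed rather than attempted. The following is a sketch of that pattern only; GuardedTrace and its members are hypothetical names, and the APAR text states only that no new trace entry is written, not how the fix is coded.

```java
// Sketch of a re-entrancy guard that breaks the trace loop.
// GuardedTrace, writeEntry and catchBlock are hypothetical names,
// not the actual fixed WebSphere MQ code.
public class GuardedTrace {
    static int attempts = 0;
    // Guard flag; a real multithreaded tracer would use a ThreadLocal,
    // kept as a plain static field here for brevity.
    static boolean handlingError = false;

    // Pretend that every attempt to write a trace entry fails with a
    // serious error such as an OutOfMemoryError.
    static void writeEntry(String msg) {
        attempts++;
        throw new RuntimeException("simulated OutOfMemoryError");
    }

    static void catchBlock(Throwable t) {
        if (handlingError) return; // already handling an error: do not trace again
        handlingError = true;
        try {
            writeEntry("caught: " + t);
        } catch (Throwable again) {
            // Deliberately do nothing: writing another trace entry here
            // is exactly what caused the original infinite loop.
        } finally {
            handlingError = false;
        }
    }

    public static void main(String[] args) {
        try {
            writeEntry("normal trace point");
        } catch (Throwable t) {
            catchBlock(t);
        }
        System.out.println(attempts); // prints 2: the original attempt
                                      // plus exactly one handler attempt
    }
}
```

With the guard in place, a failure while writing trace data is handled at most once, so the recursion (and the unbounded object allocation that came with it) cannot occur.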
Temporary fix
Comments
APAR Information
APAR number
IV65990
Reported component name
WMQ SOL SPARC
Reported component ID
5724H7223
Reported release
701
Status
CLOSED PER
PE
NoPE
HIPER
NoHIPER
Special Attention
NoSpecatt
Submitted date
2014-10-20
Closed date
2014-11-28
Last modified date
2014-11-28
APAR is sysrouted FROM one or more of the following:
APAR is sysrouted TO one or more of the following:
Fix information
Fixed component name
WMQ SOL SPARC
Fixed component ID
5724H7223
Applicable component levels
R701 PSY
UP
Document Information
Modified date:
01 October 2021