Topic
  • 1 reply
  • Latest Post - 2013-10-02T00:01:22Z by DonaldN
MatL
14 Posts

Pinned topic CQDesign "Out of memory"

2013-10-01T18:21:33Z

Has anyone seen CQDesign crash with an "Out of Memory" dialog?

This is happening regularly on two different systems, typically when selecting the "Properties" context menu for a control.

Once this error occurs, the control loses its reference to its field, and subsequent changes are impossible. Most recently I had to kill the cqdesign process because of repeated errors when trying to exit CQDesign: "CMMD1006E This operation cannot proceed until the metadata has been loaded."

Our schema is a bit complex, so I would appreciate knowing whether there is a timeout value I can change to avoid crashing CQDesign while it is parsing/loading the schema.

I don't see any logs in our Rational installation folder to help with this.

Environment:

CQ 7.1.2.6

Test server - Windows 7 Enterprise, 64-bit

Windows Server 2008 R2, 64-bit

Updated on 2013-10-01T19:10:56Z at 2013-10-01T19:10:56Z by MatL
  • DonaldN
    279 Posts

    Re: CQDesign "Out of memory"

    2013-10-02T00:01:22Z

    The ClearQuest Designer is based on Eclipse and uses much more memory than the old-school native Windows variant. If you consistently get OutOfMemory errors with it, switch to the old one (cqdesign.exe) for temporary relief.

    There are other things you can try to keep memory consumption under control.

    1. Load only one schema version at a time.

    2. Do as few tasks as possible in each session, and restart ClearQuest Designer frequently.

    3. Try increasing the JVM heap size for the ClearQuest Designer. This requires editing a configuration file in the ClearQuest Designer installation directory. I would not recommend it unless you really know what you're doing.
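    Since the Eclipse-based Designer launches like any Eclipse application, the heap settings typically live in the launcher's .ini file next to the executable. As a sketch only (the exact filename and path in a ClearQuest installation may differ, and the values shown are illustrative assumptions, not settings confirmed in this thread), the relevant entries would look like:

    ```ini
    ; Launcher arguments before -vmargs configure Eclipse itself;
    ; everything after -vmargs is passed to the JVM.
    -vmargs
    ; Initial heap size (assumed value)
    -Xms128m
    ; Maximum heap size (assumed value); keep this well under 2 GB,
    ; since a 32-bit process on Windows cannot address more than that
    -Xmx1024m
    ```

    Note that raising -Xmx too far in a 32-bit process can make the JVM fail to start at all, so increase it in modest steps.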

    Keep in mind that the ClearQuest Designer is a 32-bit application, with an absolute limit of 2 GB of addressable memory on Windows. If the schema is too complicated, the application may simply be unable to cope with it. You can open a ticket with IBM Support to check whether that is the case. Sometimes quirks in the schema make memory consumption go wild; you never know until you debug it.