I have a stage in DataStage in which I am creating a JVM and using some Java methods.
When I have a small number of records, the osh script runs perfectly. But when the number of records increases, I get: Operator terminated abnormally: Terminating with exception: APT_BadAlloc: Heap allocation failed.
So I increased the heap size of the JVM at creation time, but the error persists. Could anyone help me out?
Thanks in advance.
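[Editor's note: the thread does not show the poster's code, so the following is only a general debugging sketch. One quick sanity check is to confirm, from inside the Java code, that the heap options passed at JVM creation actually took effect; note also that APT_BadAlloc is raised by the DataStage parallel engine's native allocator, so a larger Java heap may not be the whole story.]

```java
// Prints the heap limits the running JVM actually applied, so you can
// verify whether the -Xms/-Xmx options passed at JVM creation took effect.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMB = rt.maxMemory() / (1024 * 1024);     // effective max heap (-Xmx)
        long totalMB = rt.totalMemory() / (1024 * 1024); // currently committed heap
        long freeMB = rt.freeMemory() / (1024 * 1024);   // free within the committed heap
        System.out.println("max heap (MB): " + maxMB);
        System.out.println("committed (MB): " + totalMB);
        System.out.println("free (MB): " + freeMB);
    }
}
```

If the reported max heap does not match the -Xmx you set, the option never reached the JVM; if it does match, the allocation failure is likely happening outside the Java heap (native memory, process ulimits, or the engine's own heap).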
Pinned topic: APT_BadAlloc: Heap allocation failed.
This topic has been locked. 2 replies. Latest post: 2011-03-08T04:33:20Z by SystemAdmin.
RobertDickson (0600009JMM, 20 posts), accepted answer
Re: APT_BadAlloc: Heap allocation failed. Posted 2011-03-07T13:15:06Z in response to SystemAdmin.

Hi,
Does https://www-304.ibm.com/support/docview.wss?uid=swg21411997 help?
SystemAdmin (110000D4XK, 533 posts), accepted answer
Re: APT_BadAlloc: Heap allocation failed. Posted 2011-03-08T04:33:20Z in response to RobertDickson.

Hi, thank you for the reply. In that stage implementation I originally used a named pipe to transfer the records during processing, and that worked fine regardless of the number of records. I have since changed the implementation so that the records are passed as Java objects using JNI (in the code I create a JVM and set the minimum and maximum heap sizes), and now it is stuck with the BadAlloc error. The main cause of the error is that I changed the implementation, and I want the new implementation to work without errors. One more thing: I am new to DataStage, so please suggest some tips to get rid of this.