Problem: when writing a big file on Linux (SLES), a large amount of memory is occupied, and even after the operation completes, the memory is still not released.
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;

public class Test {
    public static void main(String[] args) {
        // big file path
        String oldPath = "/home/db2inst1/frdmon_bk/FRDMON.0.db2inst1.NODE0000.CATN0000.20071029120106.001_sdasdasd";
        String newPath = "/tmp/_COPIED_";
        int byteread = 0;
        try {
            BufferedInputStream bufIn = new BufferedInputStream(new FileInputStream(oldPath));
            BufferedOutputStream bufOut = new BufferedOutputStream(new FileOutputStream(newPath));
            byte[] buffer = new byte[1024 * 64];
            while ((byteread = bufIn.read(buffer)) != -1) {
                // [?] each loop appears to eat memory by the size of the buffer
                bufOut.write(buffer, 0, byteread);
            }
            // [?] memory stays occupied by the size of the big file and cannot be released
            // [?] if the new file is deleted, the memory is released
            // File newfile = new File(newPath);
            /* Run on Linux with:
             *   /opt/IBM/WebSphere/AppServer/java/bin/java /tmp/Test.class
             * After setting the WebSphere application JVM heap size to 16M~32M and running
             * inside WebSphere, the problem does not appear. But we have to set the heap
             * size to 0~512M for other programs. Running on Windows also shows no such
             * problem: the memory used is about the buffer size while running and is
             * released on completion.
             */
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
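As an alternative not raised in the thread (a sketch, with illustrative class and file names): `FileChannel.transferTo` asks the kernel to move the bytes between files directly, so the copy needs no 64 KB Java-side buffer at all.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.channels.FileChannel;

public class ChannelCopy {
    // Copy src to dst via FileChannel.transferTo; the kernel moves the
    // bytes, so the JVM heap holds no copy buffer of its own.
    static void copy(String src, String dst) throws IOException {
        FileInputStream in = new FileInputStream(src);
        FileOutputStream out = new FileOutputStream(dst);
        try {
            FileChannel inCh = in.getChannel();
            FileChannel outCh = out.getChannel();
            long pos = 0, size = inCh.size();
            // transferTo may move fewer bytes than requested, so loop.
            while (pos < size) {
                pos += inCh.transferTo(pos, size - pos, outCh);
            }
        } finally {
            in.close();
            out.close();
        }
    }

    public static void main(String[] args) throws IOException {
        copy(args[0], args[1]);
    }
}
```

Note that even with `transferTo`, Linux may still keep the file pages in the page cache; that cache memory is reclaimed by the kernel on demand and is not a leak.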
Pinned topic: Java read/write big file issue in Linux
1 reply. Latest post 2008-05-29T12:20:25Z by SystemAdmin.
ACCEPTED ANSWER — SystemAdmin, Re: Java read/write big file issue in Linux, 2008-05-29T12:20:25Z:
BufferedInputStream and BufferedOutputStream reduce physical reads and writes by storing data in memory. If this is causing you a problem, try without the Buffered objects.
Also try setting the objects to null when you are done with them.
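The reply's first suggestion, sketched (class and method names are mine, not from the thread): drop the Buffered wrappers and read straight into one fixed byte array, so that array is the only Java-side copy memory. Closing the streams in `finally` releases the OS file handles, which is what actually matters; nulling the references is not required.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class PlainCopy {
    // Unbuffered copy: one fixed 64 KB array is the only Java-side memory.
    static void copy(String src, String dst) throws IOException {
        FileInputStream in = new FileInputStream(src);
        FileOutputStream out = new FileOutputStream(dst);
        try {
            byte[] buffer = new byte[64 * 1024];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
        } finally {
            // closing releases the file handles; setting the stream
            // references to null afterwards does nothing extra
            in.close();
            out.close();
        }
    }

    public static void main(String[] args) throws IOException {
        copy(args[0], args[1]);
    }
}
```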