Block Restore Memory issue

Discussion in 'Plugin Development' started by rmb938, Sep 24, 2012.

Thread Status:
Not open for further replies.
  1. Offline

    rmb938

    So I have a bunch of files that contain block data to restore a certain area. The restore happens over 3-5 minutes so the server doesn't lag to death (1.4m-4.5m blocks). However, the issue is that these files are so big that when they are read to restore an area, memory usage increases by at least 300MB, and that memory is never collected by the garbage collector. To spread the work out, everything gets pushed into a queue and a repeating task restores a fixed batch every tick, roughly like the sketch below.
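
    Here BlockRecord and the batch size are placeholders for my actual format, so treat it as a rough sketch:

        import java.util.ArrayDeque;
        import java.util.Queue;

        import org.bukkit.Bukkit;
        import org.bukkit.World;
        import org.bukkit.plugin.java.JavaPlugin;

        public class RestoreTask implements Runnable {
            // Placeholder for whatever one record in the files decodes to
            public static class BlockRecord {
                int x, y, z, typeId;
                byte data;
            }

            // ~1000 blocks * 20 ticks/s = 20k blocks/s, so 4.5m blocks
            // finish in just under 4 minutes
            private static final int BLOCKS_PER_TICK = 1000;

            private final World world;
            private final Queue<BlockRecord> pending;
            private int taskId = -1;

            public RestoreTask(World world, Queue<BlockRecord> records) {
                this.world = world;
                this.pending = new ArrayDeque<BlockRecord>(records);
            }

            public void start(JavaPlugin plugin) {
                // Run on the main thread once per tick
                taskId = Bukkit.getScheduler()
                        .scheduleSyncRepeatingTask(plugin, this, 1L, 1L);
            }

            public void run() {
                // Restore a fixed batch each tick so the tick never stalls
                for (int i = 0; i < BLOCKS_PER_TICK && !pending.isEmpty(); i++) {
                    BlockRecord r = pending.poll();
                    world.getBlockAt(r.x, r.y, r.z)
                         .setTypeIdAndData(r.typeId, r.data, false);
                }
                // Stop the repeating task once the queue is drained
                if (pending.isEmpty() && taskId != -1) {
                    Bukkit.getScheduler().cancelTask(taskId);
                }
            }
        }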

    I thought about reading the files when the server starts up, but then the server uses 2GB of RAM right away. That is an issue right now since the server only has 3GB available, though it will be upgraded to 8GB in a week.

    I do call System.gc() when the restore is done, but it doesn't help. Any ideas on how to force the files out of memory or something?

    Well, I have no idea what I changed, but it doesn't seem to need as much memory anymore. The memory still never gets garbage collected, though.
     
  2. Offline

    gregthegeek

    Are you calling close() on the files? You can also try setting your unneeded variables to null so the garbage collector can reclaim what they point at.
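
    If you're on Java 7, try-with-resources makes it hard to forget the close(); something like this (readFully and the length handling are just for illustration):

        import java.io.DataInputStream;
        import java.io.FileInputStream;
        import java.io.IOException;

        public static byte[] readAll(String path, int length) throws IOException {
            byte[] buf = new byte[length];
            // The stream is closed automatically, even if readFully() throws
            try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
                in.readFully(buf);
            }
            return buf;
        }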
     
  3. Offline

    rmb938

    Yup, and I found the issue. Because I was using a MappedByteBuffer to read the files, it caused a memory leak: the mapped memory is only released when the buffer object itself is garbage collected, and Java gives you no way to unmap it explicitly, so heavy use of them piles up. I am now looking for another fast way to read files using NIO and convert a byte array to an object.
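
    Something like this is probably what I'll switch to: a plain FileChannel read into a normal heap buffer, which the GC treats like any other array. It assumes the file holds a standard serialized object, so it's just a sketch:

        import java.io.ByteArrayInputStream;
        import java.io.IOException;
        import java.io.ObjectInputStream;
        import java.nio.ByteBuffer;
        import java.nio.channels.FileChannel;
        import java.nio.file.Paths;
        import java.nio.file.StandardOpenOption;

        public static Object readObject(String path) throws IOException, ClassNotFoundException {
            try (FileChannel channel = FileChannel.open(Paths.get(path), StandardOpenOption.READ)) {
                // Heap buffer instead of map(): freed by the GC like any array
                ByteBuffer buf = ByteBuffer.allocate((int) channel.size());
                while (buf.hasRemaining() && channel.read(buf) != -1) {
                    // keep reading until the buffer is full or we hit EOF
                }
                // Convert the byte array back into an object (assumes the file
                // was written with standard Java serialization)
                try (ObjectInputStream in = new ObjectInputStream(
                        new ByteArrayInputStream(buf.array(), 0, buf.position()))) {
                    return in.readObject();
                }
            }
        }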
     