So I have a bunch of files that contain block data used to restore a certain area. The restore happens over 3-5 minutes so the server doesn't lag to death (1.4m-4.5m blocks). The issue is that these files are so big that reading them to restore an area increases memory usage by at least 300MB, and that memory is never collected by the garbage collector. I thought about reading the files when the server starts up, but then the server is using 2GB of RAM right away. That's a problem right now since the server only has 3GB to use, though it will be upgraded to 8GB in a week.

I do call System.gc() when the restore is done, but it doesn't help at all. Any ideas on how to force the files out of memory, or something along those lines?

Well, I have no idea what I did, but it doesn't seem to need as much memory anymore. It still never seems to get garbage collected, though.
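For context, here's a stripped-down sketch of the kind of streaming read I mean instead of loading a whole file into memory at once. The four-int record layout and the restoreBlock call are made up for illustration, not my actual format:

```java
import java.io.BufferedInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

public class AreaRestore {

    /**
     * Streams block records straight from disk instead of reading the whole
     * file into memory first. Nothing read here is stored in a field, so once
     * the method returns the buffers are unreachable and eligible for GC.
     */
    public void restore(File file) throws IOException {
        try (DataInputStream in = new DataInputStream(
                new BufferedInputStream(new FileInputStream(file)))) {
            try {
                while (true) {
                    // Made-up record layout: three coordinates plus a block id.
                    int x = in.readInt();
                    int y = in.readInt();
                    int z = in.readInt();
                    int blockId = in.readInt();
                    restoreBlock(x, y, z, blockId);
                }
            } catch (EOFException end) {
                // Reached the end of the file; nothing left to restore.
            }
        }
    }

    /** Placeholder for whatever actually queues/sets the block in the world. */
    private void restoreBlock(int x, int y, int z, int blockId) {
        // e.g. hand the change off to the slow 3-5 minute restore task
    }
}
```

The point of doing it this way is that nothing from the file ends up in a long-lived field or static list, so once the restore finishes there's nothing keeping that memory reachable.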