2

I'm writing a tool to translate a coded .txt file into a readable .xlsx file. (I need to work with .xlsx because I'm working with more than 256 columns, so I'm using XSSFWorkbook when writing the rows and columns.)

The coding part is OK; I know it because I verified it several times. But when I tried to add new coding maps to the code, it all ends in:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Unknown Source)
    at java.io.ByteArrayOutputStream.write(Unknown Source)
    at org.apache.poi.openxml4j.opc.internal.MemoryPackagePartOutputStream.write(MemoryPackagePartOutputStream.java:88)
    at org.apache.xmlbeans.impl.store.Cursor._save(Cursor.java:590)
    at org.apache.xmlbeans.impl.store.Cursor.save(Cursor.java:2544)
    at org.apache.xmlbeans.impl.values.XmlObjectBase.save(XmlObjectBase.java:212)
    at org.apache.poi.xssf.usermodel.XSSFSheet.write(XSSFSheet.java:2480)
    at org.apache.poi.xssf.usermodel.XSSFSheet.commit(XSSFSheet.java:2439)
    at org.apache.poi.POIXMLDocumentPart.onSave(POIXMLDocumentPart.java:196)
    at org.apache.poi.POIXMLDocumentPart.onSave(POIXMLDocumentPart.java:200)
    at org.apache.poi.POIXMLDocument.write(POIXMLDocument.java:204)
    at model.Conversione.traduzioneFile(Conversione.java:241)
    at model.Main.scriviFile(Main.java:76)
    at model.Main.main(Main.java:52)

The error occurs after 3000-4000 lines. The free memory behaves like this (the first reading is the starting value):

14443720 → 13572128 → 12078128 → 10575592 → 14126224 (each new rise stays below the starting value, then it decreases again) → 12559920 → 11811440 → 10229128 → 13751400 → ... → 13011080

The "coding maps" are maps generally of the type HashMap<Integer, HashMap<Integer, String>>. (I did it like this because I cannot use a database.)
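
A minimal, simplified version of one such map and of the lookup, just to show the shape. The field index, codes and translations here are invented examples, not my real data:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified shape of one coding map:
//   outer key  = column/field index,
//   inner map  = raw integer code -> human-readable translation.
// All concrete values below are invented for the demo.
public class CodingMapDemo {
    static final Map<Integer, HashMap<Integer, String>> CODING = new HashMap<>();
    static {
        HashMap<Integer, String> field0 = new HashMap<>();
        field0.put(1, "yes");
        field0.put(2, "no");
        CODING.put(0, field0);
    }

    // Translate one token of one field; fall back to the raw token if unmapped.
    static String translate(int field, String token) {
        HashMap<Integer, String> map = CODING.get(field);
        if (map == null) return token;          // no coding map for this field
        String translated = map.get(Integer.parseInt(token.trim()));
        return translated != null ? translated : token;
    }

    public static void main(String[] args) {
        for (String token : "1;2;9".split(";")) {
            System.out.println(translate(0, token));   // yes, no, 9
        }
    }
}
```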

So the procedure is mainly:

- instantiate and create the whole map first
- read a line from the .txt file
- split the line, take a token, translate it using the map and put it into rows and columns
- wb.write(fileOutputStream fos)
- fos.close()

I cannot understand why that error occurs, since the maps I added are not even taken into account in the translating operation...

And why is the amount of free memory so variable (but never back to the starting value)?

If I wasn't clear on some points, please ask. I don't know what to do.

At the beginning I thought it was a buffering problem caused by the increasing size of the .xlsx file (even though, as I said, nothing should have changed, because the new maps are not used).

Any hints appreciated.

Lucia

Lucia Belardinelli

5 Answers

2

You can inspect what is taking up memory using jvisualvm or jmap -histo:live. If the application really does need the memory, you can try increasing the heap limit with -Xmx.
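
For example (the pid `12345` and the jar name `translator.jar` below are placeholders for your own process and program):

```shell
jps -l                                # list running JVMs with their pids
jmap -histo:live 12345                # live-object histogram for pid 12345 (forces a full GC first)
java -Xmx1024m -jar translator.jar    # raise the maximum heap if the data really needs it
```

The first few lines of the histogram usually point straight at the offending class (e.g. byte[], HashMap$Entry, or a POI/xmlbeans type).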

Ashwinee K Jha
  • I suppose I can't increase memory because I still have to add a LOT of maps, so I'd come to the same point. Now I'm trying to understand how to use jvisualvm. Thanks a lot – Lucia Belardinelli Dec 06 '11 at 15:54
1

If you face this issue in Eclipse while running a standalone Java application: right-click your Java program, click "Run As -> Run Configurations", click the Arguments tab and type -Xms1024M -Xmx1024M in the VM arguments field. This increases the VM size when running from Eclipse.

Muthu
1

I advise you to use SXSSFWorkbook (requires Apache POI >= 3.8 beta 3) instead of XSSFWorkbook, as advised here.

final SXSSFWorkbook workbook = new SXSSFWorkbook(20);

creates a workbook whose row access window size is 20, i.e. at most 20 rows are kept in memory and the rest is flushed to disk.

There are some pitfalls to be aware of when doing so: avoid putting null values into your cells, and call SXSSFSheet.trackAllColumnsForAutoSizing() (requires Apache POI >= 3.15) as soon as possible if you need to call SXSSFSheet.autoSizeColumn(int) later to automatically compute the column widths.
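
A minimal sketch of the streaming approach (the sheet name, file name, loop bounds and cell values are illustrative, not taken from the question):

```java
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.xssf.streaming.SXSSFSheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class StreamingWrite {
    public static void main(String[] args) throws Exception {
        // Row access window of 20: only the 20 most recent rows stay on the heap,
        // older rows are flushed to a temporary file automatically.
        SXSSFWorkbook wb = new SXSSFWorkbook(20);
        try {
            SXSSFSheet sheet = wb.createSheet("translated");
            sheet.trackAllColumnsForAutoSizing();   // must happen before autoSizeColumn (POI >= 3.15)

            for (int r = 0; r < 5_000; r++) {
                Row row = sheet.createRow(r);
                for (int c = 0; c < 300; c++) {     // more than 256 columns is fine in .xlsx
                    row.createCell(c).setCellValue("v" + r + ":" + c);
                }
            }

            sheet.autoSizeColumn(0);
            try (FileOutputStream fos = new FileOutputStream("out.xlsx")) {
                wb.write(fos);
            }
        } finally {
            wb.dispose();   // deletes the temporary files backing the streaming sheet
        }
    }
}
```

Note the wb.dispose() call in the finally block: SXSSF writes its flushed rows to temp files, and forgetting to dispose leaks those files on disk.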

gouessej
0

By default, Eclipse will allocate up to 384 megabytes of Java heap memory. This should be ample for all typical development tasks. However, depending on the JRE that you are running, the number of additional plug-ins you are using, and the number of files you will be working with, you could conceivably have to increase this amount. Eclipse allows you to pass arguments directly to the Java VM using the -vmargs command line argument, which must follow all other Eclipse specific arguments. Thus, to increase the available heap memory, you would typically use:

eclipse -vmargs -Xmx<memory size>

with the value set to greater than "384M" (384 megabytes -- the default).

When using an Oracle (Sun) VM, you may also need to increase the size of the permanent generation memory. The default maximum is 64 megabytes, but more may be needed depending on your plug-in configuration and use. When the VM runs out of permanent generation memory, it may crash or hang during class loading. This failure is less common when using Sun JRE version 1.5.0_07 or greater. The maximum permanent generation size is increased using the -XX:MaxPermSize= argument:

eclipse -vmargs -XX:MaxPermSize=<memory size>

Noufal Panolan
  • This has nothing to do with Eclipse's memory. This has to do with the VM settings that Eclipse adds when it runs your tests and programs. You set those from the Run Configurations menu, in the Arguments tab. – Gray Dec 06 '11 at 15:56
0

This has nothing to do with buffering. You are running out of memory because you have too many objects in your heap and/or your -Xmx setting for maximum memory is too low. You can increase -Xmx in "Run Configurations" in Eclipse, in the Arguments tab corresponding to your test/utility: add -Xmx512m (or so) in the VM Arguments block.

But the real problem is that you are trying to store too many objects in memory at the same time. I'm not sure I understand your HashMap of HashMaps, but if there is a HashMap per line in your text file, that is going to take up a lot of memory. @AKJ's answer lists some tools you can use to diagnose which objects are taking up the most space; VisualVM is a good place to start.
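
To get a rough feel for how much heap the nested maps alone consume (independent of POI), a JDK-only sketch like this can help. The map shapes here (100 outer keys with 1000 entries each) and the "translation-..." values are invented for the demo:

```java
import java.util.HashMap;

public class MapFootprint {
    // Build a nested coding-map structure of the given shape.
    static HashMap<Integer, HashMap<Integer, String>> build(int outerCount, int innerCount) {
        HashMap<Integer, HashMap<Integer, String>> maps = new HashMap<>();
        for (int outer = 0; outer < outerCount; outer++) {
            HashMap<Integer, String> inner = new HashMap<>();
            for (int code = 0; code < innerCount; code++) {
                inner.put(code, "translation-" + outer + "-" + code);
            }
            maps.put(outer, inner);
        }
        return maps;
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory() - rt.freeMemory();

        HashMap<Integer, HashMap<Integer, String>> codingMaps = build(100, 1_000);

        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("outer maps: " + codingMaps.size());
        // Only a rough number: GC can run in between, so treat it as an order of magnitude.
        System.out.println("approx bytes consumed: " + (after - before));
    }
}
```

The Runtime numbers are only approximate; for the accurate per-class picture, use the jvisualvm or jmap approach from the other answer.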

Gray