I just implemented Pentaho at my company and set its memory to 12 GB. When we try to load 16 million rows from one table to another, it runs out of memory.
I thought Pentaho would clear memory when it performs the commit on the database, but apparently that doesn't happen. The out-of-memory exception is thrown at around row 2.5 million, which means that to load all 16 million I'd need a machine with roughly 73 GB of RAM? (Rough math, of course.)
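(For the record, the rough math, assuming memory use scales linearly with row count: if ~2.5 million rows fill a 12 GB heap, then 16 million rows would need on the order of 12 GB × 16 / 2.5 ≈ 75 GB.)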
Is there any parameter or configuration to make this work? This memory issue is limiting our loading capacity (the 16-million-row table is only one of many). I can't believe Pentaho just keeps filling memory until it bursts without ever clearing its cache.
My file D:\Pentaho\server\biserver-ee\tomcat\bin\service.bat has the following line:
"%EXECUTABLE%" //US//%SERVICE_NAME% ++JvmOptions "-Djava.io.tmpdir=%CATALINA_BASE%\temp;
-Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager;
-Djava.util.logging.config.file=%CATALINA_BASE%\conf\logging.properties;
-XX:MaxPermSize=256m" --JvmMs 2048 --JvmMx 12288
Does it have anything to do with the line below?
-XX:MaxPermSize=256m
Could someone explain to me what exactly it does?
Thanks in advance!
PS: This is my first contact with Pentaho, so I apologize for any unnecessary questions or assumptions.