I am running parallel MATLAB code on 12 processors on my institute's centralised Linux cluster. The code runs fine, but it fails to write the Excel files. I am using the xlwrite function and have all the relevant folders and files in the working directory. The error I get is:
Error using org.apache.poi.xssf.usermodel.XSSFCell/setCellValue
Java exception occurred: java.lang.OutOfMemoryError: GC overhead limit exceeded
How can I change the Java heap memory?
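From what I have read, MATLAB on Linux picks up extra JVM startup options from a java.opts file in the folder it is started from, so I assume a line like the following (the 4096m value is just a guess on my part) would raise the heap limit:

echo "-Xmx4096m" > java.opts

Is that the right approach on a headless cluster node, where I cannot open the MATLAB preferences dialog to change the Java heap setting?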
Each Excel file that should be generated has about 45,000 rows and six columns and contains both integer and floating-point data (I call xlwrite four times in the entire code). The input data variable used in the code is about 22 MB. The same code ran perfectly on a Windows machine with 32 GB of RAM, but I have lost access to that machine and now need to run the code on the Linux cluster. The cmd file I use to submit the job to the server is:
#!/bin/bash
#@ output = test.out
#@ error = test.err
#@ job_type = MPICH
#@ node = 1
#@ tasks_per_node = 16
#@ class = Medium128
#@ environment = COPY_ALL
#@ queue
Jobid=`echo $LOADL_STEP_ID | cut -f 6 -d .`
tmpdir=$HOME/scratch/job$Jobid
mkdir -p $tmpdir; cd $tmpdir
cp -R $LOADL_STEP_INITDIR/* $tmpdir
matlab < filterDataVal.m
mv ../job$Jobid $LOADL_STEP_INITDIR
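If java.opts is the right mechanism, I assume I would only need to write it into the scratch directory before MATLAB starts, since the script already does cd $tmpdir. The extra line below is my guess, not something I have been able to test:

mkdir -p $tmpdir; cd $tmpdir
cp -R $LOADL_STEP_INITDIR/* $tmpdir
# guessed addition: put the JVM heap option in the folder MATLAB is started from
echo "-Xmx4096m" > $tmpdir/java.opts
matlab < filterDataVal.m

Would creating the file there be enough for MATLAB to pick it up?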