I have two external APIs: one downloads data from and the other uploads data to PostgreSQL tables. The table is quite big. Over time, we observed that the server slowly keeps taking memory without releasing much of it, until it throws an error and quits (sometimes Linux kills the process). I checked the memory dump but cannot find anything in it that I can relate to my code. I don't use any local caching or anything like that. But today I got this:
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.sql.Timestamp.toString(Timestamp.java:350)
at java.lang.String.valueOf(String.java:2994)
at org.jooq.impl.AbstractParam.name(AbstractParam.java:107)
at org.jooq.impl.AbstractParam.<init>(AbstractParam.java:81)
at org.jooq.impl.AbstractParam.<init>(AbstractParam.java:77)
at org.jooq.impl.Val.<init>(Val.java:63)
at org.jooq.impl.DSL.val(DSL.java:15157)
at org.jooq.impl.Tools.field(Tools.java:1092)
at org.jooq.impl.Tools.fields(Tools.java:1226)
at org.jooq.impl.BatchSingle.executePrepared(BatchSingle.java:231)
at org.jooq.impl.BatchSingle.execute(BatchSingle.java:182)
at org.jooq.impl.BatchCRUD.executePrepared(BatchCRUD.java:159)
at org.jooq.impl.BatchCRUD.execute(BatchCRUD.java:100)
For fetching, I use the normal fetch function, and for dumping data into the DB, I use jOOQ's batchInsert and batchUpdate methods. Are there any good practices with jOOQ for dealing with large sets of data? Am I missing something?
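For context, the pattern I am describing can be sketched roughly like this. This is a minimal, simplified sketch, not my actual code: the `insertInChunks` helper and the chunk size of 1000 are illustrative (the jOOQ call would be something like `ctx.batchInsert(chunk).execute()` inside the consumer), and the generic consumer stands in for the real DB call so the chunking logic is visible on its own:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ChunkedBatch {

    // Split a large record list into fixed-size chunks and hand each chunk
    // to a batch action (in jOOQ, e.g. ctx.batchInsert(chunk).execute()).
    // Returns the number of batches executed.
    static <T> int insertInChunks(List<T> records, int chunkSize,
                                  Consumer<List<T>> batchAction) {
        int batches = 0;
        for (int i = 0; i < records.size(); i += chunkSize) {
            // subList is a view; each batch only ever touches chunkSize rows
            List<T> chunk = records.subList(i, Math.min(i + chunkSize, records.size()));
            batchAction.accept(chunk);
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 2500; i++) rows.add(i);

        // Placeholder consumer instead of a real DB call
        int batches = insertInChunks(rows, 1000, chunk -> { /* batchInsert here */ });
        System.out.println(batches); // 2500 rows in chunks of 1000 -> 3 batches
    }
}
```

My question is essentially whether bounding each batch like this (instead of passing the entire record set to one batchInsert call) is the expected practice, or whether jOOQ has a built-in mechanism for it.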