
I am loading multiple tables (20 tables, each with 4–6 million rows) from Oracle into Ignite as SQL tables using the JDBC thin driver, following https://ignite.apache.org/docs/latest/SQL/JDBC/jdbc-driver .

// Register JDBC driver.
Class.forName("org.apache.ignite.IgniteJdbcThinDriver");

// Open the JDBC connection.
Connection conn = DriverManager.getConnection("jdbc:ignite:thin://127.0.0.1");

Steps

For each table I create the connection object and statement within a try-with-resources block, then:

  1. create the table
  2. batch-insert the rows
  3. create the indexes
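The steps above can be sketched as follows. The `PERSON` table, its columns, and the batch size are hypothetical, not from my actual code; the point is that flushing and clearing the batch every few thousand rows keeps only one chunk of rows referenced on the client side at a time:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class IgniteLoader {
    static final int BATCH_SIZE = 10_000;

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:ignite:thin://127.0.0.1")) {
            // 1. Create the table.
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate("CREATE TABLE IF NOT EXISTS PERSON ("
                        + "ID LONG PRIMARY KEY, NAME VARCHAR)");
            }
            // 2. Batch-insert the rows, flushing every BATCH_SIZE rows.
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO PERSON (ID, NAME) VALUES (?, ?)")) {
                for (long id = 1; id <= 5_000_000; id++) {
                    ps.setLong(1, id);
                    ps.setString(2, "name-" + id);
                    ps.addBatch();
                    if (id % BATCH_SIZE == 0) {
                        ps.executeBatch();
                        ps.clearBatch(); // drop references to the flushed rows
                    }
                }
                ps.executeBatch(); // flush the final partial batch
            }
            // 3. Create the index after the load.
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate(
                        "CREATE INDEX IF NOT EXISTS PERSON_NAME_IDX ON PERSON (NAME)");
            }
        }
    }
}
```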

My Spring Boot app runs with a 52 GB heap, and before running this process it uses under 2 GB of heap space. After the data load into Ignite starts and completes, heap usage reaches 45 GB or more, and only after a couple of hours does it drop back under 10 GB. Any idea how to resolve this memory consumption issue?

I am clueless about how to load the data and have the memory reclaimed after each load.

Update: to add more context, I am working on a big calculation engine. After all the processing I push the output to Ignite, and then query Ignite and use the results in other child-node calculations. The whole process ran out of memory, so I tried to minimize the use case by loading the data from an Oracle table instead of the engine output.
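For one-shot bulk ingestion like this, the Ignite thin driver also documents a streaming mode (`SET STREAMING ON`) that buffers inserts and streams them to the cluster in the background instead of acknowledging each statement. A minimal sketch, with a hypothetical `CITY` table:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class StreamingLoad {
    static final String URL = "jdbc:ignite:thin://127.0.0.1";

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(URL)) {
            // Switch the connection into streaming mode.
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate("SET STREAMING ON");
            }
            // While streaming is on, inserts are buffered and streamed
            // asynchronously, so no explicit batching is needed.
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO CITY (ID, NAME) VALUES (?, ?)")) {
                for (long id = 1; id <= 1_000_000; id++) {
                    ps.setLong(1, id);
                    ps.setString(2, "city-" + id);
                    ps.executeUpdate();
                }
            }
            // Turning streaming off flushes any remaining buffered rows.
            try (Statement stmt = conn.createStatement()) {
                stmt.executeUpdate("SET STREAMING OFF");
            }
        }
    }
}
```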

  • What are you doing with this data? Without seeing more code it's hard to say what the issue is with any certainty, but it sounds like you are keeping a reference to everything, so it sticks around in memory. You should be able to grab/insert several thousand rows at a time, process them as needed, then discard them (ensure that there are no more references). If you work in chunks like this, then you'll be able to stay around the 2 GB memory mark with ease. – sorifiend Jul 15 '22 at 03:18
  • Also, how are you measuring the memory usage? If you are using something like the Windows task manager, then note that it may not be in-use memory; the JVM often hangs onto previously allocated memory long after it is no longer required. Maybe take a look here for some more info: [Does GC release back memory to OS?](https://stackoverflow.com/questions/30458195/does-gc-release-back-memory-to-os) – sorifiend Jul 15 '22 at 03:22
  • @sorifiend I have updated the question with more details. Regarding the memory stats, I checked the metrics printed in the application log at every interval. – vijay elango Jul 15 '22 at 09:50
  • What happens if you just reduce the heap size to, say, 15Gb? If it's not running short of heap space, the JVM might just decide not to waste time garbage collecting at that point. Does your application actually _need_ 45+Gb of memory? – Stephen Darlington Jul 15 '22 at 12:32
  • @StephenDarlington since it's a calculation engine and we do multithreaded processing of multiple formulas, we set the heap that high so that high-data-volume processing will not run out of memory. This is also the first time we are using this huge a volume of data in the app. – vijay elango Jul 15 '22 at 13:17
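As the comments point out, OS-level figures report memory the JVM has committed, not memory the application is actually using. The JVM's own view can be checked from inside the app, for example:

```java
public class HeapStats {
    // "used" is live objects plus not-yet-collected garbage; "committed" is
    // what the JVM has reserved from the OS (often much larger, since freed
    // memory may not be returned promptly); "max" is the -Xmx ceiling.
    static String heapSummary() {
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long committedMb = rt.totalMemory() / (1024 * 1024);
        long maxMb = rt.maxMemory() / (1024 * 1024);
        return "used=" + usedMb + "M committed=" + committedMb + "M max=" + maxMb + "M";
    }

    public static void main(String[] args) {
        System.out.println(heapSummary());
    }
}
```

Logging this before and after each table load would show whether the 45 GB figure is live data or just committed-but-idle heap.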

0 Answers