I am loading multiple tables (20 tables, each with 4-6 million rows) from Oracle into Ignite using the JDBC thin client driver, following https://ignite.apache.org/docs/latest/SQL/JDBC/jdbc-driver .
// Register JDBC driver.
Class.forName("org.apache.ignite.IgniteJdbcThinDriver");
// Open the JDBC connection.
Connection conn = DriverManager.getConnection("jdbc:ignite:thin://127.0.0.1");
Steps:
I create the connection object and statement inside a try-with-resources block, and then for each table:
- create the table
- batch insert the rows
- create the indexes.
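For reference, the per-table load looks roughly like this (the table name, columns, and batch size here are simplified placeholders, not my real schema):

```java
import java.sql.*;

public class IgniteLoad {
    // Placeholder batch size; my real value may differ.
    static final int BATCH_SIZE = 10_000;

    // src is the Oracle connection, ignite is the thin-driver connection.
    public static void load(Connection src, Connection ignite) throws SQLException {
        // 1. Create the table.
        try (Statement st = ignite.createStatement()) {
            st.execute("CREATE TABLE IF NOT EXISTS trades (" +
                       "id LONG PRIMARY KEY, amount DOUBLE)");
        }
        // 2. Batch insert the rows, flushing every BATCH_SIZE rows.
        try (Statement read = src.createStatement();
             ResultSet rs = read.executeQuery("SELECT id, amount FROM trades");
             PreparedStatement ins = ignite.prepareStatement(
                     "INSERT INTO trades (id, amount) VALUES (?, ?)")) {
            int pending = 0;
            while (rs.next()) {
                ins.setLong(1, rs.getLong(1));
                ins.setDouble(2, rs.getDouble(2));
                ins.addBatch();
                if (++pending == BATCH_SIZE) {
                    ins.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) ins.executeBatch();
        }
        // 3. Create the indexes after the data is in.
        try (Statement st = ignite.createStatement()) {
            st.execute("CREATE INDEX IF NOT EXISTS trades_amt ON trades (amount)");
        }
    }
}
```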
My Spring Boot app runs with a 52G max heap, and before this process it uses under 2G. After the data load into Ignite starts and completes, the heap reaches 45G or more, and only after a couple of hours does it ramp back down to under 10G. Any idea how to resolve this memory consumption issue?
I am clueless about how to load the data while keeping the memory cleared after each table load.
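I also saw that the thin driver documents a streaming mode (SET STREAMING ON), which as I understand it buffers inserts inside the driver and sends them asynchronously instead of accumulating large client-side batches. A minimal sketch of how my insert step would change (table and column names are again placeholders; I have not verified this fixes the heap growth):

```java
import java.sql.*;

public class StreamingSketch {
    // ignite is the thin-driver connection; rs is the Oracle result set.
    public static void loadWithStreaming(Connection ignite, ResultSet rs) throws SQLException {
        try (Statement st = ignite.createStatement()) {
            // Switch the connection into streaming mode; subsequent
            // INSERTs are buffered and shipped by the driver.
            st.execute("SET STREAMING ON");
        }
        try (PreparedStatement ins = ignite.prepareStatement(
                "INSERT INTO trades (id, amount) VALUES (?, ?)")) {
            while (rs.next()) {
                ins.setLong(1, rs.getLong(1));
                ins.setDouble(2, rs.getDouble(2));
                ins.executeUpdate(); // buffered by the driver while streaming
            }
        }
        try (Statement st = ignite.createStatement()) {
            // Turning streaming off flushes any remaining buffered data.
            st.execute("SET STREAMING OFF");
        }
    }
}
```

Would switching to this mode be the right way to keep the client heap flat between table loads?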
Update: to add more context, I am working on a big calculation engine. After all the processing, I push the output to Ignite and then query Ignite from other child-node calculations. This whole process went out of memory, so I tried to minimize the use case by isolating just the data load, reading from an Oracle table instead of the engine output.