
We have a Java application that reads all the records from a table and builds a JSON document for each one; finally, every JSON document is pushed to Elasticsearch. The application takes more than 3 hours to complete because there are more than 1 million records. We tried multithreading, but the performance was not up to the mark, so we now want to move to Spring Batch chunk processing.

But how do we implement this in Spring Batch? We cannot write a POJO class for each table; we just iterate over the ResultSet and build a JSON object. All the examples available use a POJO class.

The steps involved (a minimal sketch of our current code follows the list):

  1. Read the data from the database.
  2. Iterate over the ResultSet and build a JSON object for each row.
  3. Push all the JSON objects to Elasticsearch.
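
For reference, here is roughly what our current single-threaded code does (the table name, JDBC wiring, and the pushToElasticsearch helper are placeholders, not our real names):

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import org.json.simple.JSONObject;

public class CurrentExport {

    void exportAll(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM source_table")) { // placeholder
            ResultSetMetaData meta = rs.getMetaData();
            while (rs.next()) {
                JSONObject doc = new JSONObject();
                // copy every column into the document – no POJO anywhere
                for (int i = 1; i <= meta.getColumnCount(); i++) {
                    doc.put(meta.getColumnLabel(i), rs.getObject(i));
                }
                // ...conditional restructuring of the document happens here...
                pushToElasticsearch(doc.toJSONString()); // one call per record today
            }
        }
    }

    void pushToElasticsearch(String json) { /* HTTP call to ES, omitted */ }
}
```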

1 Answer


First consideration:

Have you considered returning your DB entity itself? That way you won't need a manual loop to convert every record; with a mapper like Jackson or Gson it's really easy.
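
For instance, even without a per-table entity you can let Spring's JdbcTemplate hand back each row as a Map and have Jackson serialize that directly. A minimal sketch (the query is a placeholder):

```java
import java.util.List;
import java.util.Map;
import org.springframework.jdbc.core.JdbcTemplate;
import com.fasterxml.jackson.databind.ObjectMapper;

public class RowToJson {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Each row comes back as Map<columnName, value>, so Jackson can
    // serialize it directly – no per-table POJO required.
    static void exportTable(JdbcTemplate jdbcTemplate) throws Exception {
        List<Map<String, Object>> rows =
                jdbcTemplate.queryForList("SELECT * FROM source_table"); // placeholder query
        for (Map<String, Object> row : rows) {
            String json = MAPPER.writeValueAsString(row);
            // index `json` into Elasticsearch here
        }
    }
}
```

Note that queryForList pulls the whole result set into memory, so for a million rows you would stream instead; the chunk-oriented reader in the Spring Batch sketch further down does exactly that.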

Second consideration:

Have you considered using a NoSQL database, such as MongoDB, to store these records? For this kind of workload it can have a significant performance advantage over a relational database.

Third consideration:

You can try Spring Batch, as you already mentioned, but at this point I can't say whether it will be more performant than the other solutions. Read this topic, it may help.
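
To address the no-POJO concern directly: a chunk-oriented step can read each row as a Map (via ColumnMapRowMapper), build the JSON string in the processor, and bulk-index a whole chunk per write. Below is a minimal sketch assuming a Spring Batch 4-style configuration; source_table, the step name, and the chunk size are placeholders, and the Elasticsearch bulk call is only outlined because it depends on which client you use:

```java
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;
import org.springframework.jdbc.core.ColumnMapRowMapper;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonExportStepConfig {

    // Reader: streams rows from the table as Map<columnName, value> – no POJO.
    public JdbcCursorItemReader<Map<String, Object>> reader(DataSource dataSource) {
        return new JdbcCursorItemReaderBuilder<Map<String, Object>>()
                .name("rowReader")
                .dataSource(dataSource)
                .sql("SELECT * FROM source_table") // placeholder query
                .rowMapper(new ColumnMapRowMapper())
                .build();
    }

    // Processor: apply the conditional restructuring here and emit the JSON string.
    public ItemProcessor<Map<String, Object>, String> processor() {
        ObjectMapper mapper = new ObjectMapper();
        return row -> mapper.writeValueAsString(row); // replace with your custom JSON building
    }

    // Writer: receives the whole chunk at once – the natural place for a bulk request.
    public ItemWriter<String> writer() {
        return jsonDocs -> {
            // build one Elasticsearch _bulk request from jsonDocs and send it;
            // client code omitted, as it depends on the ES client in use
        };
    }

    public Step exportStep(StepBuilderFactory steps, DataSource dataSource) {
        return steps.get("exportStep")
                .<Map<String, Object>, String>chunk(1000) // tune chunk size to your data
                .reader(reader(dataSource))
                .processor(processor())
                .writer(writer())
                .build();
    }
}
```

Much of the win over record-at-a-time code tends to come from the writer: one bulk request per chunk of documents instead of one HTTP call per document, plus Spring Batch's restart support if the job fails partway through.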

  • The JSON document we are building has a complex structure. We cannot directly use JSONPath for our use case; based on some conditions, we have a specific JSON format which we build using the json-simple library – Shala Jan 01 '23 at 14:08
  • Moreover, all the Spring Batch examples available are implemented using a POJO class. I want to do this without a POJO class – Shala Jan 01 '23 at 14:11