
I need to migrate some data from one table to another table, applying some processing in between. The processing is proprietary and is exposed as a REST service, so I need to read records from a table, call the REST service, and then write the processed records to another table. How do I implement this with Spring Batch so that the REST calls for multiple records are made in parallel, since that is where I expect most of the time per record to be spent? The ItemProcessor, where I plan to make the REST call, only accepts a single item in its process method.
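For context, the processing step being described presumably looks something like the sketch below, with one blocking REST call per item (the record types, class name, and endpoint URL are placeholders, not from the question):

```java
import org.springframework.batch.item.ItemProcessor;
import org.springframework.web.client.RestTemplate;

// Hypothetical single-item processor; SourceRecord/TargetRecord and the URL are placeholders.
public class RestCallingItemProcessor implements ItemProcessor<SourceRecord, TargetRecord> {

    private final RestTemplate restTemplate = new RestTemplate();

    @Override
    public TargetRecord process(SourceRecord item) {
        // One blocking HTTP call per record; in a plain single-threaded step
        // these calls run strictly one after another.
        return restTemplate.postForObject(
                "http://processing-service/api/process", item, TargetRecord.class);
    }
}
```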


2 Answers


When the processing is the bottleneck, you can use one of Spring Batch's scaling options, such as asynchronous item processing (AsyncItemProcessor/AsyncItemWriter), a multi-threaded step, or remote chunking.

There are some important considerations to pay attention to when choosing a scalability option; they are detailed here: https://stackoverflow.com/a/20342308/5019386
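For illustration, a multi-threaded step is the simplest of those options: each chunk is processed on its own thread, so several REST calls run concurrently. A rough sketch, assuming the reader, processor, and writer beans already exist (the bean names, chunk size, and thread limit below are placeholders):

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

public Step migrationStep(StepBuilderFactory steps,
                          ItemReader<SourceRecord> sourceTableReader,   // must be thread-safe, e.g. a JdbcPagingItemReader
                          ItemProcessor<SourceRecord, TargetRecord> restCallingProcessor,
                          ItemWriter<TargetRecord> targetTableWriter) {
    return steps.get("migrationStep")
            .<SourceRecord, TargetRecord>chunk(100)
            .reader(sourceTableReader)
            .processor(restCallingProcessor)               // the per-item REST call
            .writer(targetTableWriter)
            .taskExecutor(new SimpleAsyncTaskExecutor("rest-"))
            .throttleLimit(10)                             // at most 10 chunks (threads) in flight
            .build();
}
```

The usual trade-off is that the reader must be thread-safe and step restartability is limited, which is part of what makes the linked considerations worth reading.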

– Mahmoud Ben Hassine
  • I agree with @Mahmoud; I followed his hint and this is what I got: https://medium.com/@eddybayonne1/remote-chunking-with-spring-batch-integration-63c9df75e361 – Pascoal Eddy Bayonne Jan 04 '23 at 10:44

I had the same requirements and, as @Mahmoud said, there are some good approaches:

Remote chunking:

Processing and writing are done by the workers, while reading is done by the master. The master reads the records and then sends the actual items to the workers over the wire.

Remote Chunking in Practice
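A rough configuration sketch of that setup, using the builders provided by Spring Batch Integration (enabled with @EnableBatchIntegration). The request/reply channels would normally be backed by a message broker such as ActiveMQ or RabbitMQ, whose adapters are omitted here; the record types and bean names are placeholders:

```java
import org.springframework.batch.core.step.tasklet.TaskletStep;
import org.springframework.batch.integration.chunk.RemoteChunkingManagerStepBuilderFactory;
import org.springframework.batch.integration.chunk.RemoteChunkingWorkerBuilder;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.messaging.MessageChannel;

// Master (manager) side: only reads from the source table and ships the items to the workers.
public TaskletStep managerStep(RemoteChunkingManagerStepBuilderFactory factory,
                               ItemReader<SourceRecord> sourceTableReader,
                               MessageChannel requests, MessageChannel replies) {
    return factory.get("managerStep")
            .<SourceRecord, SourceRecord>chunk(100)
            .reader(sourceTableReader)
            .outputChannel(requests)   // items go out to the workers over the wire
            .inputChannel(replies)     // acknowledgements come back
            .build();
}

// Worker side: makes the REST call and writes the result to the target table.
public IntegrationFlow workerFlow(RemoteChunkingWorkerBuilder<SourceRecord, TargetRecord> worker,
                                  ItemProcessor<SourceRecord, TargetRecord> restCallingProcessor,
                                  ItemWriter<TargetRecord> targetTableWriter,
                                  MessageChannel requests, MessageChannel replies) {
    return worker
            .inputChannel(requests)
            .outputChannel(replies)
            .itemProcessor(restCallingProcessor)
            .itemWriter(targetTableWriter)
            .build();
}
```

Scaling then amounts to running more worker instances, so the REST calls run in parallel across workers rather than inside a single JVM.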

Asynchronous processing, which also allows you to scale your batch:

Tutorial Spring Batch Integration using Async Processing
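A minimal sketch of that option, assuming the same placeholder reader/processor/writer beans as above: the AsyncItemProcessor hands each REST call to a task executor and returns a Future, and the AsyncItemWriter unwraps the Futures before writing.

```java
import java.util.concurrent.Future;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.integration.async.AsyncItemProcessor;
import org.springframework.batch.integration.async.AsyncItemWriter;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

public Step asyncMigrationStep(StepBuilderFactory steps,
                               ItemReader<SourceRecord> sourceTableReader,
                               ItemProcessor<SourceRecord, TargetRecord> restCallingProcessor,
                               ItemWriter<TargetRecord> targetTableWriter) {

    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(10);   // up to 10 REST calls in flight at once
    executor.setMaxPoolSize(10);
    executor.initialize();

    // Submits the real REST-calling processor to the thread pool and returns Futures.
    AsyncItemProcessor<SourceRecord, TargetRecord> asyncProcessor = new AsyncItemProcessor<>();
    asyncProcessor.setDelegate(restCallingProcessor);
    asyncProcessor.setTaskExecutor(executor);

    // Waits for each Future and delegates the resolved items to the real writer.
    AsyncItemWriter<TargetRecord> asyncWriter = new AsyncItemWriter<>();
    asyncWriter.setDelegate(targetTableWriter);

    return steps.get("asyncMigrationStep")
            .<SourceRecord, Future<TargetRecord>>chunk(100)
            .reader(sourceTableReader)
            .processor(asyncProcessor)
            .writer(asyncWriter)
            .build();
}
```

This keeps reading single-threaded while the REST calls run in parallel, which matches where the question expects the time to be spent.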