I have a table of over 1 million customers. Each customer's information changes frequently, but a given customer only needs to be processed once a day. I have a Spring Batch job which
- reads a customer from customer table (JdbcCursorItemReader)
- processes the customer information (ItemProcessor)
- writes to the customer table (ItemWriter)
I want to run 10 jobs at once that all read from the one Customer table without any customer being read twice. Is this possible with Spring Batch, or is it something I will have to handle at the database level, e.g. with a crawlLog table as mentioned in this post?
I know that parameters can be passed to a job, so I could read all the customer ids up front and distribute them evenly across the 10 jobs. But would that be the right way of doing it?
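For what it's worth, the "distribute ids evenly" idea corresponds to the core logic of a range partitioner: split the contiguous id space into non-overlapping `[min, max]` ranges, one per worker, and have each worker's reader query only its range (e.g. `WHERE id BETWEEN ? AND ?`). Below is a minimal, framework-free sketch of that splitting step; in Spring Batch this would typically live inside a `Partitioner` whose per-worker `ExecutionContext` carries the bounds. The class and key names here (`IdRangePartitioner`, `minId`/`maxId`) are illustrative assumptions, not from the original post.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: divide a contiguous customer-id range into N non-overlapping
// partitions, one per concurrent worker, so no customer is read twice.
public class IdRangePartitioner {

    /** Returns {min, max} pairs that exactly cover [minId, maxId] with no overlap. */
    public static List<long[]> partition(long minId, long maxId, int workers) {
        List<long[]> ranges = new ArrayList<>();
        long total = maxId - minId + 1;
        long size = total / workers;       // base size of each partition
        long remainder = total % workers;  // spread any leftover ids over the first partitions
        long start = minId;
        for (int i = 0; i < workers; i++) {
            long end = start + size - 1 + (i < remainder ? 1 : 0);
            ranges.add(new long[] { start, end });
            start = end + 1;
        }
        return ranges;
    }
}
```

Each worker would then plug its pair into its reader's SQL as the `BETWEEN` bounds. Note this assumes roughly contiguous ids; with sparse ids the partitions stay disjoint but may be uneven in actual row count.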