As a newbie to the Batch Processing API (JSR-352), I am having trouble modeling the following (simplified) scenario:
- Suppose we have a `Batchlet` that produces a dynamic set of files in a first `step`.
- In a second `step`, all these files must be processed individually in `chunk`s (via `ItemReader`, `ItemProcessor` and `ItemWriter`), resulting in a new set of files.
- In a third `step`, these new files need to be packaged into one large archive.
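
To make the structure concrete, here is roughly the job XML skeleton I have in mind; the bean names (`fileProducerBatchlet`, `fileItemReader`, etc.) are just placeholders, and the second step is exactly the part I don't know how to express:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<job id="fileProcessingJob" xmlns="http://xmlns.jcp.org/xml/ns/javaee" version="1.0">

    <!-- Step 1: a Batchlet produces a dynamic set of files -->
    <step id="produceFiles" next="processFiles">
        <batchlet ref="fileProducerBatchlet"/>
    </step>

    <!-- Step 2: each produced file should be processed as its own chunk,
         but the number of files is only known after step 1 has run -->
    <step id="processFiles" next="archiveFiles">
        <chunk item-count="100">
            <reader ref="fileItemReader"/>
            <processor ref="fileItemProcessor"/>
            <writer ref="fileItemWriter"/>
        </chunk>
    </step>

    <!-- Step 3: a Batchlet packages the new files into one archive -->
    <step id="archiveFiles">
        <batchlet ref="archiveBatchlet"/>
    </step>

</job>
```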
I couldn't find a way to define the second step, because the specification doesn't seem to provide a loop construct (and in my understanding `partition`, `split` and `flow` only work for a set with a known, fixed size).
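
For example, the only partitioned variant I have found so far is a static plan, where the number of partitions (and their properties) has to be written into the job XML up front. A sketch of what I mean, again with placeholder names and a hypothetical `inputFile` property:

```xml
<step id="processFiles" next="archiveFiles">
    <chunk>
        <reader ref="fileItemReader">
            <properties>
                <!-- hypothetical property: one input file per partition -->
                <property name="inputFile" value="#{partitionPlan['inputFile']}"/>
            </properties>
        </reader>
        <processor ref="fileItemProcessor"/>
        <writer ref="fileItemWriter"/>
    </chunk>
    <partition>
        <!-- static plan: the partition count is fixed at two here,
             but in my scenario the number of files is only known at runtime -->
        <plan partitions="2">
            <properties partition="0">
                <property name="inputFile" value="input-0.dat"/>
            </properties>
            <properties partition="1">
                <property name="inputFile" value="input-1.dat"/>
            </properties>
        </plan>
    </partition>
</step>
```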
What could a job XML definition for this look like? Do I have to give up on the idea of chunking in the second step, or do I have to divide the task into multiple jobs? Is there another option?