@jeff-cook There are a few things that you can use together to achieve what you want. I have not tried these myself, but you can certainly try them and let Stack Overflow know if they worked. I came across this question very late, but if you still have this requirement, this might help.
- To read multiple tables, you can use a CompositeItemReader. You can add one reader per table to the composite reader, each with its own query. Each of these readers would have a row mapper that returns a specific model, such as Employee or Department. Then, in the DefaultUnifyingStringItemsMapper of the composite reader, you can combine those two records into one using a parent model, say EmployeeDepartment. At this point you have created the Employee and Department relationship as you want.
Note: Because of the way Spring Batch works, this will NOT give you a list; each individual row gets combined into one relationship item. By default, each such item is passed to the processor, where you can add further details per row. Only when all the items of a chunk have been processed do you have a list available, in the writer. The writer can then write it the way you want.
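To make the one-combined-item-per-read behaviour concrete, here is a minimal plain-Java sketch of the idea (the model names, the `CompositeEmployeeReader` class, and the in-memory lists are all hypothetical stand-ins; in a real job the delegates would be JDBC readers and the merge would live in the unifying items mapper):

```java
import java.util.Iterator;
import java.util.List;

// Hypothetical models -- stand-ins for your Employee/Department rows.
record Employee(int id, String name) {}
record Department(int id, String deptName) {}
// Parent model holding the combined relationship.
record EmployeeDepartment(Employee employee, Department department) {}

// Sketch of the composite-reader idea: each delegate yields one row per
// read() call, and the unifying step merges the parallel rows into a
// single parent item -- exactly one combined item per read, not a list.
class CompositeEmployeeReader {
    private final Iterator<Employee> employees;
    private final Iterator<Department> departments;

    CompositeEmployeeReader(List<Employee> emps, List<Department> depts) {
        this.employees = emps.iterator();
        this.departments = depts.iterator();
    }

    // Mirrors ItemReader.read(): returns null when input is exhausted.
    EmployeeDepartment read() {
        if (!employees.hasNext() || !departments.hasNext()) return null;
        // This is the role the unifying items mapper plays: combine the
        // current row from each delegate into one parent object.
        return new EmployeeDepartment(employees.next(), departments.next());
    }
}
```

Each call to `read()` hands the processor one `EmployeeDepartment`; the writer only ever sees the accumulated chunk.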
Just like in point 1 above, you can also fetch the details from MongoDB with its own reader and row mapper, then combine all three records into one relationship in the DefaultUnifyingStringItemsMapper. This can then be written as you want after being processed.
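Extending the same idea to three sources, the unifying step is just a function from the current row of each delegate to one parent item. A sketch, with all model names hypothetical:

```java
// Hypothetical models: rows from the two SQL readers plus a document
// fetched from MongoDB by a third delegate reader.
record Employee(int id, String name) {}
record Department(int id, String deptName) {}
record MongoDetail(int employeeId, String extraInfo) {}
record EmployeeRecord(Employee e, Department d, MongoDetail m) {}

// The unifying mapper's job, written as a plain function: take the
// current row from each of the three delegates and emit one parent item.
class UnifyingMapper {
    static EmployeeRecord unify(Employee e, Department d, MongoDetail m) {
        return new EmployeeRecord(e, d, m);
    }
}
```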
In both approaches above, you will need to create a separate DataSource for each type of database you read from and assign it to the respective reader. Note that if you explicitly assign a transaction manager to the step, the DataSource passed to it must be the same instance you passed to the writer, or you will see data-integrity issues on rollback. This is how Spring Batch works; it has nothing to do with the approach above.
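A configuration sketch of that wiring, assuming Spring's JDBC support; the bean names and JDBC URLs are purely illustrative, and the key point is that the step's transaction manager wraps the same DataSource instance the writer uses:

```java
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class MultiDataSourceConfig {

    // One DataSource per database; each reader gets its own.
    @Bean
    public DataSource employeeDataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setUrl("jdbc:postgresql://localhost:5432/hr"); // illustrative URL
        return ds;
    }

    @Bean
    public DataSource departmentDataSource() {
        DriverManagerDataSource ds = new DriverManagerDataSource();
        ds.setUrl("jdbc:mysql://localhost:3306/org"); // illustrative URL
        return ds;
    }

    // The transaction manager assigned to the step must wrap the SAME
    // DataSource instance the writer writes to, or rollbacks will not
    // cover the writes (assuming here that the writer targets the
    // employee database).
    @Bean
    public DataSourceTransactionManager stepTransactionManager(
            @Qualifier("employeeDataSource") DataSource writerDataSource) {
        return new DataSourceTransactionManager(writerDataSource);
    }
}
```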
If you need to validate the data from the three tables, you can use a ValidatingItemProcessor. If a row fails your validation, you can throw a ValidationException. If you want the entire step to fail when this exception is thrown, you can leave it at that. If instead you want only the failing item to be skipped and processing to continue with the remaining items, you will need to set a skip policy when building the step, e.g. by calling faultTolerant().skipPolicy(new AlwaysSkipItemSkipPolicy()), or whichever policy suits you.
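The skip behaviour can be approximated in plain Java to show what the fault-tolerant step does for you (the model, the validation rule, and the `SkippingProcessor` class are hypothetical; in a real step the try/catch loop is handled by Spring Batch once a skip policy is set):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical combined model produced by the composite reader.
record EmployeeRecord(int id, String name) {}

// Stand-in for org.springframework.batch.item.validator.ValidationException.
class ValidationException extends RuntimeException {
    ValidationException(String msg) { super(msg); }
}

class SkippingProcessor {
    // Mirrors Validator.validate(item): throw when the item is invalid.
    static void validate(EmployeeRecord item) {
        if (item.name() == null || item.name().isBlank()) {
            throw new ValidationException("name missing for id " + item.id());
        }
    }

    // Process a chunk, skipping items that fail validation -- this is
    // what faultTolerant().skipPolicy(new AlwaysSkipItemSkipPolicy())
    // effectively gives you around a ValidatingItemProcessor.
    static List<EmployeeRecord> process(List<EmployeeRecord> chunk) {
        List<EmployeeRecord> out = new ArrayList<>();
        for (EmployeeRecord item : chunk) {
            try {
                validate(item);
                out.add(item);
            } catch (ValidationException e) {
                // AlwaysSkipItemSkipPolicy: drop this item, keep going.
            }
        }
        return out;
    }
}
```

Without the skip policy, the first ValidationException would instead fail the step.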
If you must have a full list before you can fetch anything from MongoDB, you may try creating a List in the job's ExecutionContext, appending items to it after each read, and accessing the list in the processor. Then operate on this list as you wish. I doubt this will actually work, though, since the reader and processor are called one item at a time; the output of the processor is collected into a chunk and only then passed to the writer as a list. If this does not work for you, you may try reading from Mongo in the beforeWrite method of an ItemWriteListener, but that is not the right place to do all this.
I believe your work should be done if you follow the first four points above.