
I have a requirement to get the data from a database and write that data to files based on the filename given in the database.

This is how data is defined in the database:

FILE_NAME   REC_ID  NAME
file_1.csv  1       ABC
file_1.csv  2       BCD
file_1.csv  3       DEF
file_2.csv  4       FGH
file_2.csv  5       DEF
file_3.csv  6       FGH
file_3.csv  7       DEF
file_4.csv  8       FGH

As you can see, the file names are stored alongside the data in the database, so Spring Batch should read this data and write each record to the file named in its FILE_NAME column (i.e., file_1.csv should contain only records 1, 2, and 3; file_2.csv only records 4 and 5; and so on).
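For illustration, each database row might map to a simple item class like the one below (the class and accessor names are my assumptions, not something given in the question):

```java
// Hypothetical item class mapping one database row.
// Field names follow the columns from the question: FILE_NAME, REC_ID, NAME.
public class FileRecord {
    private final String fileName; // FILE_NAME: target file for this row
    private final int recId;       // REC_ID
    private final String name;     // NAME

    public FileRecord(String fileName, int recId, String name) {
        this.fileName = fileName;
        this.recId = recId;
        this.name = name;
    }

    public String getFileName() { return fileName; }
    public int getRecId() { return recId; }
    public String getName() { return name; }
}
```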

Is it possible to use MultiResourceItemWriter for this requirement? Please note that the entire file name is dynamic and needs to be retrieved from the database.

forumuser1

2 Answers


I'm not sure, but I don't think there is an easy way to achieve this out of the box. You could build your own ItemWriter like this:

import java.net.MalformedURLException;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStream;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.LineAggregator;
import org.springframework.core.io.UrlResource;

public class DynamicItemWriter implements ItemStream, ItemWriter<YourEntry> {

    // One FlatFileItemWriter per target file name, created lazily and reused.
    private Map<String, FlatFileItemWriter<YourEntry>> writers = new HashMap<>();

    private LineAggregator<YourEntry> lineAggregator;

    private ExecutionContext executionContext;

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        this.executionContext = executionContext;
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException {
    }

    @Override
    public void close() throws ItemStreamException {
        for (FlatFileItemWriter<YourEntry> f : writers.values()) {
            f.close();
        }
    }

    @Override
    public void write(List<? extends YourEntry> items) throws Exception {
        for (YourEntry item : items) {
            FlatFileItemWriter<YourEntry> ffiw = getFlatFileItemWriter(item);
            ffiw.write(Arrays.asList(item));
        }
    }

    public LineAggregator<YourEntry> getLineAggregator() {
        return lineAggregator;
    }

    public void setLineAggregator(LineAggregator<YourEntry> lineAggregator) {
        this.lineAggregator = lineAggregator;
    }

    public FlatFileItemWriter<YourEntry> getFlatFileItemWriter(YourEntry item) {
        String key = item.getFileName();
        FlatFileItemWriter<YourEntry> rr = writers.get(key);
        if (rr == null) {
            rr = new FlatFileItemWriter<>();
            rr.setLineAggregator(lineAggregator);
            try {
                UrlResource resource = new UrlResource("file:" + key);
                rr.setResource(resource);
                rr.open(executionContext);
            } catch (MalformedURLException e) {
                // Fail the step rather than silently caching a writer with no resource.
                throw new ItemStreamException("Invalid file name: " + key, e);
            }
            writers.put(key, rr);
            //rr.afterPropertiesSet();
        }
        return rr;
    }
}

and configure it as a writer:

<bean id="csvWriter" class="com....DynamicItemWriter">
    <property name="lineAggregator">
        <bean
         class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
            <property name="delimiter" value=","/>
            <property name="fieldExtractor" ref="csvFieldExtractor"/>
        </bean>
    </property>
</bean>
Liviu Stirb
  • Thanks Gilles for your answer and time. Because of the complex nature of this requirement, I gave up on customizing the framework for my needs and instead handled the file writing with a simple tasklet (of course, restarting the batch job is challenging). – forumuser1 Apr 30 '13 at 13:32
  • @forumuser1 How did you do it, btw? I am having the same issue, and dynamically creating ItemWriters gives many errors ... – Adelin May 09 '19 at 12:49
  • Thanks for this useful answer. In my case, I had a "File not writable" issue on the second writer.open, so I had to set @StepScope on the writer. Related issue here, because 'restarted' was true after the first run: https://stackoverflow.com/a/26729596/2641426 – DependencyHell Jun 19 '20 at 09:44

In spring-batch, you can do this using ClassifierCompositeItemWriter.

Since ClassifierCompositeItemWriter gives you access to your object during write, you can write custom logic to tell Spring which file each item should be written to.

Take a look at the sample below. ClassifierCompositeItemWriter needs an implementation of the Classifier interface. Below, I have created a lambda implementing the classify() method of the Classifier interface. The classify() method is where you create your ItemWriter. In the example below, we create a FlatFileItemWriter that gets the name of the file from the item itself and then creates a resource for it.

@Bean
public ClassifierCompositeItemWriter<YourDataObject> yourDataObjectItemWriter(
    Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier
) {
  ClassifierCompositeItemWriter<YourDataObject> compositeItemWriter = new ClassifierCompositeItemWriter<>();
  compositeItemWriter.setClassifier(itemWriterClassifier);
  return compositeItemWriter;
}

@Bean
public Classifier<YourDataObject, ItemWriter<? super YourDataObject>> itemWriterClassifier() {
  return yourDataObject -> {
    String fileName = yourDataObject.getFileName();

    BeanWrapperFieldExtractor<YourDataObject> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[]{"recId", "name"});
    DelimitedLineAggregator<YourDataObject> lineAggregator = new DelimitedLineAggregator<>();
    lineAggregator.setFieldExtractor(fieldExtractor);

    FlatFileItemWriter<YourDataObject> itemWriter = new FlatFileItemWriter<>();
    itemWriter.setResource(new FileSystemResource(fileName));
    itemWriter.setAppendAllowed(true);
    itemWriter.setLineAggregator(lineAggregator);
    itemWriter.setHeaderCallback(writer -> writer.write("REC_ID,NAME"));

    itemWriter.open(new ExecutionContext());
    return itemWriter;
  };
}

Finally, you can attach your ClassifierCompositeItemWriter in your batch step like you normally attach your ItemWriter.

@Bean
public Step myCustomStep(
    StepBuilderFactory stepBuilderFactory
) {
  return stepBuilderFactory.get("myCustomStep")
      .<YourDataObject, YourDataObject>chunk(1000)
      .reader(myCustomReader())
      .writer(yourDataObjectItemWriter(itemWriterClassifier(null)))
      .build();
}

NOTE: As pointed out in the comments by @Ping, a new writer will be created for each chunk, which is usually bad practice and not an optimal solution. A better solution is to maintain a map from file name to writer, so that each writer can be reused.
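Stripped of the Spring Batch types, the reuse idea is just lazy creation keyed by file name. A minimal, self-contained sketch of that caching pattern (the generic W here is a stand-in for FlatFileItemWriter, not a Spring Batch type):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal sketch of the writer-caching idea: create one writer per file name
// on first sight, then return the same instance for subsequent chunks.
class WriterCache<W> {
    private final Map<String, W> writers = new HashMap<>();
    private final Function<String, W> factory;

    WriterCache(Function<String, W> factory) {
        this.factory = factory;
    }

    // computeIfAbsent invokes the factory only when the file name is new.
    W forFile(String fileName) {
        return writers.computeIfAbsent(fileName, factory);
    }

    int size() {
        return writers.size();
    }
}
```

In the classifier, the lambda would then call something like `cache.forFile(yourDataObject.getFileName())` instead of constructing a new FlatFileItemWriter each time; the cached writers still need to be closed at the end of the step (for example, from a StepExecutionListener).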

Rash
  • In the above example, will a new writer be created for each chunk? If not, how does the `ClassifierCompositeItemWriter` know that it needs to reuse a `ItemWriter` based on some attribute in `YourObject`? – Ping Nov 16 '19 at 07:47
  • In the above example, a new writer will be created for each chunk, because `itemWriterClassifier` is called on every object it wants to write and creates a new writer on the spot. That is obviously bad and was written only for demonstration purposes. But I think you can easily save the reference to the writer and reuse it. https://stackoverflow.com/questions/53501152/how-to-use-classifier-with-classifiercompositeitemwriter – Rash Nov 16 '19 at 15:49
  • 1
    The example that you linked to doesn't consider dynamic file names. There are a fixed set of writer beans to chose from which is known at compile time itself rather than runtime. In the current example from the question, if the OP gets `file5` in the database FILE_NAME column in the future, there is no writer available to handle it. Writers should be instantiated at runtime and passed the file name to write to. Writers should also be cached so they can be reused if writing to the same file again. – Ping Nov 16 '19 at 18:23
  • You can simply create a hashmap or some key-value pair to dynamically create a writer if it doesn't exist, and then use the same one if it does exist. – Rash Nov 16 '19 at 18:52
  • 1
    Exactly. I believe this needs to be added to the answer? – Ping Nov 17 '19 at 03:30