
I have a job that writes each item to a separate file. To do this, the job uses a ClassifierCompositeItemWriter whose classifier returns a new FlatFileItemWriter for each item (code below).

    @Bean
    @StepScope
    public ClassifierCompositeItemWriter<MyItem> writer(@Value("#{jobParameters['outputPath']}") String outputPath) {

        ClassifierCompositeItemWriter<MyItem> compositeItemWriter = new ClassifierCompositeItemWriter<>();

        compositeItemWriter.setClassifier((item) -> {

            String filePath = outputPath + "/" + item.getFileName();

            // Write the "content" property of MyItem on each line
            BeanWrapperFieldExtractor<MyItem> fieldExtractor = new BeanWrapperFieldExtractor<>();
            fieldExtractor.setNames(new String[]{"content"});

            DelimitedLineAggregator<MyItem> lineAggregator = new DelimitedLineAggregator<>();
            lineAggregator.setFieldExtractor(fieldExtractor);

            // A brand-new writer is created for every item
            FlatFileItemWriter<MyItem> itemWriter = new FlatFileItemWriter<>();
            itemWriter.setResource(new FileSystemResource(filePath));
            itemWriter.setLineAggregator(lineAggregator);
            itemWriter.setShouldDeleteIfEmpty(true);
            itemWriter.setShouldDeleteIfExists(true);

            // Opened manually here; nothing ever calls close() on it
            itemWriter.open(new ExecutionContext());
            return itemWriter;

        });

        return compositeItemWriter;

    }

Here's how the job is configured:

    @Bean
    public Step step1() {
        return stepBuilderFactory
                .get("step1")
                .<String, MyItem>chunk(1)
                .reader(reader(null))
                .processor(processor(null, null, null))
                .writer(writer(null))
                .build();
    }

    @Bean
    public Job job() {
        return jobBuilderFactory
                .get("job")
                .incrementer(new RunIdIncrementer())
                .flow(step1())
                .end()
                .build();
    }

Everything works perfectly. All the files are generated as I expected. However, one of the files cannot be deleted. Just one. If I try to delete it, I get a message saying that "OpenJDK Platform binary" is using it. If I increase the chunk size to a number bigger than the amount of files I'm generating, none of the files can be deleted. It seems like there's an issue deleting the files generated in the last chunk, as if the respective writers were not being closed properly by the Spring Batch lifecycle or something.

If I kill the application process, I can delete the file.

Any idea why this could be happening? Thanks in advance!

PS: I'm calling `itemWriter.open(new ExecutionContext())` because if I don't, I get an "org.springframework.batch.item.WriterNotOpenException: Writer must be open before it can be written to".
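The symptom matches the code: every writer is opened by hand inside the classifier, but nothing ever closes it, so the OS keeps the last files locked until the JVM exits. As a plain-Java illustration of one possible workaround (this is a hypothetical sketch, not from the original code, and independent of Spring): cache one writer per file path and close them all when the step finishes, e.g. from a `StepExecutionListener`'s `afterStep`. A `LinkedHashMap` is enough here since chunk processing is single-threaded by default.

```java
import java.io.Closeable;
import java.io.IOException;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical helper: remembers every writer it hands out so that
// closeAll() can release the underlying file handles at the end of the step.
public class WriterTracker<W extends Closeable> {

    private final Map<String, W> openWriters = new LinkedHashMap<>();

    // Return the cached writer for this path, creating (and remembering)
    // it on first use, so each file is opened exactly once.
    public W forPath(String path, Function<String, W> factory) {
        return openWriters.computeIfAbsent(path, factory);
    }

    // Close every writer created so far; call this when the step ends.
    public void closeAll() throws IOException {
        for (W writer : openWriters.values()) {
            writer.close();
        }
        openWriters.clear();
    }

    public int openCount() {
        return openWriters.size();
    }
}
```

In the classifier, `forPath(filePath, p -> createAndOpenWriter(p))` would replace the unconditional `new FlatFileItemWriter<>()`, and the listener would call `closeAll()` so the file handles are released without killing the process.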

EDIT:

If someone is facing a similar problem, I suggest reading Mahmoud's answer to this question: Spring batch : ClassifierCompositeItemWriter footer not getting called.
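The key point from that answer can be sketched as follows. If the set of delegate writers can be created up front (here `writerA` and `writerB` are hypothetical step-scoped beans, not names from the original code), registering each one on the step with `.stream(...)` lets Spring Batch drive their `open()`/`update()`/`close()` lifecycle, so no manual `open()` is needed in the classifier and the file handles are released when the step ends. This is a sketch under those assumptions, not a drop-in replacement:

```java
    @Bean
    public Step step1(FlatFileItemWriter<MyItem> writerA,
                      FlatFileItemWriter<MyItem> writerB) {
        return stepBuilderFactory
                .get("step1")
                .<String, MyItem>chunk(1)
                .reader(reader(null))
                .processor(processor(null, null, null))
                .writer(writer(null))
                // Registered as ItemStreams: Spring Batch now opens and,
                // crucially, closes each delegate writer itself.
                .stream(writerA)
                .stream(writerB)
                .build();
    }
```

The classifier then only chooses between the pre-built, already-managed writers instead of constructing and opening new ones per item.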

1 Answer
You are probably using the item writer outside of the step scope when doing this:

 itemWriter.open(new ExecutionContext());

Please check this question; I hope it helps you.

Ivan Costa
  • Thank you very much for answering. I think the problem was exactly that. The link you provided, together with this other one https://stackoverflow.com/questions/67604628/spring-batch-classifiercompositeitemwriter-footer-not-getting-called, helped solve my problem. The key point seems to be the requirement of registering the writers as streams in the step so that open/update/close are called correctly. I don't think I can choose this as the answer or upvote it because I don't have enough reputation, though. –  Feb 23 '22 at 02:25
  • Also, it's funny that I come to a foreign site to ask for help and end up getting help from another Brazilian, haha. –  Feb 23 '22 at 02:26
  • When you have enough reputation, please come back and do it. LOL! We've got each other's backs, brother. – Ivan Costa Feb 24 '22 at 02:21