
I'm having trouble with Spring Batch regarding the configuration of my custom writer, which is basically a RepositoryItemWriter:

@Bean
@StepScope
public ItemReader<DTO> itemReader() {
    // [...] reading from database and mapping into DTO class
    return reader;
}

@Bean
@StepScope
public ItemProcessor<DTO, Entity> itemProcessor(DtoMapper mapper) { // DtoMapper: the MapStruct mapper (type name assumed)
    return dto -> {
        dto.check();
        return mapper.toEntity(dto);
    };
}

@Bean
@StepScope
public ItemWriter<Entity> itemWriter() {
    // [...] save into database from repository
    return writer;
}

@Bean
public Step step(ItemReader<DTO> itemReader, ItemWriter<Entity> itemWriter) { // beans injected as parameters
    return stepBuilderFactory.get("step")
            .<DTO, Entity>chunk(500)
            .reader(itemReader)
            .writer(itemWriter)
            .build();
}
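
Since the writer body is elided above, here is a minimal sketch of what a RepositoryItemWriter-based writer typically looks like. The repository name (EntityRepository) is hypothetical; RepositoryItemWriterBuilder ships with Spring Batch:

@Bean
@StepScope
public ItemWriter<Entity> itemWriter(EntityRepository repository) { // EntityRepository: hypothetical Spring Data repository
    return new RepositoryItemWriterBuilder<Entity>()
            .repository(repository)  // the Spring Data repository to delegate to
            .methodName("save")      // repository method invoked for each item in the chunk
            .build();
}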

I am using MapStruct to map the DTO to the Entity within the processor. Even though the configuration seems right, my writer actually receives DTO items instead of Entity instances and thus cannot persist them.

Some complementary (possibly irrelevant) information about the structure of the batch: I'm reading from a large file and splitting it into smaller files. Then I partition my step with a multi resource partitioner; the processor does a few format checks, then the writer batch-inserts the items into the database.
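
For what it's worth, a partitioned setup like the one described above usually looks something like the following sketch. The resource pattern, grid size, and bean names are assumptions, not the actual code:

@Bean
public Step masterStep(Step workerStep) throws IOException {
    // One partition per split file; each worker executes the
    // reader/processor/writer chain against its own resource.
    MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
    partitioner.setResources(new PathMatchingResourcePatternResolver()
            .getResources("file:/tmp/splits/*.csv")); // pattern is an assumption
    return stepBuilderFactory.get("masterStep")
            .partitioner(workerStep.getName(), partitioner)
            .step(workerStep)
            .gridSize(4) // only a hint here: MultiResourcePartitioner creates one partition per resource
            .taskExecutor(new SimpleAsyncTaskExecutor())
            .build();
}

Each step-scoped reader can then resolve its own file through #{stepExecutionContext['fileName']}, the key MultiResourcePartitioner uses by default.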

Edit: I guess I could have just copy/pasted the generated source earlier, but the MapperImpl is pretty straightforward:

@Override
public Entity toEntity(DTO dto) {
    if (dto == null) {
        return null;
    }
    Entity entity = new Entity();
    // [Bunch of controls and mapping]
    return entity;
}
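
For context, that implementation is what MapStruct generates from a mapper interface along these lines (a sketch; the interface name and the Spring component model are assumptions):

@Mapper(componentModel = "spring") // org.mapstruct.Mapper; makes the generated impl a Spring bean
public interface DtoMapper {
    Entity toEntity(DTO dto);
}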

That's pretty much it.

Thank you for your help.

  • `my writer is actually receiving DTO items instead of Entity and thus cannot persist them.`: How did you come to that conclusion? If this is the case, I don't see how the code compiles. Are you using JPA? Are your entities attached to a persistence context before being written? That's probably the reason why they are not persisted. Which transaction manager do you use? That can play a role as well. – Mahmoud Ben Hassine May 27 '20 at 08:57
  • Thank you for your comment. Spring Boot is really auto-configuring almost everything in this batch. To give you more context: at first I was only using the entity from reader to writer and had no issue persisting the entities. Due to processing needs, I decided to use a DTO and map it to my entity so it matches the database table. The writer throws this: `org.springframework.beans.NotReadablePropertyException: Invalid property 'id' of bean class [my.package.custom.dto.DTO]: Could not find field for property during fallback access!`, and I see the list of DTOs while debugging the writer. – Bruno Trinta May 27 '20 at 11:38
  • I actually saw your answer on this post [Customize parameters of a step in a spring batch applications](https://stackoverflow.com/questions/58636144/customize-parameters-of-a-step-in-spring-batch-application/58690386#58690386) and used your example to illustrate my batch. I will check if something can be added to my original post to better illustrate it. – Bruno Trinta May 27 '20 at 11:45
  • Thank you @MahmoudBenHassine and Alex, you were both right. Stupid mistake on my side: I took the example from the post I linked, but in my own batch the processor was not used in the step. So items were going straight from reader to writer. That's what you get from coding in the middle of the night: confusion and stupid mistakes ^^. Sorry for the loss of time. – Bruno Trinta May 27 '20 at 11:56
  • ok no worries, good to know you fixed your issue. – Mahmoud Ben Hassine May 27 '20 at 11:57
  • Thank you for your help. @MahmoudBenHassine does it seem right that the compilation worked, though? My reader is a FlatFileItemReader and my writer is an ItemWriter. – Bruno Trinta May 27 '20 at 12:06
  • yes, that's because you specified the input/output types in `.chunk(500)`. – Mahmoud Ben Hassine May 27 '20 at 12:10

2 Answers


return mapper.toEntity(dto);

Perhaps the problem is in the mapper implementation. It's hard to say how the mapper works without the implementation source.

Alex
  • I edited my question to add the MapperImpl generated code because I wasn't able to properly add the code within the comment. – Bruno Trinta May 27 '20 at 08:38
  • `.processor(itemProcessor())` Why doesn't it take any arg, if your ItemProcessor has a mapper arg: `public ItemProcessor itemProcessor(mapper)`? – Alex May 27 '20 at 09:10
  • I'm actually using bean injection; the parentheses were a mistake in the code I wrote to illustrate my post. – Bruno Trinta May 27 '20 at 09:16
  • Check that you don't have another `itemProcessor` bean. – Alex May 27 '20 at 09:23
  • I have no other bean of this type, and I'm using the qualifier annotation to prevent that kind of problem. I have restarted, invalidated caches, and rebuilt the application with a Maven clean install. No improvement. I'll remove the lambda in the processor to investigate and will keep you updated. – Bruno Trinta May 27 '20 at 10:13

A mistake from coding during the night, I guess. The processor was not declared in the step, so items were going straight from the reader to the writer without being processed and transformed into entities.

@Bean
@StepScope
public ItemReader<DTO> itemReader() {
    // [...] reading from database and mapping into DTO class
    return reader;
}

@Bean
@StepScope
public ItemProcessor<DTO, Entity> itemProcessor(DtoMapper mapper) { // DtoMapper: the MapStruct mapper (type name assumed)
    return dto -> {
        dto.check();
        return mapper.toEntity(dto);
    };
}

@Bean
@StepScope
public ItemWriter<Entity> itemWriter() {
    // [...] save into database from repository
    return writer;
}

@Bean
public Step step(ItemReader<DTO> itemReader,
                 ItemProcessor<DTO, Entity> itemProcessor,
                 ItemWriter<Entity> itemWriter) {
    return stepBuilderFactory.get("step")
            .<DTO, Entity>chunk(500)
            .reader(itemReader)
            .processor(itemProcessor) // Edit with solution: this line was missing
            .writer(itemWriter)
            .build();
}

Still wondering why it compiled, though.
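
For anyone else wondering: a simplified paraphrase of Spring Batch's SimpleStepBuilder generics shows why the step type-checks even without the processor (signatures paraphrased, bodies elided):

// .<DTO, Entity>chunk(500) yields roughly a SimpleStepBuilder<I, O>
// with I = DTO and O = Entity.
public class SimpleStepBuilder<I, O> {
    // an ItemReader<DTO> satisfies ItemReader<? extends I>
    public SimpleStepBuilder<I, O> reader(ItemReader<? extends I> reader) { return this; }
    // the processor is optional, so omitting .processor(...) is legal
    public SimpleStepBuilder<I, O> processor(ItemProcessor<? super I, ? extends O> processor) { return this; }
    // an ItemWriter<Entity> satisfies ItemWriter<? super O>
    public SimpleStepBuilder<I, O> writer(ItemWriter<? super O> writer) { return this; }
}
// Without a processor configured, items pass through unchanged at runtime,
// so the writer receives DTO instances despite being declared as ItemWriter<Entity>.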