
I have found many examples of using multiple writers in this forum. Most, if not all, of the answers focus on CompositeItemWriter and ClassifierCompositeItemWriter.

Business Need: Read a single line from an input file. This line will contain multiple fields (over 50) which need to be written to their own database tables (in theory, representing different classes).

                               ----- claimwriter(write to claim table)
                              /
                             /
claimlineitemprocessor  -----
                             \
                              \
                               ----- pharmacywriter(write to pharmacy table)

I have used a fieldset mapper to create the object representing the claim line (ClaimLine). Most of the fields are a simple mapping to the data in the file, but a few need to have their format changed or related field mapping logic.

Basic item writer code looks like this:

@Bean
public ItemWriter<ClaimLine> writer() {
    CompositeItemWriter<ClaimLine> cWriter = new CompositeItemWriter<>();

    JdbcBatchItemWriter<ClaimLine> claimWriter = new JdbcBatchItemWriter<>();
    claimWriter.setItemSqlParameterSourceProvider(new ClaimItemSqlParameterSourceProvider());
    claimWriter.setSql( /* would like to insert into claim table */ );
    claimWriter.setDataSource(dataSource);
    claimWriter.afterPropertiesSet();

    JdbcBatchItemWriter<ClaimLine> pharmacyWriter = new JdbcBatchItemWriter<>();
    pharmacyWriter.setItemSqlParameterSourceProvider(new PharmacyItemSqlParameterSourceProvider());
    pharmacyWriter.setSql( /* would like to insert into pharmacy table */ );
    pharmacyWriter.setDataSource(dataSource);
    pharmacyWriter.afterPropertiesSet();

    List<ItemWriter<? super ClaimLine>> mWriter = new ArrayList<>();
    mWriter.add(claimWriter);
    mWriter.add(pharmacyWriter);
    cWriter.setDelegates(mWriter);

    // other code

    return cWriter;
}

When creating the custom source providers, each of them seems to expect a ClaimLine, because that is the class that has already been mapped to the input line and contains the values I would like to send to the respective tables.

This is basically where I am now stuck: I am thinking I cannot use a CompositeItemWriter because I am trying to transform a single object into two different ones. And ClassifierCompositeItemWriter works like a router, sending each item down a path specific to a condition, which is not what I want to do.

For reference, I tried doing something similar with Spring Integration and also hit a similar roadblock.

Any help is appreciated.

Tony Edwards
  • are you doing chunk based processing? do you have a processor too in your spring batch step in addition to reader and writer? – Sabir Khan Sep 27 '16 at 15:41
  • Technically the chunks are 1 at a time due to a few different business issues. I do have a processor wired in but it does nothing at this time (just a code stub). – Tony Edwards Sep 27 '16 at 18:29
  • 2
    Why don't you just use two ItemPreparedStatementSetter? It gives you rhe possibility to map the fields of ClaimLine to the appropriate parameters of the two Insert-sqls – Hansjoerg Wingeier Sep 27 '16 at 19:56
  • 1
    You have to write a custom itemwriter able to perform multiple write as described in http://stackoverflow.com/questions/18999724/spring-batch-one-reader-multiple-processors-and-writers instead of return a CompositeItemWriter – Luca Basso Ricci Sep 28 '16 at 09:43
  • 1
    @Hansjoerg - Using ItemPreparedStatementSetter is not preferred because I would like the code to be more readable for developers. But your comment has made me understand the use of ItemSqlParameterSourceProvider better and I am retrying that now. – Tony Edwards Sep 28 '16 at 14:27

4 Answers


I believe the comments from @Hansjoerg and @Luca are valuable responses to this question, and both approaches were researched while working toward an answer.

I was able to resolve this issue by continuing to use an ItemSqlParameterSourceProvider; the code is below. When I initially explored this class and its methods, my thought was that I was still only operating on the ClaimLine class.

What is actually happening is that createSqlParameterSource receives the item from the writer, and you are supplying values for the SQL statement you set with setSql(String sql). With an ItemSqlParameterSourceProvider, each put keys on a named parameter in that SQL. The code below shows only the claim side; the pharmacy side is similar.

public class ClaimItemSqlParameterSourceProvider implements ItemSqlParameterSourceProvider<ClaimLine> {

    @SuppressWarnings({ "serial"})
    @Override
    public SqlParameterSource createSqlParameterSource(final ClaimLine item) {
        return new MapSqlParameterSource(new HashMap<String, Object>() {
            {
                put("rxclaimid", item.getRxClaimID());
                ...
                // many more
            }
        });
    }
}

A custom item writer may also have resolved this issue, but it seems to require more code to support it. In the end, either an ItemPreparedStatementSetter or an ItemSqlParameterSourceProvider should be fine for this case. The main reason we chose the latter is that the parameters are clearly named, instead of accessing parameter values by index (1, 2, 3, etc.) and using "?" placeholders in the setSql call.
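To make the key-to-parameter matching concrete, here is a toy, Spring-free sketch of how the map keys line up with the named parameters in the SQL. The SQL and column names are invented for illustration, and Spring binds values through JDBC rather than by string substitution; this only shows the matching rule.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class NamedParamSketch {

    // Toy stand-in for named-parameter binding: replace each :name token
    // with the value stored under that key. (A real driver binds and escapes
    // values properly; this is only to show how the keys must match.)
    static String bind(String sql, Map<String, Object> params) {
        Matcher m = Pattern.compile(":(\\w+)").matcher(sql);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            Object v = params.get(m.group(1));
            m.appendReplacement(out, Matcher.quoteReplacement(String.valueOf(v)));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        // Hypothetical claim insert; real column names depend on your schema.
        String sql = "INSERT INTO claim (rx_claim_id, member_id) VALUES (:rxclaimid, :memberid)";
        Map<String, Object> params = new LinkedHashMap<>();
        params.put("rxclaimid", "RX123"); // same key as in createSqlParameterSource
        params.put("memberid", "M456");
        System.out.println(bind(sql, params));
        // INSERT INTO claim (rx_claim_id, member_id) VALUES (RX123, M456)
    }
}
```

The key passed to put must match the :name in the SQL exactly; a mismatch leaves the parameter unbound.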

Tony Edwards

You can use a chain to write to multiple tables:

<int:chain input-channel="processTransactionChannel" output-channel="processedItems">
  <int:header-enricher>
    <int:header name="savePayload" expression="payload" />
  </int:header-enricher>

  <int-jpa:updating-outbound-gateway
      auto-startup="true"
      native-query="insert into TableOne (transaction_Status, bank_Reference_Number)
          values (:transactionStatus, :bankReferenceNumber)"
      entity-manager="entityManager"
      use-payload-as-parameter-source="false">
    <int-jpa:transactional />
    <int-jpa:parameter name="transactionStatus" expression="payload['transactionStatus']" />
    <int-jpa:parameter name="bankReferenceNumber" expression="payload['bankReferenceNumber']" />
  </int-jpa:updating-outbound-gateway>

  <int:transformer expression="headers.savePayload" />

  <int-jpa:updating-outbound-gateway
      native-query="insert into PARTNER_RESPONSE_DETAILS (PARTNER_ID, BANK_REFERENCE_NUMBER, REQUEST_STRING, RESPONSE_STRING)
          values (:partnerId, :bankRefNumber, :requestString, :responseString)"
      entity-manager="entityManager">
    <int-jpa:transactional />
    <int-jpa:parameter name="partnerId" expression="payload['partnerId']" />
    <int-jpa:parameter name="bankRefNumber" expression="payload['bankRefNumber']" />
    <int-jpa:parameter name="requestString" expression="payload['requestString']" />
    <int-jpa:parameter name="responseString" expression="payload['responseString']" />
  </int-jpa:updating-outbound-gateway>

  <int:transformer expression="headers.savePayload" />
</int:chain>

This code updates two tables, and it is working for me.

Praveen Jain

Maybe my answer is what you expect. I hit the same problem and used ClassifierCompositeItemWriter to resolve it. I define my classifier in my project; as you can see, you can decide which writer to use based on your own logic. For example, in my logic:

if (baseEntity instanceof Product) {
    return productItemWriter;
} else {
    return otherItemWriter;
}

Good luck.

Vimukthi

Not clear on what you mean to say here:

When creating the custom source providers, each of them seem[s] to expect [a ClaimLine] because that is the class that has already been mapped to the input line and contain[s] the values I would like to send to the respective tables.

But what I get from the question is that you want to read one line from a file with N columns and break it down to write to two or more tables. Before that, you want to change the format of some columns and derive some others.

That should be doable with CompositeItemWriter.

  1. Read the line from file
  2. In the mapper, create the class object and populate its fields. This is where you can change the format of any field in the line before assigning it to an object property. Similarly, you can derive another property from other properties in the read line and assign it to the object. So there would be a property for each field you are interested in; you can keep one property for the original field and one for the reformatted field if you need to refer to the original data.
  3. This model instance is passed to the processor, where you can process the data. In fact, this is another place where you can do transformations, i.e., change formats and make derived columns.
  4. Make a writer for each table you want to write to, and assign a data source to each writer. Then assign this list of writers to a CompositeItemWriter. Note that this setup will work only if both writers connect to the same data source and that same DataSource instance is assigned to the step's transaction manager; otherwise rollbacks would cause data-integrity issues.
  5. Create an ItemSqlParameterSourceProvider for each respective writer. The same data object, with its 50 or however many properties, is passed to each provider, but each provider returns a different parameter source corresponding to the table you want to write to.
  6. Done! This will read each line from the file, process it to apply your transformations, then write to multiple tables in the same transaction. Commits and rollbacks will be consistent.

Model to read each line:

class LineModel
{
   public String tab1Prop1;
   public boolean tab1Prop2;
   public int tab1PropN;
   public String tab1ReformattedProp1;
   public String tab2DerivedFieldFromProp2AndN;
   public String tab2Prop3;
}

In the line mapper, you can populate the above model, mapping line fields to properties and also reformatting and deriving the other fields. Or do the reformatting and deriving in the processor, since that is what it is meant for.
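As a rough, Spring-free sketch of the fan-out in steps 4 and 5 (field and column names are invented): one line model goes in, and each table's provider extracts only its own parameter map.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FanOutSketch {

    // One flat model per input line, as in step 2 (values hard-coded for demo).
    static class LineModel {
        String tab1Prop1 = "a";
        int tab1PropN = 42;
        String tab2Prop3 = "z";
        String tab2Derived = "a-42"; // derived in the mapper/processor (step 3)
    }

    // Each "provider" picks out only the columns its table needs (step 5).
    static Map<String, Object> table1Params(LineModel m) {
        Map<String, Object> p = new LinkedHashMap<>();
        p.put("prop1", m.tab1Prop1);
        p.put("propN", m.tab1PropN);
        return p;
    }

    static Map<String, Object> table2Params(LineModel m) {
        Map<String, Object> p = new LinkedHashMap<>();
        p.put("prop3", m.tab2Prop3);
        p.put("derived", m.tab2Derived);
        return p;
    }

    public static void main(String[] args) {
        LineModel m = new LineModel();
        System.out.println(table1Params(m)); // {prop1=a, propN=42}
        System.out.println(table2Params(m)); // {prop3=z, derived=a-42}
    }
}
```

In the real job, each of these extraction methods would live in that writer's ItemSqlParameterSourceProvider and return a MapSqlParameterSource instead of a plain map.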

Alternatively, you can compose the above model from two models, one for each table:

class LineModel
{
   public Tab1 tab1Cols;
   public Tab2 tab2Cols;
}

class Tab1
{
   public String tab1Prop1;
   public boolean tab1Prop2;
   public int tab1PropN;
   public String tab1ReformattedProp1;
}

class Tab2
{
   public String tab2DerivedFieldFromProp2AndN;
   public String tab2Prop3;
}

Now, in each writer's ItemSqlParameterSourceProvider, build the parameter source from lineModelObj.tab1Cols or lineModelObj.tab2Cols, depending on which writer you are dealing with. At this point you know which model's properties are written to which table, so you can create the SQL in each writer accordingly.
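A rough stand-in for what each provider does with the composed model (names invented; in real Spring code you could wrap the sub-object in a BeanPropertySqlParameterSource, which reads JavaBean getters rather than public fields as this toy does):

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public class ComposedModelSketch {

    // Hypothetical sub-model holding only table 1's columns.
    static class Tab1 {
        public String tab1Prop1 = "v1";
        public int tab1PropN = 7;
    }

    // The composed line model; the tab1 writer's provider only ever sees tab1Cols.
    static class LineModel {
        public Tab1 tab1Cols = new Tab1();
    }

    // Toy equivalent of building a parameter source from the sub-model:
    // expose each public field as a named SQL parameter.
    static Map<String, Object> paramsFrom(Object bean) {
        Map<String, Object> p = new LinkedHashMap<>();
        for (Field f : bean.getClass().getFields()) {
            try {
                p.put(f.getName(), f.get(bean));
            } catch (IllegalAccessException e) {
                throw new IllegalStateException(e); // fields are public, cannot happen
            }
        }
        return p;
    }

    public static void main(String[] args) {
        LineModel line = new LineModel();
        Map<String, Object> params = paramsFrom(line.tab1Cols);
        System.out.println(params.get("tab1Prop1")); // v1
        System.out.println(params.get("tab1PropN")); // 7
    }
}
```

Each parameter name then matches a :name placeholder in that writer's insert statement, so the two SQL statements stay independent of each other.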

And that's it! It should do your job!