
Our application uses Oracle 11.2 as its database. Because we did not want to mix the Spring Batch metadata tables with the regular application tables, we had a new schema created. But when trying to configure the two as separate datasources, I keep getting the error below:

        // configuration of the first (batch metadata) datasource

        @Configuration
        @EnableBatchProcessing
        public class BatchConfig {
            private static final Logger logger = LoggerFactory.getLogger(BatchConfig.class);

            // ...

            @Bean
            @ConfigurationProperties(prefix = "spring.batch.datasource")
            public DataSource getBatchDataSource() {
                return DataSourceBuilder.create().build();
            }

            // ...
        }


        // second (application) datasource
        @Configuration
        @EnableTransactionManagement
        @EnableJpaRepositories(
            entityManagerFactoryRef = "appEntityManagerFactory",
            transactionManagerRef = "appTransactionManager",
            basePackages = {"com.xyz.abc.repository"}
        )
        public class ApplicationDBConfig {

            @Primary
            @Bean(name = "appDataSource")
            @ConfigurationProperties(prefix = "spring.datasource")
            public DataSource dataSource() {
                return DataSourceBuilder.create().build();
            }

            @Primary
            @Bean(name = "appEntityManagerFactory")
            public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder,
                    @Qualifier("appDataSource") DataSource dataSource) {
                return builder.dataSource(dataSource).packages("com.xyz.abc.model").persistenceUnit("app").build();
            }

            @Primary
            @Bean(name = "appTransactionManager")
            public PlatformTransactionManager transactionManager(
                    @Qualifier("appEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
                return new JpaTransactionManager(entityManagerFactory);
            }
        }
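
For reference, the two `@ConfigurationProperties` prefixes above would map to entries along these lines in `application.properties`. The URLs, service names, and credentials here are placeholders, not values from the original project; note that with Spring Boot 2.x and `DataSourceBuilder` (HikariCP by default), the property is typically `jdbc-url` rather than `url`:

```properties
# application datasource (primary, bound via prefix spring.datasource)
spring.datasource.jdbc-url=jdbc:oracle:thin:@//dbhost:1521/APPSVC
spring.datasource.username=app_user
spring.datasource.password=app_pass
spring.datasource.driver-class-name=oracle.jdbc.OracleDriver

# batch metadata datasource (bound via prefix spring.batch.datasource)
spring.batch.datasource.jdbc-url=jdbc:oracle:thin:@//dbhost:1521/BATCHSVC
spring.batch.datasource.username=batch_user
spring.batch.datasource.password=batch_pass
spring.batch.datasource.driver-class-name=oracle.jdbc.OracleDriver
```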

Error:

    Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
    2019-04-02 14:56:16,706 ERROR [restartedMain] org.springframework.boot.SpringApplication: Application run failed
            org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'jobInvokerController': Unsatisfied dependency expressed through field 'processLiborFeedJob'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'processLiborFeedJob' defined in class path resource [com/db/sts/marketdata/batch/config/ReutersMarketDataReadConfig.class]: Initialization of bean failed; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'taskBatchExecutionListener' defined in class path resource [org/springframework/cloud/task/batch/configuration/TaskBatchAutoConfiguration$TaskBatchExecutionListenerAutoconfiguration.class]: Unsatisfied dependency expressed through method 'taskBatchExecutionListener' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.cloud.task.configuration.SimpleTaskAutoConfiguration': Invocation of init method failed; nested exception is java.lang.IllegalStateException: To use the default TaskConfigurer the context must contain no more than one DataSource, found 2
                at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:596)
                at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:90)
  • Per the exception, have you created a `TaskConfigurer`? – Michael Minella Apr 02 '19 at 19:57
  • (mistakenly left an incomplete edit) Added a TaskConfigurer pointing at the "second" datasource, but then got an exception, possibly because it mixes up the datasource the application uses for reads/writes with the one Spring Batch uses to write metadata: `Caused by: org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?]; nested exception is java.sql.SQLSyntaxErrorException: Table 'secondarydf.batch_job_instance' doesn't exist.` – Arpit S Apr 03 '19 at 03:37
  • I need to see the code to be able to help... – Michael Minella Apr 03 '19 at 21:03
  • Added sample project to https://github.com/jobas2007/spring_proj – Arpit S Apr 04 '19 at 08:08
  • After multiple iterations I have found a solution that seems to be working; the latest code is committed to GitHub. I am still unsure whether the solution is clean, as some of the extra code could perhaps be avoided. Any suggestions to refactor or improve it? – Arpit S Apr 04 '19 at 23:35
  • Further: the next step is deploying the app to Spring Cloud Data Flow 2.x, but there I again ran into challenges when connecting to Oracle (a separate datasource is needed there as well). I posted the question https://stackoverflow.com/questions/55399872/getting-oracle-11-2-unsupported-error-when-using-spring-cloud-data-flow-2-0-1 and also opened an issue with SCDF on GitHub: https://github.com/spring-cloud/spring-cloud-dataflow/issues/3116#issuecomment-478788977 . Any suggestion is helpful. – Arpit S Apr 04 '19 at 23:36
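
Following up on the `TaskConfigurer` hint in the comments: Spring Cloud Task's `DefaultTaskConfigurer` accepts the `DataSource` its metadata tables should live in, which resolves the "found 2" failure. A minimal sketch, assuming the batch datasource bean from the question (its bean name defaults to the factory method name, `getBatchDataSource`):

```java
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.cloud.task.configuration.DefaultTaskConfigurer;
import org.springframework.cloud.task.configuration.TaskConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TaskConfig {

    // Pin Spring Cloud Task's metadata (TASK_EXECUTION etc.) to the
    // dedicated batch/metadata DataSource so the auto-configuration no
    // longer fails on finding two DataSource beans in the context.
    @Bean
    public TaskConfigurer taskConfigurer(
            @Qualifier("getBatchDataSource") DataSource batchDataSource) {
        return new DefaultTaskConfigurer(batchDataSource);
    }
}
```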

1 Answer


The error `Table 'secondarydf.batch_job_instance' doesn't exist` happens because, by default, Spring Batch will use the `dataSource` bean annotated with `@Primary`, and not the `getBatchDataSource` bean as you are expecting.

From the Javadoc of the `@EnableBatchProcessing` annotation:

If multiple DataSources are defined in the context, the one annotated with Primary will be used

The related issue is BATCH-2537. So in your case, you can make your `BatchConfig` class extend `DefaultBatchConfigurer` and override `setDataSource` with the datasource you want to use for batch.
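
A minimal sketch of that approach, reusing the batch datasource bean from the question (Spring injects the qualified bean into the overridden autowired setter):

```java
import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class BatchConfig extends DefaultBatchConfigurer {

    @Bean
    @ConfigurationProperties(prefix = "spring.batch.datasource")
    public DataSource getBatchDataSource() {
        return DataSourceBuilder.create().build();
    }

    // Override the autowired setter so the JobRepository and JobExplorer
    // write the BATCH_* metadata tables to the dedicated datasource
    // instead of the @Primary application one.
    @Override
    @Autowired
    public void setDataSource(@Qualifier("getBatchDataSource") DataSource dataSource) {
        super.setDataSource(dataSource);
    }
}
```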

  • Thanks. I tried adding a `BatchConfigurer` override, but all tables still end up in the application schema; the GitHub link was posted in an earlier comment. Please clone the project if time permits: `BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource) { return new DefaultBatchConfigurer(batchDataSource()); }` – Arpit S Apr 04 '19 at 15:53
  • Made it work with a few tweaks, thanks for the suggestions/help. The code is updated on GitHub; please also see the new comments added to the previous posts. Now trying the next step of connecting to Oracle 11.2, where the Flyway dependency with Spring Boot 2.x is causing issues. Will keep you posted. – Arpit S Apr 04 '19 at 23:38
  • Still facing an issue using the Spring Cloud Task project (with 2 datasources) with Spring Cloud Data Flow, as the execution of the "Task" is NOT triggering the underlying configured batch jobs. Posted the same to the Spring GitHub: https://github.com/spring-cloud/spring-cloud-task/issues/594 . – Arpit S Apr 14 '19 at 01:53
  • When using 2 datasources (one for Spring metadata and a second for the application), I am facing intermittent issues with the "final" saves/commits to the application datasource. Other posts such as https://stackoverflow.com/questions/25540502/use-of-multiple-datasources-in-spring-batch suggest using a `ChainedTransactionManager`. Does anyone have an example or any other suggestion? It appears that two separate transaction contexts are running, causing unpredictable behavior. – Arpit S Jun 08 '19 at 20:34
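
Regarding the last comment: `ChainedTransactionManager` lives in spring-data-commons (it was deprecated in later releases in favor of proper JTA/XA solutions). A rough sketch, assuming the `appTransactionManager` bean from the question plus a hypothetical `batchTransactionManager` bean for the metadata datasource; transactions begin in list order and commit/roll back in reverse, and this is best-effort coordination, not a two-phase commit:

```java
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.transaction.ChainedTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class ChainedTxConfig {

    // Chains the batch-metadata and application transaction managers.
    // "batchTransactionManager" is a placeholder bean name for a
    // DataSourceTransactionManager over the batch datasource.
    @Bean
    public PlatformTransactionManager chainedTransactionManager(
            @Qualifier("batchTransactionManager") PlatformTransactionManager batchTx,
            @Qualifier("appTransactionManager") PlatformTransactionManager appTx) {
        return new ChainedTransactionManager(batchTx, appTx);
    }
}
```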