I am developing a Spring Boot-based application in which I would like to create two beans: one pointing to an Oracle database, the other to Hive. I've declared them as follows:
@Bean
public BoneCPDataSource metadataDataSource() {
    BoneCPDataSource boneCPDataSource = new BoneCPDataSource();
    boneCPDataSource.setDriverClass(getDriver());
    boneCPDataSource.setJdbcUrl(getJdbcUrl());
    boneCPDataSource.setUser(getUser());
    boneCPDataSource.setPassword(getPassword());
    boneCPDataSource.setMaxConnectionsPerPartition(5);
    boneCPDataSource.setPartitionCount(5);
    return boneCPDataSource;
}
@Bean
public BasicDataSource hiveDataSource() {
    BasicDataSource basicDataSource = new BasicDataSource();
    // Note: In a separate command window, use port forwarding like this:
    //
    //   ssh -L 127.0.0.1:9996:<server>:<port> -l <userid> <server>
    //
    // and then log in as the generic user.
    basicDataSource.setDriverClassName("org.apache.hadoop.hive.jdbc.HiveDriver");
    basicDataSource.setUrl("jdbc:hive://127.0.0.1:9996:10000/mytable");
    return basicDataSource;
}
The problem is that at startup I am getting this:
Exception in thread "main"
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration':
Injection of autowired dependencies failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Could not
autowire field: private javax.sql.DataSource
org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration.dataSource;
nested exception is
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No
qualifying bean of type [javax.sql.DataSource] is defined: expected
single matching bean but found 2: metadataDataSource,hiveDataSource
This is mainly because both of them implement javax.sql.DataSource, so the auto-configuration cannot pick a single candidate. What's the best way to fix this?
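To be concrete about the ambiguity: judging from the stack trace, the actuator auto-configuration injects a plain, unqualified javax.sql.DataSource field, roughly like the illustration below (this snippet is just my reading of the error, not code from my project), so with two DataSource beans in the context there is no single match:

@Autowired
private javax.sql.DataSource dataSource; // metadataDataSource or hiveDataSource? Spring can't choose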
EDIT:
Now I've declared them as follows:
@Bean(name = "metadataDataSource")
public BoneCPDataSource metadataDataSource() {
    BoneCPDataSource boneCPDataSource = new BoneCPDataSource();
    boneCPDataSource.setDriverClass(getDriver());
    boneCPDataSource.setJdbcUrl(getJdbcUrl());
    boneCPDataSource.setUser(getUser());
    boneCPDataSource.setPassword(getPassword());
    boneCPDataSource.setMaxConnectionsPerPartition(5);
    boneCPDataSource.setPartitionCount(5);
    return boneCPDataSource;
}
@Bean(name = "hiveDataSource")
public BasicDataSource hiveDataSource() {
    BasicDataSource basicDataSource = new BasicDataSource();
    // Note: In a separate command window, use port forwarding like this:
    //
    //   ssh -L 127.0.0.1:9996:<server>:<port> -l <userid> <server>
    //
    // and then log in as the generic user.
    basicDataSource.setDriverClassName("org.apache.hadoop.hive.jdbc.HiveDriver");
    basicDataSource.setUrl("jdbc:hive://127.0.0.1:9996:10000/mytable");
    return basicDataSource;
}
And I got the same exception:
Exception in thread "main"
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name
'org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration':
Injection of autowired dependencies failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Could not
autowire field: private javax.sql.DataSource
org.springframework.boot.actuate.autoconfigure.EndpointAutoConfiguration.dataSource;
nested exception is
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No
qualifying bean of type [javax.sql.DataSource] is defined: expected
single matching bean but found 2: metadataDataSource,hiveDataSource
The other classes refer to these beans as follows:
public class MetadataProcessorImpl implements MetadataProcessor {
    @Autowired
    @Qualifier("metadataDataSource")
    BoneCPDataSource metadataDataSource;
    // ...
}

@Controller
public class HiveController {
    @Autowired
    @Qualifier("hiveDataSource")
    BasicDataSource hiveDataSource;
    // ...
}
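From what I've read, giving the beans explicit names doesn't help the auto-configuration because its injection point has no @Qualifier, whereas marking one bean as @Primary (org.springframework.context.annotation.Primary) is supposed to make it the default for any unqualified DataSource injection, while the @Qualifier-based injections above keep resolving by name. A minimal sketch of what I think that would look like; I haven't verified that this is the right fix, hence the question:

@Primary
@Bean(name = "metadataDataSource")
public BoneCPDataSource metadataDataSource() {
    // same BoneCP setup as above; @Primary only marks this bean as the default
    // candidate when a DataSource is injected without a @Qualifier
    BoneCPDataSource boneCPDataSource = new BoneCPDataSource();
    boneCPDataSource.setDriverClass(getDriver());
    boneCPDataSource.setJdbcUrl(getJdbcUrl());
    boneCPDataSource.setUser(getUser());
    boneCPDataSource.setPassword(getPassword());
    return boneCPDataSource;
}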