
I am using Spring Boot starter 2.2.0.RELEASE for my application. I have one Kafka consumer, and I want to inject my Spring beans into it (the Kafka consumer is not managed by the Spring container).

I tried ApplicationContextAware, but it didn't help: I am getting applicationContext as null in my Kafka consumer, so I am not able to get the bean from the Spring container. The first time, applicationContext gets set properly, but when the context loads a second time it is set to null. Below are the details of my application in short.

@SpringBootApplication
@ComponentScan({"com.xyz.config_assign.service"})
public class ConfigAssignServer {

    private static Logger log = LoggerFactory.getLogger(ConfigAssignServer.class);

    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(ConfigAssignServer.class, args);

        log.info("Started ConfigAssign Server!!! AppName= {} ", applicationContext.getApplicationName());

        QkafkaClient.loadConfiguration();
    }

}

All my application classes are in com.xyz.config_assign.service, so there is no bean-scanning problem. It worked well before I added the Kafka consumer.

My ApplicationContextProvider, which uses the well-known ApplicationContextAware:

@Component
public class ApplicationContextProvider implements ApplicationContextAware{

    public static ApplicationContext applicationContext;


    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        ApplicationContextProvider.applicationContext = applicationContext;
    }

    public static ApplicationContext getApplicationContext() {
        return applicationContext;
    }
}

And now my Kafka consumer:

public class ConfigAssignmentAssetTagChangeKafkaTopicProcessor implements BatchTopicProcessor<String, String> {

    private Logger log = LoggerFactory.getLogger(ConfigAssignmentAssetTagChangeKafkaTopicProcessor.class);


    private AssignConfigServiceImpl assignConfigServiceImpl;

    public  ConfigAssignmentAssetTagChangeKafkaTopicProcessor(){
        ApplicationContext applicationContext = ApplicationContextProvider.getApplicationContext();
        assignConfigServiceImpl = applicationContext.getBean(AssignConfigServiceImpl.class);
    }

    @Override
    public void handleError(ConsumerRecords<String, String> arg0, Exception arg1) {
        // TODO Auto-generated method stub

    }

    @Override
    public void process(ConsumerRecords<String, String> records, long arg1) {}
}

And the service I want to inject into the Kafka consumer:

@Service
public class AssignConfigServiceImpl implements AssignConfigService {

    private Logger log = LoggerFactory.getLogger(AssignConfigServiceImpl.class);

    @Autowired
    private ConfigProfileDBService dbService;



    public boolean assignConfigToAgents(List<UUID> agentList, long customerId) {
        List<AgentConfigurationProfile> configProfiles = 
        dbService.getConfigurationProfilesForCustomer(customerId);
        boolean isAssignSuccessful = assignConfigToAgents(agentList, customerId, configProfiles);
        log.info("Config Assignment status ={}", isAssignSuccessful);

        return isAssignSuccessful;

    }
}

The other service I used is:

@Service
public class ConfigProfileDBService implements DBService {

    private static Logger log = LoggerFactory.getLogger(ConfigProfileDBService.class);

    @Autowired
    private JdbcTemplate jdbcTemplate;


    public List<AgentConfigurationProfile> getConfigurationProfilesForCustomer(Long customerId) {
        List<AgentConfigurationProfile> agentConfigList = new ArrayList<>();
        // query via jdbcTemplate omitted in this post
        return agentConfigList;
    }

}

Can someone please let me know what went wrong? I tried multiple online solutions, but none of them worked for me. Thanks in advance. Note: I haven't instantiated any Spring-managed class using the new operator.

Datta
    "context managed by spring" is by definition mostly just that: a set of beans where dependency injection works... objects "out of the context" are, by definition, those where it doesn't. Your question just states the obvious. – fdreger Feb 26 '20 at 07:51
  • yes, but we can use Spring beans outside Spring-managed entities (I referred to a few links: https://stackoverflow.com/questions/18347518/spring-autowiring-not-working-from-a-non-spring-managed-class and https://dzone.com/articles/autowiring-spring-beans-into-classes-not-managed-by-spring) – Datta Feb 26 '20 at 07:55
  • Could you please clarify what "it didn't work till now" means - are you getting an exception? If so, please add it to the question. Also, could you explain why not make the Kafka consumer Spring-managed (from the question I understand that you can't do that and hence you need hacks with a static holder for the application context)? I'm asking because there are probably other workarounds that could be suggested by the community – Mark Bramnik Feb 26 '20 at 07:55
  • @MarkBramnik I am getting applicationContext as null in my kafka consumer – Datta Feb 26 '20 at 07:57
  • @datta: you can use any object anywhere if you have a reference to it. Being a "spring bean" means just that spring created the object. Whoever creates an object (gets to call `new`) can initialize it (i.e. inject things). Spring does not (and can not) magically intercept all the Java `new`. – fdreger Feb 26 '20 at 08:02

2 Answers


Based on clarifications in the question and comments, and assuming it's impossible to make the KafkaConsumer Spring-managed (which I believe is the best solution):

@Autowired won't work in the KafkaConsumer, so there is no need for that annotation.

I assume that you're getting null for the application context in this line:

 ApplicationContext applicationContext = ApplicationContextProvider.getApplicationContext();

This means that ApplicationContextProvider#setApplicationContext has not been called by the time you create the Kafka consumer. Spring, in addition to injection, also manages the lifecycle of the objects it owns. Since your consumer is not managed by Spring, you are "on your own": you have to make sure the application context gets created first, and only after that create other objects (such as the Kafka consumer).

When the application context starts, it registers beans one by one; at some point it reaches the ApplicationContextProvider and calls its setter.
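The ordering problem can be shown without Spring at all. The sketch below uses hypothetical stand-in names (ContextHolder, Consumer) for the static-holder pattern from the question: any consumer constructed before the holder's setter runs will read null, exactly as described above.

```java
// Minimal plain-Java sketch (no Spring) of the static-holder ordering problem.
class ContextHolder {
    static Object context; // stands in for the ApplicationContext

    static void set(Object ctx) { context = ctx; }

    static Object get() { return context; }
}

class Consumer {
    final Object ctx;

    Consumer() {
        // Reads the holder at construction time: null if set() hasn't run yet.
        ctx = ContextHolder.get();
    }
}

public class OrderingDemo {
    public static void main(String[] args) {
        Consumer early = new Consumer();        // created before the holder is populated
        ContextHolder.set(new Object());        // Spring's startup would do this
        Consumer late = new Consumer();         // created after: holder is populated
        System.out.println(early.ctx == null);  // true
        System.out.println(late.ctx != null);   // true
    }
}
```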

Mark Bramnik
  • Thanks @Mark, I debugged my application and found that ApplicationContextProvider#setApplicationContext is being called before my Kafka consumer constructor gets called. – Datta Feb 26 '20 at 08:56
  • @Autowired removed. Still the same issue. Getting applicationContext as null. Thanks – Datta Feb 26 '20 at 09:00
  • @Datta : `@Autowired` wasn't a cause - just an unnecessary thing that should be removed. Sure, it will be null, and the reason is exactly as I've said in the answer and you've confirmed that in your first comment... – Mark Bramnik Feb 26 '20 at 09:09
  • Mark Bramnik Thanks . But as per your suggestion "make sure that Application Context gets created first and only after that create other objects" I have debugged my application and found that before creating Kafka object, applicationContext is being set i.e. ApplicationContextProvider#setApplicationContext is being called before my kafka consumer constructor gets called – Datta Feb 26 '20 at 09:25
  • In this case where exactly do you have null? ` ApplicationContext applicationContext = ApplicationContextProvider.getApplicationContext();` from your comment - it can't be in this line... – Mark Bramnik Feb 26 '20 at 09:38
  • yes I am getting null exactly at that line. Wondering what went wrong. I have done all things properly by referring ==> ( https://dzone.com/articles/autowiring-spring-beans-into-classes-not-managed-by-spring) – Datta Feb 26 '20 at 09:54
  • Any chance it's related to different classloaders somehow? I mean, ApplicationContextProvider.setApplicationContext is called on a class loaded by one classloader, and in the Kafka consumer you use another... I don't know what you run, so I can only speculate, but from a "pure Java" standpoint it should not be null... Another possible option is that someone later sets the application context provider to null before you call it from the KafkaConsumer – Mark Bramnik Feb 26 '20 at 10:24
  • I noticed my kafka consumer class is loading twice and that too by different class loader. ConfigAssignServer : Parent Class Loader : sun.misc.Launcher$AppClassLoader KafkaConfiguration : Current Class Loader : org.springframework.boot.devtools.restart.classloader.RestartClassLoader KafkaConfiguration : Parent Class Loader : sun.misc.Launcher$AppClassLoader ConfigAssignmentAssetTagChangeKafkaTopicProcessor : Current Class Loader : sun.misc.Launcher$AppClassLoader ConfigAssignmentAssetTagChangeKafkaTopicProcessor : Parent Class Loader : sun.misc.Launcher$ExtClassLoader – Datta Feb 26 '20 at 15:55

My main problem was that my Spring context was being loaded twice. When I printed every class's classloader, I found that my application was running twice (i.e., when debugging in IntelliJ, after pressing F9 I landed on the same line again):

ConfigurableApplicationContext applicationContext = SpringApplication.run(ConfigAssignServer.class, args);

My problem was in pom.xml. I removed the dependencies below from my pom and it worked (spring-boot-devtools restarts the application using its RestartClassLoader, which is what loaded my classes a second time).

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-devtools</artifactId>
        <scope>runtime</scope>
    </dependency>

Please check your dependencies. Hope it will help someone. Enjoy coding :)
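To diagnose this kind of issue, it helps to print which ClassLoader loaded each class, as was done in the comments above. A small sketch (the class name ClassLoaderProbe is just an illustrative choice) that makes devtools' RestartClassLoader easy to spot:

```java
// Hedged sketch: report which ClassLoader loaded a given class. With
// spring-boot-devtools on the classpath, application classes show up under
// org.springframework.boot.devtools.restart.classloader.RestartClassLoader,
// while JDK classes report the bootstrap loader.
public class ClassLoaderProbe {

    /** Returns the loader's class name, or "bootstrap" for the bootstrap loader. */
    public static String loaderName(Class<?> c) {
        ClassLoader cl = c.getClassLoader();
        return cl == null ? "bootstrap" : cl.getClass().getName();
    }

    public static void main(String[] args) {
        System.out.println(ClassLoaderProbe.class.getName()
                + " loaded by " + loaderName(ClassLoaderProbe.class));
        System.out.println(String.class.getName()
                + " loaded by " + loaderName(String.class)); // bootstrap
    }
}
```

If a static field is written by a class loaded by one loader and read by the same class loaded by another, the two copies of the field are independent, which is why the holder can appear null even though the setter ran.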

Datta