
I am writing some code which is supposed to run (as a jar) on both the Flink and Spark platforms. However, these two platforms use different logging APIs (Flink uses Log4j as the logging framework, but SLF4J as the API). In this case, what is the best practice for logging in the common code?

I tried using the Log4j2 API in this common code, but it cannot log anything on Flink.

My idea now is to obtain the logging context through the Log4j API from the SLF4J context (which is already initialized by Flink). Is that the right approach?

Thanks

Vulcann

1 Answer


A safe way to go is definitely to use SLF4J in the shared common library. Since SLF4J is a logging facade, you don't have to force your users to use the same logging framework you're using in your library. See the user manual on this point:

Authors of widely-distributed components and libraries may code against the SLF4J interface in order to avoid imposing a logging framework on their end-user. Thus, the end-user may choose the desired logging framework at deployment time by inserting the corresponding slf4j binding on the classpath, which may be changed later by replacing an existing binding with another on the class path and restarting the application. This approach has proven to be simple and very robust.
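As a rough sketch, a class in your shared jar could look like the following (the class and method names here are just illustrative):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Hypothetical class living in the common jar: it depends only on the
// slf4j-api artifact, never on a concrete logging backend.
public class CommonJob {

    private static final Logger LOG = LoggerFactory.getLogger(CommonJob.class);

    public void run(String input) {
        // Parameterized messages defer string construction until the level is enabled.
        LOG.info("Running common job with input: {}", input);

        if (input == null || input.isEmpty()) {
            LOG.warn("Received empty input, skipping");
            return;
        }
        LOG.debug("Finished processing {} characters", input.length());
    }
}
```

The key point is that the shared jar should declare only slf4j-api as a dependency and should not bundle any SLF4J binding or logging backend itself; when the jar runs on Flink or Spark, the binding already present on that platform's classpath routes the log calls to whatever framework the platform is configured with.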

Dovmo