
I'm using Spark 2.4 and I'm trying to access the log4j logger from within a Pandas UDF. This is how it can be done on the driver:

    from pyspark import SparkConf, SparkContext

    conf = SparkConf()
    sc = SparkContext(conf=conf)
    log4jLogger = sc._jvm.org.apache.log4j  # JVM-side log4j via the Py4J gateway
    log = log4jLogger.LogManager.getLogger(__name__)
    log.warn("Hello World!")

Can someone please post an example?

Modi
  • Possible duplicate of [Calling Java/Scala function from a task](https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task) – 10465355 Jan 21 '19 at 14:17
  • And for actual logging from executor - [PySpark logging from the executor](https://stackoverflow.com/questions/40806225/pyspark-logging-from-the-executor/40839220#40839220) – 10465355 Jan 21 '19 at 14:19

0 Answers