
I have a Python Spark Streaming application submitted to a standalone cluster. I'd like to log specific information to a custom file, and so far I have tried every solution I could find:

  • using a log4j instance
  • the Python logging module
  • redirecting output to a file
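For context, the first attempt (log4j) is usually driven by a `log4j.properties` fragment. A sketch of what that might look like for routing a named logger to its own file, assuming the log4j 1.x that ships with Spark and with the logger name, appender name, and file path all illustrative:

```
# Route a custom logger to a dedicated file (names and path are illustrative)
log4j.logger.myapp=INFO, myfile
log4j.additivity.myapp=false
log4j.appender.myfile=org.apache.log4j.FileAppender
log4j.appender.myfile.File=/tmp/myapp.log
log4j.appender.myfile.layout=org.apache.log4j.PatternLayout
log4j.appender.myfile.layout.ConversionPattern=%d %p %m%n
```

On a standalone cluster each JVM (driver and every worker) reads its own copy of this file, so the custom log lands on whichever machine the logging code runs on.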

The documentation on custom logging is pretty sparse, especially for the Python environment. But the question is simple enough: how can I log custom data to a file without parsing the whole stdout or checking the interactive log? It should be an easy task, but I've been stuck on it for a few days.
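As a minimal sketch of the second attempt, driver-side logging with the standard Python `logging` module can write to a dedicated file; the logger name and path below are illustrative, and the caveat is that messages emitted inside executor tasks go to the workers' own logs, not into this file:

```python
import logging
import os
import tempfile

# Hypothetical log file path; in a real job you would pick a fixed location.
log_path = os.path.join(tempfile.gettempdir(), "my_streaming_app.log")

# Dedicated logger so Spark's own output stays out of the custom file.
logger = logging.getLogger("my_streaming_app")  # illustrative name
logger.setLevel(logging.INFO)

handler = logging.FileHandler(log_path, mode="w")
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)
logger.addHandler(handler)

# Any driver-side code (e.g. inside foreachRDD on the driver) can now log here.
logger.info("batch processed")
handler.flush()

with open(log_path) as f:
    content = f.read()
```

This only captures what runs on the driver; capturing executor-side messages needs a per-executor setup, as discussed in the duplicate linked in the comments.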

Thanks,

FB

    Possible duplicate of [PySpark logging from the executor](http://stackoverflow.com/questions/40806225/pyspark-logging-from-the-executor) – zero323 Apr 02 '17 at 06:57
  • Mine is a standalone cluster. You pointed out a question where YARN is involved. – FrankBr Apr 03 '17 at 11:08

0 Answers