I've recently started using SparkR and would like to add logging while running SQL queries, performing joins, writing data out to CSV, etc. I'm able to run SQL queries and fetch data from Oracle and Hive tables into a SparkR DataFrame, but calling `loginfo("%s", emp_df)` fails with the S4 error below:

    Error in as.character.default() : no method for coercing this S4 class to a vector
Below is the code used to load the logging library and create a log file with a file handler:

    library(logging)

    # Create a log file using a file handler
    logReset()
    basicConfig()
    addHandler(writeToFile, file = "Rlog.log")
    with(getLogger(), names(as.vector(handlers)))
I have gone through the logging documentation and found: "addHandler and removeHandler are also offered as methods of the Logger S4 class. In that case there is no logger argument." But that didn't help much. Could anyone please tell me how to overcome this? Thanks in advance for any help.
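For context, here is a minimal, SparkR-free reproduction of the coercion error, plus the workaround I've been considering: converting the S4 object to text myself (via `capture.output`) before passing it to `loginfo()`. The `Emp` class and `emp_df` object below are stand-ins, since SparkR DataFrames are S4 objects just like this one:

```r
library(logging)
library(methods)

logReset()
basicConfig()
addHandler(writeToFile, file = "Rlog.log")

# Stand-in S4 object (SparkR DataFrames are also S4, hence the same error)
setClass("Emp", representation(name = "character", rows = "numeric"))
emp_df <- new("Emp", name = "employees", rows = 42)

# loginfo("%s", emp_df)  # fails: no as.character() method for S4 classes

# Workaround: render the object to character first, then log the string
loginfo("emp_df: %s", paste(capture.output(str(emp_df)), collapse = " "))
```

With a real SparkR DataFrame the same idea would apply: log a character rendering of it (or scalar facts such as its row count) rather than the S4 object itself.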