
I am trying to use Logback with the Spark Thrift Server so that application logs can be shipped to Elasticsearch. For that I am using log4j-over-slf4j together with logback-classic and logback-core.

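For context, this is roughly how the logging dependencies are wired. The sbt form and the version numbers below are illustrative assumptions rather than my exact build; the point is that logback-classic, logback-core and log4j-over-slf4j are on the classpath while the stock log4j 1.2 binding is excluded:

    // build.sbt (illustrative sketch; the build tool and versions are assumptions)
    libraryDependencies ++= Seq(
      "ch.qos.logback" % "logback-classic"  % "1.2.11",
      "ch.qos.logback" % "logback-core"     % "1.2.11",
      "org.slf4j"      % "log4j-over-slf4j" % "1.7.36"
    )

    // Keep the real log4j 1.2 and its SLF4J binding off the classpath so that
    // log4j-over-slf4j can stand in for them.
    excludeDependencies ++= Seq(
      ExclusionRule("log4j", "log4j"),
      ExclusionRule("org.slf4j", "slf4j-log4j12")
    )
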
The service fails to start with this exception:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hive.service.cli.operation.LogDivertAppender.setWriter(Ljava/io/Writer;)V
        at org.apache.hive.service.cli.operation.LogDivertAppender.<init>(LogDivertAppender.java:166)
        at org.apache.hive.service.cli.operation.OperationManager.initOperationLogCapture(OperationManager.java:84)
        at org.apache.hive.service.cli.operation.OperationManager.init(OperationManager.java:62)
        at org.apache.hive.service.CompositeService.init(CompositeService.java:59)
        at org.apache.hive.service.cli.session.SessionManager.init(SessionManager.java:83)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLSessionManager.init(SparkSQLSessionManager.scala:44)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.$anonfun$initCompositeService$1(SparkSQLCLIService.scala:123)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.$anonfun$initCompositeService$1$adapted(SparkSQLCLIService.scala:123)
        at scala.collection.Iterator.foreach(Iterator.scala:941)
        at scala.collection.Iterator.foreach$(Iterator.scala:941)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
        at scala.collection.IterableLike.foreach(IterableLike.scala:74)
        at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.initCompositeService(SparkSQLCLIService.scala:123)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.initCompositeService$(SparkSQLCLIService.scala:120)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIService.initCompositeService(SparkSQLCLIService.scala:41)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIService.init(SparkSQLCLIService.scala:93)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.$anonfun$initCompositeService$1(SparkSQLCLIService.scala:123)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.$anonfun$initCompositeService$1$adapted(SparkSQLCLIService.scala:123)
        at scala.collection.Iterator.foreach(Iterator.scala:941)
        at scala.collection.Iterator.foreach$(Iterator.scala:941)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
        at scala.collection.IterableLike.foreach(IterableLike.scala:74)
        at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.initCompositeService(SparkSQLCLIService.scala:123)
        at org.apache.spark.sql.hive.thriftserver.ReflectedCompositeService.initCompositeService$(SparkSQLCLIService.scala:120)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.initCompositeService(HiveThriftServer2.scala:124)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.init(HiveThriftServer2.scala:144)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.startWithContext(HiveThriftServer2.scala:63)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:104)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I searched and found https://issues.apache.org/jira/browse/SPARK-14703; the last comment on that issue reports the same problem, but no working solution is given there. Disabling the property hive.server2.logging.operation.enabled does not help me: I need the operation logs, and disabling it stops that logging (I have tested this).

It seems that Hive uses log4j's WriterAppender.setWriter() for operation logging, but the Log4j bridge (the log4j-over-slf4j JAR) does not implement this method.

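To confirm this, a small standalone check (a diagnostic sketch of mine, not part of Hive or Spark) can report which org.apache.log4j.WriterAppender is being loaded and whether it declares setWriter(Writer), the method LogDivertAppender calls from its constructor:

    import java.io.Writer

    // Diagnostic sketch: report where org.apache.log4j.WriterAppender is loaded from
    // and whether it declares setWriter(Writer). The real log4j 1.2 class does;
    // the stub shipped in log4j-over-slf4j does not, which is what
    // LogDivertAppender.<init> trips over with a NoSuchMethodError.
    object CheckWriterAppender {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("org.apache.log4j.WriterAppender")
        val location = Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation)
        println(s"WriterAppender loaded from: ${location.getOrElse("unknown")}")

        val hasSetWriter =
          try { cls.getMethod("setWriter", classOf[Writer]); true }
          catch { case _: NoSuchMethodException => false }

        println(
          if (hasSetWriter) "setWriter(Writer) is present -> real log4j 1.2 on the classpath"
          else "setWriter(Writer) is missing -> log4j-over-slf4j stub on the classpath"
        )
      }
    }

Running this with the same classpath as the Thrift Server should show the log4j-over-slf4j JAR as the source and report the method as missing.
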
Surya