Spark 1.6.2 (YARN master)
Package name: com.example.spark.Main
Basic Spark SQL code:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("SparkSQL w/ Hive")
val sc = new SparkContext(conf)
val hiveContext = new HiveContext(sc)
import hiveContext.implicits._
// val rdd = <some RDD making>
val df = rdd.toDF()
df.write.saveAsTable("example")
And the stack trace:
No X11 DISPLAY variable was set, but this program performed an operation which requires it.
at java.awt.GraphicsEnvironment.checkHeadless(GraphicsEnvironment.java:204)
at java.awt.Window.<init>(Window.java:536)
at java.awt.Frame.<init>(Frame.java:420)
at java.awt.Frame.<init>(Frame.java:385)
at com.trend.iwss.jscan.runtime.BaseDialog.getActiveFrame(BaseDialog.java:75)
at com.trend.iwss.jscan.runtime.AllowDialog.make(AllowDialog.java:32)
at com.trend.iwss.jscan.runtime.PolicyRuntime.showAllowDialog(PolicyRuntime.java:325)
at com.trend.iwss.jscan.runtime.PolicyRuntime.stopActionInner(PolicyRuntime.java:240)
at com.trend.iwss.jscan.runtime.PolicyRuntime.stopAction(PolicyRuntime.java:172)
at com.trend.iwss.jscan.runtime.PolicyRuntime.stopAction(PolicyRuntime.java:165)
at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime.checkURL(NetworkPolicyRuntime.java:284)
at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime._preFilter(NetworkPolicyRuntime.java:164)
at com.trend.iwss.jscan.runtime.PolicyRuntime.preFilter(PolicyRuntime.java:132)
at com.trend.iwss.jscan.runtime.NetworkPolicyRuntime.preFilter(NetworkPolicyRuntime.java:108)
at org.apache.commons.logging.LogFactory$5.run(LogFactory.java:1346)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.commons.logging.LogFactory.getProperties(LogFactory.java:1376)
at org.apache.commons.logging.LogFactory.getConfigurationFile(LogFactory.java:1412)
at org.apache.commons.logging.LogFactory.getFactory(LogFactory.java:455)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
at org.apache.hadoop.hive.shims.HadoopShimsSecure.<clinit>(HadoopShimsSecure.java:60)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.hive.shims.ShimLoader.createShim(ShimLoader.java:146)
at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:141)
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:100)
at org.apache.spark.sql.hive.client.ClientWrapper.overrideHadoopShims(ClientWrapper.scala:116)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:69)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:345)
at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:255)
at org.apache.spark.sql.hive.HiveContext.setConf(HiveContext.scala:459)
at org.apache.spark.sql.hive.HiveContext.defaultOverrides(HiveContext.scala:233)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:236)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at com.example.spark.Main1$.main(Main.scala:52)
at com.example.spark.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
ivysettings.xml file not found in HIVE_HOME or HIVE_CONF_DIR,/etc/hive/2.5.3.0-37/0/ivysettings.xml will be used
This same code was working a week ago on a fresh HDP cluster, and it works fine in the sandbox. The only change I remember making since then was fiddling with the JAVA_HOME variable, but I am fairly sure I undid those changes.
I'm at a loss and not sure how to start tracking down the issue.
The cluster is headless, so of course it has no X11 display, but what part of new HiveContext even needs to pop up a JFrame?
Based on the logs, my guess is that it's a Java configuration issue I introduced: something within org.apache.hadoop.hive.shims.HadoopShimsSecure.<clinit>(HadoopShimsSecure.java:60) gets triggered and a Java security dialog tries to appear, but I don't know for sure.
I can't do X11 forwarding. I tried export SPARK_OPTS="-Djava.awt.headless=true" before spark-submit, and that didn't help.
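For what it's worth, spark-submit does not read SPARK_OPTS; in Spark 1.6 the usual way to pass a JVM flag is --driver-java-options for the driver and spark.executor.extraJavaOptions for the executors. A command sketch only (app.jar is a placeholder for the application jar):

```shell
# Sketch: pass -Djava.awt.headless=true to both driver and executor JVMs.
# "app.jar" is a placeholder; class and master are taken from the post.
spark-submit \
  --master yarn \
  --class com.example.spark.Main \
  --driver-java-options "-Djava.awt.headless=true" \
  --conf "spark.executor.extraJavaOptions=-Djava.awt.headless=true" \
  app.jar
```

Since the dialog in this trace is created on the driver side (it goes through SparkSubmit's main thread), calling System.setProperty("java.awt.headless", "true") as the first line of main would be another way to set it before any AWT class loads.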
I tried the suggestions from these questions, but again, I can't forward X11 and the machines have no display:
- Getting a HeadlessException: No X11 DISPLAY variable was set
- "No X11 DISPLAY variable" - what does it mean?
The error is reproducible on two of the Spark clients, but I only tried changing JAVA_HOME on one of them.
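Since JAVA_HOME was touched on one of the machines, a quick per-client sanity check of the environment the launcher scripts would pick up might help rule that out (nothing here is cluster-specific):

```shell
# Print the display/Java-related environment a Spark client sees.
echo "DISPLAY=${DISPLAY:-<unset>}"
echo "JAVA_HOME=${JAVA_HOME:-<unset>}"
command -v java || echo "java not on PATH"
java -version 2>&1 | head -n 1
```

Running this on both the broken clients and a working machine (or the sandbox) and diffing the output would show whether the JAVA_HOME change actually stuck anywhere.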
I ran an Ambari Hive service check, which didn't fix it. I can connect to the Hive database fine via the Hive/Beeline CLI.