
I run Hadoop 3.0.0-alpha1 on windows and added Hive 2.1.1 to it. When I try to open the hive beeline with the hive command I get an error:

Error applying authorization policy on hive configuration: 
Couldn't create directory ${system:java.io.tmpdir}\${hive.session.id}_resources

What's wrong?

I run mysql as metastore for Hive and added the required files in HDFS:

hadoop fs -mkdir /user/hive
hadoop fs -mkdir /user/hive/warehouse
hadoop fs -mkdir /tmp

After that I changed the permissions:

hadoop fs -chmod 777 /user/hive
hadoop fs -chmod 777 /user/hive/warehouse
hadoop fs -chmod 777 /tmp

The YARN and DFS daemons are running, as is mysql, and the mysql JDBC driver is known to both Hadoop and Hive.

Benvorth
  • Look at the Hive documentation about "Configuration properties" https://cwiki.apache.org/confluence/display/Hive/Configuration+Properties and specifically `hive.exec.scratchdir` and the 3+ "scratchdir" props. – Samson Scharfrichter Mar 09 '17 at 09:12
  • 1
    Also remember that `java.io.tmpdir` is a **local** directory -- plain Java code cannot use HDFS! – Samson Scharfrichter Mar 09 '17 at 09:13

1 Answer


Replace this particular configuration in your hive-site.xml:

<value>${system:java.io.tmpdir}/${hive.session.id}_resources</value>

with:

<property>
  <name>hive.downloaded.resources.dir</name>
  <!--
    <value>${system:java.io.tmpdir}/${hive.session.id}_resources</value>
  -->
  <value>/home/hduser/hive/tmp/${hive.session.id}_resources</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
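Since the question runs Hive on Windows, the same idea needs a local Windows path rather than `/home/hduser/...`. A hedged sketch below (`C:\hive\tmp` is an assumed directory you must create locally first); it also covers the other Hive 2.x properties that default to `${system:java.io.tmpdir}` and can trigger the same error:

```xml
<!-- Sketch for a Windows setup: C:\hive\tmp is an assumed local directory. -->
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>C:\hive\tmp</value>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>C:\hive\tmp\${hive.session.id}_resources</value>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>C:\hive\tmp</value>
</property>
```

Note these are local filesystem paths, not HDFS paths, as the comment above points out: `java.io.tmpdir` is resolved by plain Java code and cannot be an HDFS location.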
andani