6

I have changed the permissions using an hdfs command, but it still shows the same error:

The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------

The Java program that I am executing:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveCreateDb {
   // HiveServer2 JDBC driver; pairs with jdbc:hive2:// URLs. (The old
   // org.apache.hadoop.hive.jdbc.HiveDriver pairs with jdbc:hive:// instead.)
   private static String driverName = "org.apache.hive.jdbc.HiveDriver";

   public static void main(String[] args) throws Exception {
      // Register the driver; a ClassNotFoundException here means the
      // Hive JDBC jar is not on the classpath.
      Class.forName(driverName);

      // Empty host/port means embedded mode; use jdbc:hive2://host:10000/default
      // (hypothetical host) for a remote HiveServer2.
      Connection con = DriverManager.getConnection("jdbc:hive2://", "", "");

      Statement stmt = con.createStatement();

      // CREATE DATABASE is DDL and returns no result set, so use execute()
      // rather than executeQuery().
      stmt.execute("CREATE DATABASE userdb");
      System.out.println("Database userdb created successfully.");

      stmt.close();
      con.close();
   }
}

It gives a runtime error when connecting to Hive:

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
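The directory's current mode can be double-checked from the shell before retrying (assuming the hadoop client is on the PATH):

    hadoop fs -ls -d /tmp/hive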

Amar Banerjee

4 Answers

5

Try this:

hadoop fs -chmod -R 777 /tmp/hive/;

I had a similar issue while running a Hive query; using the -R flag resolved it.
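Note that -R matters when /tmp/hive already contains per-user or per-session subdirectories; without it only the named directory is touched (an illustration, not output from the question's cluster):

    hadoop fs -chmod 777 /tmp/hive       # top-level directory only
    hadoop fs -chmod -R 777 /tmp/hive    # the entire tree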

N00b Pr0grammer
Tamim Syed
  • No, it didn't work for me. The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------ – Amar Banerjee Nov 04 '16 at 10:03
  • On my `Mac`, the above command fixes the permission issue, but once I launch the `hive` shell and close it, the problem re-appears. I've found that launching the `hive` shell removes the *write permission* from `/tmp/hive` for **'o' (others)** and **'g' (groups)**. While the `beeline` shell doesn't suffer from this issue, the fact that the `hive` shell exhibits this behaviour makes me think there's something wrong with the configuration on my PC – y2k-shubham Jan 23 '18 at 09:48
  • 1
    @y2k Hive CLI is deprecated in favor of beeline – OneCricketeer Feb 04 '18 at 16:26
3

Just to add to the previous answers: if your username is something like 'cloudera' (you could be using Cloudera Manager or the Cloudera QuickStart VM as your platform), you can do the following:

sudo -u hdfs hadoop fs -chmod -R 777 /tmp/hive/;

Remember that in Hadoop, 'hdfs' is the superuser, not 'root' or 'cloudera'.
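The same sudo -u hdfs pattern works for any command that needs HDFS superuser rights; for example (a sketch reusing the 'cloudera' user from above), to hand the directory over to your login user instead of opening it to everyone:

    sudo -u hdfs hadoop fs -chown -R cloudera /tmp/hive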

sdinesh94
  • 1,138
  • 15
  • 32
1

Don't do chmod 777... The correct permission is 733:

Hive 0.14.0 and later: HDFS root scratch directory for Hive jobs, which gets created with write-all (733) permission. For each connecting user, an HDFS scratch directory ${hive.exec.scratchdir}/<username> is created with ${hive.scratch.dir.permission}.

Try this as the hdfs user:

    hdfs dfs -mkdir /tmp/hive
    hdfs dfs -chown hive /tmp/hive
    hdfs dfs -chmod 733 /tmp/hive
    hdfs dfs -mkdir /tmp/hive/$HADOOP_USER_NAME
    hdfs dfs -chown $HADOOP_USER_NAME /tmp/hive/$HADOOP_USER_NAME
    hdfs dfs -chmod 700 /tmp/hive/$HADOOP_USER_NAME

This works; alternatively, you can change the scratch dir path from within Hive:

set hive.exec.scratchdir=/somedir_with_permission/subdir...
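The same property can also be passed when launching the CLI, so it applies for the whole session without editing any config file (the path here is the same placeholder as above):

    hive --hiveconf hive.exec.scratchdir=/somedir_with_permission/subdir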

more info: https://cwiki.apache.org/confluence/display/Hive/AdminManual+Configuration

Marc
1

If you are executing a Spark job in local mode, the /tmp/hive directory in question lives on the local (Linux) filesystem, not on HDFS, and it is that local directory that lacks write permission.

So execute chmod -R 777 /tmp/hive on the local machine. That solved my issue.
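A plain filesystem listing confirms the result, since no Hadoop client is involved for a local-mode directory:

    ls -ld /tmp/hive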

Referred from: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------ (on Linux)

Harneet Singh