
I'm trying to convert a Spark application to write its output in ORC format instead of Parquet. After modifying my code, I get the following error when running the application on Windows:

> java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS
> should be writable. Current permissions are: rw-rw-rw-    at
> org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
>   at
> org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>   at
> org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>   ... 73 more

First, I found it strange that it can't write with rw-rw-rw- permissions, but whatever; I tried to change this directory's permissions with Hadoop's winutils as described here:

winutils.exe chmod -R 777 \tmp

But I still get the same error...

Moreover, I also tried deleting the directory; the app automatically recreated it with 733 permissions, but it still says it can't write...
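For reference, here is a small sketch (plain Python, not Spark or Hive code) mapping the permission values mentioned above to their symbolic form. The "rw-rw-rw-" in the error message is octal 666, which lacks the execute/search bit on the directory, while 733 and 777 both include it:

```python
import stat

def symbolic(mode):
    """Render an octal mode as an rwx string, e.g. 0o666 -> 'rw-rw-rw-'."""
    bits = [
        (stat.S_IRUSR, "r"), (stat.S_IWUSR, "w"), (stat.S_IXUSR, "x"),
        (stat.S_IRGRP, "r"), (stat.S_IWGRP, "w"), (stat.S_IXGRP, "x"),
        (stat.S_IROTH, "r"), (stat.S_IWOTH, "w"), (stat.S_IXOTH, "x"),
    ]
    return "".join(ch if mode & bit else "-" for bit, ch in bits)

print(symbolic(0o666))  # rw-rw-rw-  (the mode reported in the error)
print(symbolic(0o733))  # rwx-wx-wx  (the mode the app recreated the dir with)
print(symbolic(0o777))  # rwxrwxrwx  (the mode winutils chmod 777 should grant)
```

This only illustrates the notation; it does not explain why the Hive scratch-dir check fails on Windows.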

EDIT: This is not a duplicate of that question because my problem occurs on Windows.
