
I am new to the Hadoop distributed file system. I have completed a single-node Hadoop installation on my machine, but when I try to upload data to HDFS it gives a Permission Denied error.

Terminal output for the command:

hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)

hduser@ubuntu:/usr/local/hadoop$ 

After using sudo and adding hduser to the sudoers:

hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x

hduser@ubuntu:/usr/local/hadoop$ 
    do you have access rights to the directory - are you using sudo? – ali haider Jul 21 '12 at 16:00
  • Yes, after using sudo: hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x – Vignesh Prajapati Jul 21 '12 at 17:17
  • In my case, it was because I was trying to download files in a location in my filesystem where I did not have permissions. – optimist Jun 18 '15 at 14:08

6 Answers


I solved this problem temporarily by disabling the DFS permission check, by adding the property below to conf/hdfs-site.xml:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
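A config change like this is only picked up after HDFS is restarted; a sketch, assuming the Hadoop 1.x single-node layout from the question (control scripts under /usr/local/hadoop/bin — in later versions they live under sbin/ instead):

```shell
# Assumption: Hadoop 1.x layout as in the question; paths vary by version/install.
cd /usr/local/hadoop
bin/stop-dfs.sh     # stop the NameNode and DataNode
bin/start-dfs.sh    # start them again so the new dfs.permissions value applies
```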

I had a similar situation; here is my approach, which is somewhat different:

 HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /

What you actually do is read the local file according to your local permissions, but when placing the file on HDFS you are authenticated as the user hdfs. You can do this with another user ID as well (beware of real authentication scheme configuration, but this is usually not the case).

Advantages:

  1. Permissions are kept on HDFS.
  2. You don't need sudo.
  3. You don't actually need a local user 'hdfs' at all.
  4. You don't need to copy anything or change permissions because of previous points.
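The environment variable is set only for that single command; a minimal shell sketch of the mechanism (no Hadoop needed to see it):

```shell
# Sketch of the mechanism, not Hadoop-specific: the one-shot form VAR=value cmd
# sets VAR only for that single command. This is what makes
#   HADOOP_USER_NAME=hdfs hdfs dfs -put ...
# run as the HDFS user 'hdfs' when simple (non-Kerberos) authentication is in use.
out=$(HADOOP_USER_NAME=hdfs sh -c 'echo "$HADOOP_USER_NAME"')
echo "$out"                          # hdfs
echo "${HADOOP_USER_NAME:-unset}"    # unset: the calling shell is unaffected
```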
  • You can replace the first hdfs with your user, for example: HADOOP_USER_NAME=your_user hdfs dfs -put /source /destination – mourad m Apr 30 '20 at 09:16
  • This is not a solution, it is a workaround. You basically say "My user does not have rights? Let's use the superuser"... – Itération 122442 Dec 17 '21 at 08:33
  • For 2014 this was a perfect solution and even now it has its target audience. It clearly unlinks your HDFS users from system ones. Yes, the fact that you have an OS user does not give you anything on HDFS. As I said, 'other authentication methods' may be used if you want to authenticate HDFS users; just that at the time nobody was able to do anything other than perimeter security. Think of it like a service account for backend processes (with real users authenticated at the API level or so); in a small/medium company it works 100%. – Roman Nikitchenko Dec 19 '21 at 10:07

You are experiencing two separate problems here:


hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)

Here, the user hduser does not have access to the local directory /usr/local/input-data. That is, your local permissions are too restrictive. You should change them.


hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x

Here, the user root (since you are using sudo) does not have access to the HDFS directory /input. As you can see from hduser:supergroup:rwxr-xr-x, only hduser has write access. Hadoop doesn't really respect root as a special user.
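The HDFS permission model mirrors POSIX, so the inode string owner:group:mode reads the same way as local ls -l output. A quick local illustration (throwaway directory, for demonstration only):

```shell
# Local analogue of inode="":hduser:supergroup:rwxr-xr-x —
# rwx for the owner, r-x for group and others, so only the owning user may write.
d=$(mktemp -d)
chmod 755 "$d"               # same mode bits as the HDFS inode above
ls -ld "$d" | cut -c1-10     # drwxr-xr-x
```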


To fix this, I suggest you change the permissions on the local data:

sudo chmod -R og+rx /usr/local/input-data/

Then, try the put command again as hduser.
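What the suggested chmod actually does can be sketched with a throwaway directory in place of the real /usr/local/input-data (the path names here are illustrative only):

```shell
# Reproduce the local-permissions half of the problem without Hadoop.
dir=$(mktemp -d)
mkdir "$dir/input-data"
echo hello > "$dir/input-data/file1.txt"
chmod 700 "$dir/input-data"                  # owner-only: other users are denied
chmod 600 "$dir/input-data/file1.txt"

# og+rx grants group and others read plus execute (directory traversal):
chmod -R og+rx "$dir/input-data"
stat -c '%a %n' "$dir/input-data"            # 755 ...
stat -c '%a %n' "$dir/input-data/file1.txt"  # 655 ...
```

After this, any user (including hduser) can read the files, which is what the put command needs on the local side.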


I solved this problem with the following steps:

su hdfs
hadoop fs -put /usr/local/input-data/ /input
exit

Start a shell as hduser (from root) and run your command:

sudo -u hduser bash
hadoop fs -put /usr/local/input-data/ /input

[update] Also note that the hdfs user is the superuser and has all r/w privileges.


For Hadoop 3.x, if you try to create a file on HDFS while unauthenticated (e.g. as user=dr.who), you will get this error.

It is not recommended for systems that need to be secure; however, if you'd like to disable file permissions entirely in Hadoop 3, the hdfs-site.xml setting has changed to:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>

https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
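To check which value is actually in effect after restarting, the hdfs getconf helper prints a single configuration key (a sketch, assuming a Hadoop 3 client on the PATH):

```shell
# Assumption: an installed Hadoop 3 client pointed at this cluster's config.
hdfs getconf -confKey dfs.permissions.enabled    # expect: false after the change
```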
