
I am using Apache Spark in standalone mode under Ubuntu.

I am trying to save a file to a location on an NFS host machine.

The Spark worker is started under a user that has permissions to the folder I am trying to save to.

I changed the folder's permissions to 777 and still get errors when creating a new folder under the NFS mount.

A folder is created, but then nothing new can be created inside it: the new folder again has the old permissions, which do not let the Spark executors write anything to it.
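A quick way to see whether this is a uid-mapping problem rather than a mode-bits problem is to compare the numeric uid the worker runs as with the numeric owner the NFS server reports for the directory. This is a minimal sketch; the mount path `/mnt/nfs/output` is hypothetical:

```shell
# uid of the current user on the client (the user the Spark worker runs as)
client_uid=$(id -u)
echo "client uid: ${client_uid}"

# On the NFS mount, -n prints numeric uid/gid, so any server-side
# remapping of the owner becomes visible (path is hypothetical):
#   ls -lnd /mnt/nfs/output
```

If the numeric owner shown by `ls -ln` differs from the client uid, the server is mapping the connection to a different uid and chmod 777 on the client will not stick for newly created subfolders.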

How can I fix this?

thebeancounter

1 Answer


There was no problem with the user names. The issue was the NFS server: when connecting to it, the client kept the same uid as on the client machine, which did not match the owner uid on the NFS server. The fix is to go to the /etc/exports file on the NFS host and add this line:

/var/general/nfs *(rw,sync,no_root_squash,all_squash,anonuid=1000,anongid=1000,no_subtree_check)

which maps all incoming NFS connections to the correct uid/gid that control the exported location.
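The fix above can be sketched as the following server-side steps (the export path and the uid/gid values 1000 are taken from the answer; adjust them to the user that should own files on your export):

```shell
# On the NFS server, add the export with all_squash so every incoming
# request is mapped to the anonymous uid/gid given by anonuid/anongid:
#
#   /var/general/nfs *(rw,sync,no_root_squash,all_squash,anonuid=1000,anongid=1000,no_subtree_check)
#
# (goes into /etc/exports)

# Then re-export all directories without restarting the NFS service:
#   sudo exportfs -ra

# Clients may need to remount for the new mapping to take effect:
#   sudo umount /mnt/nfs && sudo mount -a
```

Note that with `all_squash` present, every connection (including root) is squashed to `anonuid`/`anongid`, so the `no_root_squash` option in the same line has no practical effect.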
