
I am trying to execute a Python script from a Spark Scala job in cluster mode, as shown below.

import scala.sys.process._

Process("sudo -n python helloWorld.py").!!

I get a "sudo: a password is required" message.

I tried setting NOPASSWD for the user, as explained in the link, by adding the following line at the end of the file using `sudo visudo`:

<username> ALL=(ALL) NOPASSWD: ALL

However, it did not work; I still get the same error. What could be wrong?

vijay
  • are you sure that the job is running as the correct user? Try running the command `whoami` to get the actual user – lev Oct 22 '18 at 11:01
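
Following that suggestion, a minimal check could look like the snippet below (a sketch, not from the original question; it assumes `whoami` and `sudo` are available on the nodes where the code runs):

import scala.sys.process._

// Print the effective user on whichever node runs this code.
val user = Process("whoami").!!.trim
println(s"running as: $user")

// Exit code 0 means sudo works without a password for that user on that node;
// non-zero means the NOPASSWD entry is not taking effect there.
val status = Process("sudo -n true").!
println(s"sudo -n exit code: $status")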

2 Answers


I had this problem using dmidecode in my Python program, which uses `sudo` to run some commands. When I ran it in Docker, there was no error.

I think this is because Docker runs in root mode, so every package and requirement for my Python program was set up as root. On Ubuntu, by contrast, I did all the setup as a sudo user; running a command with `sudo` executes it as root, and my root user had none of that setup.

On Ubuntu, my submit command was:

sudo ~/spark/bin/spark-submit --name tets --master spark://10.28.10.9:5050 --executor-cores 4 
                              --executor-memory 6G --files use.npy --py-files a.zip main.py

but in Docker it was:

~/spark/bin/spark-submit --name tets --master spark://10.28.10.9:5050 --executor-cores 4 
                         --executor-memory 6G --files use.npy --py-files a.zip main.py
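
One way to see the mismatch this answer describes is to compare what the regular user and root each resolve for `python`; the snippet below is a hypothetical check (it assumes `python` is installed and the NOPASSWD rule works, so `sudo -n` does not fail):

import scala.sys.process._

// The interpreter (and its packages) seen by the regular user...
println("user python: " + Process(Seq("which", "python")).!!.trim)

// ...can differ from the one root sees when the command runs under sudo.
println("root python: " + Process(Seq("sudo", "-n", "which", "python")).!!.trim)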

I hope this helps you.

r.a.shehni
  • are you sure that the output of the `sudo whoami` command is 'root', and that you set `%sudo ALL=(ALL:ALL) NOPASSWD: ALL` in visudo? The latter sets NOPASSWD for all sudo users. – r.a.shehni Apr 25 '19 at 07:07

Is that process started from the Spark driver? You should run the Spark application as a sudo user.
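
Worth noting: in cluster mode, a `Process(...)` call made inside a transformation runs on the executors rather than the driver, so the sudoers change has to exist on every worker node. A minimal sketch of checking both sides (hypothetical app name; assumes a standard Spark setup, with executor output going to the executor logs):

import scala.sys.process._
import org.apache.spark.sql.SparkSession

object SudoWhere {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("sudo-where").getOrCreate()

    // Runs once, on the driver node.
    println("driver user: " + Process("whoami").!!.trim)

    // Runs on executor nodes; each worker needs its own NOPASSWD entry
    // for a sudo call to succeed there.
    spark.sparkContext.parallelize(1 to 2).foreach { _ =>
      println("executor user: " + Process("whoami").!!.trim)
    }

    spark.stop()
  }
}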

Juhong Jung