
I am trying to execute a tool (which requires spark-submit to be available in the PATH) inside a Docker container running Apache Spark. To make spark-submit reachable, I added this instruction to the Dockerfile: RUN echo "export PATH=$PATH:/spark/bin" >> ~/.bashrc.
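For context, the relevant Dockerfile fragment looks roughly like this (the base image and the exact Spark layout under /spark are assumptions for illustration; the RUN line is the one quoted above):

    # hypothetical base image with Java available
    FROM openjdk:8
    # Spark assumed to be unpacked under /spark
    COPY spark/ /spark/
    # append Spark's bin directory to PATH for bash sessions
    RUN echo "export PATH=$PATH:/spark/bin" >> ~/.bashrc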

If I connect to this Docker container interactively (with the command sudo docker exec -it $spark_masterID bash, as suggested here), the tool works without any problem and I can see the typical Spark output.

But in my case I am interested in calling this tool without connecting to the container: the container holds a script which calls the tool. To invoke it I use the command:

sudo docker exec -it $spark_masterID bash /path/script.sh

but this time the execution fails, reporting this error: Make sure spark-submit is available in your path
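A quick way to inspect the PATH that this non-interactive shell actually sees (a sketch, reusing the same container variable as above):

    sudo docker exec $spark_masterID bash -c 'echo $PATH'

If /spark/bin does not appear in the output, the export appended to ~/.bashrc is evidently not being applied to this shell.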

What am I doing wrong?

Vzzarr
  • Check first answer [here](https://stackoverflow.com/questions/27093612/in-a-dockerfile-how-to-update-path-environment-variable). – LMC Jan 30 '18 at 19:44 (a sketch of that approach follows this comment thread)
  • Stack Overflow is a site for programming and development questions. This question appears to be off-topic because it is not about programming or development. See [What topics can I ask about here](http://stackoverflow.com/help/on-topic) in the Help Center. Perhaps [Super User](http://superuser.com/) or [Unix & Linux Stack Exchange](http://unix.stackexchange.com/) would be a better place to ask. – jww Jan 31 '18 at 00:24
  • Thanks @LuisMuñoz, it worked for me (after some editing)! I understand your point, @jww; should I remove this question? – Vzzarr Jan 31 '18 at 15:27
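For reference, the first answer at the question LMC linked suggests setting PATH with ENV in the Dockerfile rather than appending to ~/.bashrc, since ENV applies to every process in the container, interactive or not. A minimal sketch, with the /spark/bin location taken from the question:

    # make Spark's bin visible to all processes, not only interactive shells
    ENV PATH="${PATH}:/spark/bin"

With this in the image, the non-interactive call above (sudo docker exec -it $spark_masterID bash /path/script.sh) should be able to find spark-submit.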

0 Answers