
I'm having trouble understanding the logical architecture in which I develop with the Scala/Spark shell and a Hadoop environment.

To better describe the logical architecture, I drew a small schema:

[schema of the logical architecture]

As the figure shows, I have Eclipse installed on my personal PC, and I would like to run Scala scripts from my PC on the remote Hadoop cluster. I currently have a VPN connection, and I can run my Scala programs from the shell with PuTTY. In practice, every time I have to launch a Scala script, I transfer the .scala file from my PC to the remote machine with WinSCP, then launch the program directly from the remote machine. Having to transfer the file manually every time makes my workflow wasteful.
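As a first improvement, the WinSCP + PuTTY round trip can be collapsed into a single command using `scp` and `ssh` over the same VPN connection (available on Windows via Git Bash, WSL, or the built-in OpenSSH client). A sketch, where the host name and paths are hypothetical:

```shell
# Copy the script to the remote machine and run it there in one shot,
# without opening an interactive PuTTY session.
scp MyScript.scala user@cluster-edge-node:/home/user/

# -i tells spark-shell to execute the script file after starting up.
ssh user@cluster-edge-node \
  'spark-shell -i /home/user/MyScript.scala'
```

This doesn't change the architecture (the program still runs on the remote machine), but it removes the manual steps and can be scripted or wired into an Eclipse external-tool launcher.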

Now the question: is there a way to launch the script from my personal PC on the remote cluster, without going through PuTTY?
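If the cluster runs YARN, one option worth considering is installing a Spark distribution locally and calling `spark-submit` directly from the PC, pointing it at the cluster's Hadoop client configuration. This is a sketch under several assumptions: the YARN ports are reachable through the VPN, the cluster's `core-site.xml`/`yarn-site.xml` have been copied locally, and the class name and paths below are hypothetical:

```shell
# Point the local Spark client at the cluster's configuration files
# (copied from the remote machine's Hadoop conf directory).
export HADOOP_CONF_DIR=/path/to/cluster-conf    # hypothetical path

# Package the Scala code as a jar first (e.g. with `sbt package`),
# then submit it to the remote YARN cluster. With --deploy-mode
# cluster, the driver runs on the cluster, so the local PC only
# needs connectivity long enough to submit the job.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyApp \
  target/scala-2.11/myapp.jar
```

Note that this submits a compiled application rather than an interactive script, so the .scala file would need a `main` method and a build step instead of being fed to `spark-shell`.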

Alessandro
  • you might need to look in to this http://stackoverflow.com/questions/37648426/how-to-submit-a-spark-job-on-a-remote-master-node-in-yarn-client-mode – Shankar Aug 01 '16 at 17:10
