#!/bin/sh
spark-shell
import org.apache.spark.sql.SparkSession
val url="jdbc:mysql://localhost:3306/slow_and_tedious"
val prop = new java.util.Properties
prop.setProperty("user","scalauser")
prop.setProperty("password","scalauser123")
val people = spark.read.jdbc(url,"sat",prop)

The above commands make a connection between MySQL and Spark using JDBC. Instead of typing these commands every time, I thought of making a script, but when I run the above script it throws this error.

[error screenshot]

Adarsh

3 Answers


Create a Scala file named test.scala with your code, like below:

import org.apache.spark.sql.SparkSession
val url="jdbc:mysql://localhost:3306/slow_and_tedious"
val prop = new java.util.Properties
prop.setProperty("user","scalauser")
prop.setProperty("password","scalauser123")
val people = spark.read.jdbc(url,"sat",prop)

Log in to spark-shell using the following command:

spark-shell --jars mysql-connector.jar

You can use the following command to execute the code you created above:

scala> :load /path/test.scala

If you use a shell script, it launches a new SparkContext every time, which takes longer to execute.

If you use the above command, it will just execute the code in test.scala.

Since the SparkContext is already loaded when you log into spark-shell, you save that startup time when you execute the script.
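The two steps above can also be combined into one command: spark-shell accepts the Scala REPL's -i flag, which preloads a script at startup. A minimal sketch, assuming placeholder jar and script paths that you must adjust:

```shell
#!/bin/sh
# Sketch: launch spark-shell with the MySQL connector on the classpath
# and preload test.scala, so the session starts with the JDBC code run.
# Both paths below are placeholders for your environment.
spark-shell --jars /path/to/mysql-connector.jar -i /path/test.scala
```

This still opens an interactive shell afterwards, so you can keep working with the `people` DataFrame.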

Aravind Kumar Anugula
  • I really couldn't understand your answer. It seems you did not get my question. I want to write a Unix script which, when run, executes the above commands: it opens spark-shell and makes a JDBC connection to MySQL automatically, without me executing each line again and again. – Adarsh Apr 17 '17 at 09:09
  • I got your question, and I'm trying to say you don't need a shell script: you can save the code as a Scala file and execute all the steps in one go. I added more explanation which should help you understand it better. Please let me know if it's still unclear, and I will try to explain it another way. – Aravind Kumar Anugula Apr 17 '17 at 10:04

Try this,

Write your code in a file, filex.txt.

In your Unix shell script, include the following:

cat filex.txt | spark-shell

Seemingly, you can't push the script into the background (using &).
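The pipe above can be wrapped in a small script; the script name run_spark.sh and the connector jar path are assumptions, not part of the original answer:

```shell
#!/bin/sh
# run_spark.sh -- sketch of a wrapper around the pipe approach.
# Pipes the Scala commands in filex.txt into spark-shell so the whole
# session runs non-interactively; the jar path is a placeholder.
cat filex.txt | spark-shell --jars /path/to/mysql-connector.jar
```

spark-shell exits when it reaches end of input, so the script finishes on its own once filex.txt has been executed.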

aminography

You can paste your script into a file, then execute:

spark-shell < {your file name}
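The same redirection works without a separate file, using a shell heredoc. A sketch with the connection details copied from the question (the quoted 'EOF' stops the shell from expanding anything inside the body):

```shell
#!/bin/sh
# Sketch: feed the Scala commands straight to spark-shell via a heredoc,
# so no separate script file is needed. Jar path is a placeholder.
spark-shell --jars /path/to/mysql-connector.jar <<'EOF'
val url = "jdbc:mysql://localhost:3306/slow_and_tedious"
val prop = new java.util.Properties
prop.setProperty("user", "scalauser")
prop.setProperty("password", "scalauser123")
val people = spark.read.jdbc(url, "sat", prop)
EOF
```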
StSahana