
I have a file containing a few Hive queries, and my goal is to run that whole file using a HiveContext or SparkContext.

From the command line I can do that with hive -f 'filepath/filename', but I need to run it from code (HiveContext or SparkContext). Can anybody help with this?

For a single query I can use:

hiveContext.sql("query")

But I need to run a file that contains multiple queries.
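
For reference, the single-query case with a HiveContext looks roughly like this (a minimal Spark 1.x sketch; the app name and table name are placeholders):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("run-hql-file")
val sc = new SparkContext(conf)
val hiveContext = new HiveContext(sc)

// A single query runs like this; the goal is to run every query in a file instead
hiveContext.sql("SELECT * FROM some_table").show()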

Siva kumar
  • Load file as string? –  Nov 19 '16 at 15:55
  • Possible duplicate of [Spark : how to run spark file from spark shell](http://stackoverflow.com/questions/27717379/spark-how-to-run-spark-file-from-spark-shell) – Denny Lee Nov 19 '16 at 19:42

2 Answers


You can do it using Spark/Scala by reading the file line by line and running each line as a query (each line is assumed to be one complete query):

import scala.io.Source

val queryFile = "path_of_your_file_queries"
Source.fromFile(queryFile, "utf-8").getLines().foreach(query => hiveContext.sql(query))
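
If the queries in your file span multiple lines, one variant is to read the whole file and split on ";" instead. This is a sketch that reuses queryFile and hiveContext from above, and assumes ";" separates statements and does not occur inside string literals:

val allQueries = Source.fromFile(queryFile, "utf-8").mkString
// Run each ";"-separated statement, skipping empty fragments
allQueries.split(";").map(_.trim).filter(_.nonEmpty).foreach(query => hiveContext.sql(query))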

You can read your .hql file into a string and run it, somewhat like the below:

import scala.io.Source
val result = sqlContext.sql(Source.fromFile("file.hql").mkString)

Note that sqlContext.sql executes a single statement, so this works when the file contains exactly one query.
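
If that single query is a SELECT, the returned DataFrame can then be used as usual, for example:

result.printSchema()
result.show()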
Sampat Kumar