
I have a Spark 2.1 job where I maintain multiple Dataset objects/RDDs that represent different queries over our underlying Hive/HDFS datastore. I've noticed that if I simply iterate over the List of Datasets, they execute one at a time. Each individual query operates in parallel, but I feel that we are not maximizing our resources by not also running the different datasets in parallel.

There doesn't seem to be a lot out there regarding doing this, as most questions appear to be around parallelizing a single RDD or Dataset, not parallelizing multiple within the same job.

Is this inadvisable for some reason? Can I just use an executor service, thread pool, or futures to do this?

Thanks!

Brian
  • You can find multiple questions and answers on Stack Overflow itself, for example https://stackoverflow.com/questions/31757737/how-to-run-multi-threaded-jobs-in-apache-spark-using-scala-or-python and https://stackoverflow.com/questions/30214474/how-to-run-multiple-jobs-in-one-sparkcontext-from-separate-threads-in-pyspark, and there is a lot of material explaining how to do this on the web as well – Ramesh Maharjan Feb 17 '18 at 12:14
  • Yes, you can do this; the easiest way is to use Scala's parallel collections – Raphael Roth Feb 17 '18 at 20:39
  • 1
    @RameshMaharjan Upon review - yes those questions are relevant, but without understanding that is the question I should be asking, it's hard to find those answers :). – Brian Feb 18 '18 at 02:50

1 Answer


Yes, you can use multithreading in the driver code, but normally this does not increase performance, unless your queries operate on very skewed data and/or cannot be parallelized well enough on their own to fully utilize the cluster's resources.

You can do something like this:

val datasets : Seq[Dataset[_]] = ???

datasets
  .par // convert to a parallel collection
  .foreach(ds => ds.write.saveAsTable(...))
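If you'd rather control the degree of concurrency explicitly (as the question's mention of an executor service and futures suggests), a `Future`-based sketch along these lines should also work. The `runAll` helper and the table names in the comment are assumptions for illustration; in a real job each thunk would wrap a blocking Spark action such as `ds.write.saveAsTable(...)`:

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object ParallelActions {
  // Run the given thunks concurrently on a fixed-size pool and return
  // their results in the original order (Future.traverse preserves order).
  def runAll[A](actions: Seq[() => A], threads: Int = 4): Seq[A] = {
    val pool = Executors.newFixedThreadPool(threads)
    implicit val ec: ExecutionContext = ExecutionContext.fromExecutor(pool)
    try Await.result(Future.traverse(actions)(a => Future(a())), 10.minutes)
    finally pool.shutdown()
  }

  def main(args: Array[String]): Unit = {
    // Hypothetical stand-ins for Spark actions, e.g.
    // () => ds.write.saveAsTable("some_table")
    val results = runAll(Seq(() => "wrote t1", () => "wrote t2"))
    results.foreach(println)
  }
}
```

Bounding the pool size this way keeps the number of concurrent Spark jobs explicit, which matters because each job still competes for the same executors.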
Raphael Roth
  • I have multiple dataframes that I am reading from SQL Server; how do I run them in parallel to create parquet files for each DF? – Sundeep Pidugu Oct 31 '18 at 11:11