
Spark Dataset 2.0 provides two functions, createOrReplaceTempView and createGlobalTempView, and I am not able to understand the basic difference between them.

According to API documents:

createOrReplaceTempView: The lifetime of this temporary view is tied to the [[SparkSession]] that was used to create this Dataset.
So, when I call sparkSession.close(), the defined view will be destroyed. Is that true?

createGlobalTempView: The lifetime of this temporary view is tied to this Spark application.

When will this type of view be destroyed? Is there an example, like sparkSession.close()?

Rahul Sharma

3 Answers


The answer to your question is basically understanding the difference between a Spark application and a Spark session.

A Spark application can be used:

  • for a single batch job
  • for an interactive session with multiple jobs
  • as a long-lived server continually satisfying requests

A Spark job can consist of more than just a single map and reduce, and a Spark application can consist of more than one session.

A SparkSession, on the other hand, is associated with a Spark application:

  • Generally, a session is an interaction between two or more entities.
  • In Spark 2.0 you can use SparkSession.
  • A SparkSession can be created without creating a SparkConf, SparkContext or SQLContext (they're encapsulated within the SparkSession); see the sketch after this list.
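
As a minimal sketch of that last point (the app name here is just a placeholder, not anything from the question), the encapsulated SparkContext and configuration stay reachable through the session itself:

import org.apache.spark.sql.SparkSession

object SessionOnlyApp {

  def main(args: Array[String]): Unit = {

    // no explicit SparkConf, SparkContext or SQLContext is created here
    val spark = SparkSession.
      builder.
      appName("Session Only").
      master("local").
      getOrCreate()

    // the encapsulated objects are still accessible via the session
    println(spark.sparkContext.appName)       // Session Only
    println(spark.conf.get("spark.master"))   // local

    spark.stop()
  }
}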

Global temporary views were introduced in the Spark 2.1.0 release. This feature is useful when you want to share data among different sessions and keep it alive until your application ends. Note that a global temporary view is registered under the reserved global_temp database, so you must qualify its name with the global_temp prefix, as the sample below does. Please see a short sample I wrote to illustrate the use of createTempView and createGlobalTempView:

import org.apache.spark.sql.SparkSession

object NewSessionApp {

  def main(args: Array[String]): Unit = {

    val logFile = "data/README.md" // Should be some file on your system
    val spark = SparkSession.
      builder.
      appName("Simple Application").
      master("local").
      getOrCreate()

    val logData = spark.read.textFile(logFile).cache()
    logData.createGlobalTempView("logdata")
    spark.range(1).createTempView("foo")

    // within the same session the foo table exists 
    println("""spark.catalog.tableExists("foo") = """ + spark.catalog.tableExists("foo"))
    //spark.catalog.tableExists("foo") = true

    // for a new session the foo table does not exist
    val newSpark = spark.newSession()
    println("""newSpark.catalog.tableExists("foo") = """ + newSpark.catalog.tableExists("foo"))
    //newSpark.catalog.tableExists("foo") = false

    // both sessions can access the logdata table
    spark.sql("SELECT * FROM global_temp.logdata").show()
    newSpark.sql("SELECT * FROM global_temp.logdata").show()

    spark.stop()
  }
}
Avi Chalbani
  • @Avi Chalbani I am getting an error when I do the same, how do I fix it? df.createOrReplaceGlobalTempView("model_vals") Error: org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'model_vals' not found in database 'default'; – BdEngineer Dec 05 '18 at 11:31
  • Hi, should I use the prefix 'global_temp' if I want to use the table in SQL? – jlucky Apr 03 '19 at 07:29
  • Can I use the temp table or view like this: `spark.sql("SELECT * FROM logdata").show()`? Thanks. – jlucky Apr 03 '19 at 07:30
  • Hi, everyone, can anyone throw some light on which use cases call for creating a `newSession`? What are the benefits of creating a new session? – Gaurav Khare Apr 12 '19 at 18:05

df.createOrReplaceTempView("tempViewName")
df.createGlobalTempView("tempViewName")

createOrReplaceTempView() creates or replaces a local temporary view with this dataframe df. The lifetime of this view is tied to the SparkSession; if you want to drop this view:

spark.catalog.dropTempView("tempViewName")

or stop() will shut down the session:

self.ss = SparkSession(sc)
...
self.ss.stop()

createGlobalTempView() creates a global temporary view with this dataframe df. The lifetime of this view is tied to the Spark application itself. If you want to drop it:

spark.catalog.dropGlobalTempView("tempViewName")

or stopping the underlying SparkContext will shut down the whole application:

sc = SparkContext(conf=conf, ......)
...
sc.stop()
Gökhan Ayhan
  • Based on my understanding both types of view will be destroyed when you call `sparkContext.stop()`. Where does application shutdown come into the picture? – Rahul Sharma Apr 12 '17 at 16:20
  • The SparkSession class needs a SparkContext object; if the SparkContext stops, both of them will be destroyed. If you don't call the stop() method, your application may hang in the air. When you no longer need the SparkSession or SparkContext objects, stop them, so the Spark master node will be aware that your application is not consuming any CPU or resources. See: http://apache-spark-user-list.1001560.n3.nabble.com/SparkContext-stop-td17826.html – Gökhan Ayhan Apr 13 '17 at 05:22
  • Just to add to the awesome answer: SparkSession is the equivalent of sqlContext in the pre-Spark 2.0 era. – human Mar 08 '18 at 05:14
  • Will the stop function also delete the Hive databases/tables created? – vinayak_narune Dec 11 '18 at 17:23

createOrReplaceTempView was introduced in Spark 2.0 to replace registerTempTable. createTempView creates an in-memory reference to the DataFrame in use, and its lifetime depends on the Spark session in which the DataFrame was created. createGlobalTempView, on the other hand, lets you create references that can be used across Spark sessions. So depending on whether you need to share data across sessions, you can use either of the methods; see the sketch below.

By default, the notebooks in the same cluster share the same Spark session, but there is an option to set up clusters where each notebook has its own session. So it all boils down to where you create the DataFrame and where you want to access it.
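
A minimal sketch of this difference (the view names and the tiny DataFrame are made up for illustration):

import org.apache.spark.sql.SparkSession

object ViewScopesApp {

  def main(args: Array[String]): Unit = {

    val spark = SparkSession.
      builder.
      appName("View Scopes").
      master("local").
      getOrCreate()

    val df = spark.range(5).toDF("id")

    df.createOrReplaceTempView("ids_local")   // visible only in this session
    df.createGlobalTempView("ids_global")     // visible to every session in this application

    spark.sql("SELECT * FROM ids_local").show()
    spark.sql("SELECT * FROM global_temp.ids_global").show()

    val other = spark.newSession()
    // other.sql("SELECT * FROM ids_local") would fail: table or view not found
    other.sql("SELECT * FROM global_temp.ids_global").show()  // works across sessions

    spark.stop()
  }
}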

WolfBlunt