
Is it possible to check the version of Databricks Runtime in Azure?

Krzysztof Słowiński

3 Answers


In Scala:

dbutils.notebook.getContext.tags("sparkVersion")

In Python:

from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()

spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")

This returns the Databricks Runtime and Scala version, e.g. 5.0.x-scala2.11.
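If you need the runtime and Scala versions separately, the tag can be split on its "-scala" separator. A minimal sketch, assuming the tag always has the "<runtime>-scala<scala_version>" shape shown above (the helper name is mine, not a Databricks API):

```python
def parse_spark_version_tag(tag):
    # Split e.g. "5.0.x-scala2.11" into the runtime part and the Scala part.
    runtime, _, scala = tag.partition("-scala")
    return runtime, scala

print(parse_spark_version_tag("5.0.x-scala2.11"))  # ('5.0.x', '2.11')
```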

Hauke Mallow

Databricks Runtime is the set of core components that run on the clusters managed by Azure Databricks. It includes Apache Spark but also adds a number of components and updates that substantially improve the usability, performance, and security of big data analytics.

You can choose from among many supported runtime versions when you create a cluster.


If you want to know the version of Databricks Runtime in Azure after cluster creation:

Go to the Azure Databricks portal => Clusters => Interactive Clusters => here you can find the runtime version.


For more details, refer to "Azure Databricks Runtime versions".

Hope this helps.

CHEEKATLAPRADEEP
print(spark.version)

worked for me
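Note that, as the comment below points out, spark.version returns the Apache Spark version (e.g. 2.4.0), not the Databricks Runtime version tag (e.g. 5.0.x-scala2.11). A small sketch to tell the two string formats apart, assuming the tag shapes shown in the answers above (the helper name is mine):

```python
import re

def looks_like_runtime_tag(s):
    # Databricks Runtime tags embed the Scala version, e.g. "5.0.x-scala2.11";
    # plain Spark versions are dotted numbers such as "2.4.0".
    return bool(re.search(r"-scala\d+\.\d+$", s))

print(looks_like_runtime_tag("5.0.x-scala2.11"))  # True
print(looks_like_runtime_tag("2.4.0"))            # False
```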

Eugene Lycenok
  • sorry ... this is not runtime version ... but that helped me at the time .. didn't know the reputation decreases after you remove an answer :) – Eugene Lycenok Mar 26 '21 at 07:40