
The code below was working on an older Databricks runtime version, but after the runtime was upgraded it no longer works.

Latest version: 12.0 (includes Apache Spark 3.3.1, Scala 2.12)

dbutils.notebook.entry_point.getDbutils().notebook().getContext().tags()

What is the alternative for this code?

Error: py4j.security.Py4JSecurityException: Method public scala.collection.immutable.Map com.databricks.backend.common.rpc.CommandContext.tags() is not whitelisted on class class com.databricks.backend.common.rpc.CommandContext

Venkatesh

2 Answers


You can get most of the cluster info directly from the Spark config:

%scala
// Collect all cluster-usage tags from the Spark config and print them sorted
val p = "spark.databricks.clusterUsageTags."
spark.conf.getAll
  .collect { case (k, v) if k.startsWith(p) => s"${k.replace(p, "")}: $v" }
  .toList.sorted.foreach(println)

%python
# Same thing in Python, reading from the SparkContext configuration
p = "spark.databricks.clusterUsageTags."
conf = [f"{k.replace(p, '')}: {v}"
        for k, v in spark.sparkContext.getConf().getAll()
        if k.startswith(p)]
for l in sorted(conf):
    print(l)

[...]
clusterId: 0123-456789-0abcde1
clusterLastActivityTime: 1676449848620
clusterName: test
clusterNodeType: Standard_F4s_v2
[...]
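If you only need a single tag, you can also read it directly with spark.conf.get("spark.databricks.clusterUsageTags.clusterName"). The prefix-stripping logic above can be exercised without a cluster; here is a minimal, Spark-free sketch of the same filtering, where the sample config entries are made up for illustration:

```python
# Minimal sketch of the prefix-stripping logic above, runnable without Spark.
# The sample entries below are made-up illustrations, not real cluster values.
def cluster_usage_tags(conf_pairs):
    p = "spark.databricks.clusterUsageTags."
    return sorted(f"{k.replace(p, '')}: {v}"
                  for k, v in conf_pairs
                  if k.startswith(p))

sample = [
    ("spark.databricks.clusterUsageTags.clusterName", "test"),
    ("spark.databricks.clusterUsageTags.clusterNodeType", "Standard_F4s_v2"),
    ("spark.master", "local[*]"),  # filtered out: lacks the clusterUsageTags prefix
]
for line in cluster_usage_tags(sample):
    print(line)
# clusterName: test
# clusterNodeType: Standard_F4s_v2
```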
Kombajn zbożowy

I created a cluster with Databricks Runtime 12.0, and the above command gave the required output without any error.

  • But if it is the cluster information that you need, then you can use the Clusters 2.0 API. The following code works:


import requests

# Look up the current cluster's ID from the Spark config
my_json = {"cluster_id": spark.conf.get("spark.databricks.clusterUsageTags.clusterId")}

# Replace <your_access_token> with a Databricks personal access token
auth = {"Authorization": "Bearer <your_access_token>"}

# Replace <workspace_url> with your workspace hostname
response = requests.get('https://<workspace_url>/api/2.0/clusters/get',
                        json=my_json, headers=auth).json()
print(response)
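The response is a JSON object describing the cluster, from which fields such as cluster_name, node_type_id, and state can be pulled out. A sketch using a hand-built sample response (the values are placeholders, not real API output):

```python
# Sketch: extracting a few common fields from a Clusters 2.0 API response.
# `response` here is a hand-built placeholder, not a real API payload.
response = {
    "cluster_id": "0123-456789-0abcde1",
    "cluster_name": "test",
    "node_type_id": "Standard_F4s_v2",
    "state": "RUNNING",
}

summary = {k: response.get(k) for k in ("cluster_name", "node_type_id", "state")}
print(summary)
# {'cluster_name': 'test', 'node_type_id': 'Standard_F4s_v2', 'state': 'RUNNING'}
```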


Saideep Arikontham