try:
    df1 = spark.read.format("delta").load("/abc")
except Exception as ex:
    print(ex)

It's raising this exception: '/abc' is not a Delta table.;

I want to perform a specific operation when the file is not found. Is there an existing exception class I can catch separately, like FileNotFoundException? That one doesn't seem to work here.

Alex Ott
drama

1 Answer


In PySpark 3.x, loading a nonexistent table throws the generic pyspark.sql.utils.AnalysisException, which is translated from the underlying Java/Scala layer. Unfortunately, there is no exception class specific to "file not found".

from pyspark.sql.utils import AnalysisException

try:
    spark.read.format("delta").load("/abc")
except AnalysisException as ex:
    print(ex)
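Because AnalysisException covers many unrelated failures (missing columns, bad schemas, and so on), one workaround is to inspect the exception message and only treat "missing path / not a Delta table" cases as file-not-found. The exact wording of the message varies between Spark and Delta Lake versions, so the patterns below are an assumption, not a stable API; the pure-Python helper is a sketch you can adapt:

```python
def is_missing_path_error(message: str) -> bool:
    """Heuristically decide whether an AnalysisException message means
    the path/table does not exist.

    NOTE: these substrings are assumptions based on common Spark/Delta
    error texts; check the messages your Spark version actually emits.
    """
    patterns = (
        "is not a Delta table",   # Delta reader, path exists but no _delta_log (or path missing)
        "Path does not exist",    # generic Spark file source message
        "PATH_NOT_FOUND",         # error class used by newer Spark versions
    )
    return any(p in message for p in patterns)


# Usage with Spark (requires a running SparkSession):
#
# from pyspark.sql.utils import AnalysisException
#
# try:
#     df1 = spark.read.format("delta").load("/abc")
# except AnalysisException as ex:
#     if is_missing_path_error(str(ex)):
#         ...  # perform your "file not found" operation
#     else:
#         raise  # some other analysis error; don't swallow it
```

If you would rather avoid string matching, checking up front with DeltaTable.isDeltaTable(spark, "/abc") from delta.tables is another option, though it tells you only whether the path is a valid Delta table, not why a subsequent read failed.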
Kristian