
I use Spark 2.1. Below is my code:

    delta="insert overwrite table schema1.table1 select * from schema2.table2"

    try:
             spark.sql(delta)
    except Exception as e:
             spark.sql("drop table schema2.table2")
             print("Overall load failed for schema1.table1", e)

    sqlCtx.sql("drop table schema1.table1 ")

Below is what I am trying to do:

To insert into table1 of schema1 from table2, which is in another schema, schema2.

I am putting it in a try block, so that if it is successful it carries on normally; if it fails, the except block drops the table and prints the message "Overall load failed for schema1.table1".

Now the problem is that whenever I execute the above statement, it drops the table in the schema. PySpark is not being controlled by Python's try and except.

I sense that, without going into the try block, it is going straight into the except block and dropping the table.
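
To narrow this down, here is a minimal diagnostic sketch (assuming the same SparkSession named spark; the trace prints are mine, just for debugging) that shows which branch actually executes:

    delta = "insert overwrite table schema1.table1 select * from schema2.table2"

    try:
        spark.sql(delta)
        # Reached only if the insert completed without raising an exception
        print("try block completed: insert succeeded")
    except Exception as e:
        # Reached only if spark.sql(delta) actually raised an exception
        print("except block entered:", e)

With these prints I can at least see whether the except branch really fires during the load.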

Please help me cross this hurdle.

Thanks in advance!
