
I want to execute an UPDATE query against a SQL table from PySpark, based on some logic in my job. All I could find is documentation on how to read from SQL,

BUT

there are no proper examples of executing an UPDATE or CREATE statement.
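
For reference, the read side is well covered; a minimal sketch like the following works (the JDBC URL, table name, and credentials here are placeholders):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sql-update-example").getOrCreate()

    # Read the source table over JDBC (connection details are placeholders).
    df = (spark.read
          .format("jdbc")
          .option("url", "jdbc:sqlserver://myhost:1433;databaseName=mydb")
          .option("dbtable", "dbo.table1")
          .option("user", "my_user")
          .option("password", "my_password")
          .load())

What I cannot find is how to push an UPDATE back the other way.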

  • df2 = spark.sql("SELECT field1 AS f1, field2 as f2 from table1") http://spark.apache.org/docs/2.1.0/api/python/pyspark.sql.html – Miguel Santos Oct 03 '18 at 16:04
  • I want to run an UPDATE query, not a SELECT. I am comparing two data frames; if I find a difference, I need to update the value in the SQL table. – Karthik reddy Oct 03 '18 at 16:09
  • Spark SQL doesn't support UPDATE: https://stackoverflow.com/questions/37517371/update-query-in-spark-sql – Miguel Santos Oct 03 '18 at 16:11
  • As @MiguelSantos pointed out, Spark SQL doesn't support updates, so you need to refactor your query into a transformation. For example, you could add a new column based on your logic, or overwrite an existing column (a sketch of this approach follows the comments). – Alex Oct 03 '18 at 16:27
  • 1
    thank you @MiguelSantos and Jaco. Guess I'll have to use sqlalchemy to update it. – Karthik reddy Oct 03 '18 at 16:30
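
As Alex suggests in the comments, one way around the missing UPDATE is to express the change as a DataFrame transformation and then write the corrected table back. A minimal sketch, assuming two frames df_current and df_new that share an id key and a value column (these names and the JDBC settings are placeholders):

    from pyspark.sql import functions as F

    # Join the current table against the new data on the key column.
    joined = df_current.alias("cur").join(df_new.alias("new"), on="id", how="left")

    # Keep the new value where it exists and differs; otherwise keep the old one.
    updated = joined.select(
        F.col("id"),
        F.when(
            F.col("new.value").isNotNull() & (F.col("new.value") != F.col("cur.value")),
            F.col("new.value"),
        ).otherwise(F.col("cur.value")).alias("value"),
    )

    # Spark has no row-level UPDATE, so the corrected frame is written back in full.
    (updated.write
        .format("jdbc")
        .option("url", "jdbc:sqlserver://myhost:1433;databaseName=mydb")
        .option("dbtable", "dbo.table1")
        .option("user", "my_user")
        .option("password", "my_password")
        .mode("overwrite")
        .save())

Note that if df_current was itself read from the same table, it should be cached or checkpointed before the write, since the overwrite drops the table before Spark's lazy read runs.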
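
Alternatively, if only a handful of rows differ, the route Karthik mentions also works: collect just the changed rows to the driver and issue row-level UPDATEs through SQLAlchemy. A rough sketch under the same assumed names (the connection string is a placeholder):

    from pyspark.sql import functions as F
    from sqlalchemy import create_engine, text

    # Collect only the rows whose value actually changed (assumes this set is small).
    changed = (df_new.alias("new")
               .join(df_current.alias("cur"), on="id")
               .where(F.col("new.value") != F.col("cur.value"))
               .select(F.col("id"), F.col("new.value").alias("value"))
               .collect())

    # Issue one UPDATE per changed row (placeholder connection string).
    engine = create_engine("mssql+pyodbc://my_user:my_password@my_dsn")
    with engine.begin() as conn:
        for row in changed:
            conn.execute(
                text("UPDATE table1 SET value = :value WHERE id = :id"),
                {"value": row["value"], "id": row["id"]},
            )

The first approach scales better when many rows change; the second avoids rewriting the whole table when the diff is small.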

0 Answers