
I create a temporary "country" table from a dataframe which contains all the rows where country_code_destination is null:

df_country=df.where(df.country_code_destination.isNull()).registerTempTable("country")
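
One way to confirm the column survives the filter is to print the schema before registering the temp table (a minimal diagnostic sketch, reusing the names from the snippet above):

df.where(df.country_code_destination.isNull()).printSchema()
# country_code_destination should still be listed here; it simply holds null values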

Then, I would like to export this temp table into a CSV file:

%%sql -q -o qry_country
SELECT * FROM country

%%local
qry_country.to_csv("country_code_null.csv", sep=';')
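
To see what actually reaches the local pandas side, the columns can be inspected in the same %%local context before writing the file (a small sketch, nothing beyond standard pandas):

%%local
print(qry_country.columns.tolist())  # check whether country_code_destination made it across
qry_country.head()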

But the problem is that the "country_code_destination" column does not appear in the "country_code_null.csv" file. All the other columns are there, except "country_code_destination".

Any idea how to resolve this problem, please?
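
For reference, another route I could try is to skip %%local and write the CSV directly from Spark; a minimal sketch, assuming the cluster can write to the output path (the path and options are only illustrative):

df.where(df.country_code_destination.isNull()) \
    .coalesce(1) \
    .write \
    .mode("overwrite") \
    .option("header", "true") \
    .option("sep", ";") \
    .csv("/tmp/country_code_null")
# Spark writes a directory of part files rather than a single .csv file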

Thanks

Poisson
  • %%local and %%sql are not in the same environment. Try rebooting your notebook, and re-run your queries, you will figure out the problem – Steven Jun 25 '19 at 11:53
  • @Steven , always the same problem.. is there any other method to export to a csv file a spark dataframe? Thanks – Poisson Jun 25 '19 at 12:29
  • Possible duplicate of [How to export a table dataframe in PySpark to csv?](https://stackoverflow.com/questions/31385363/how-to-export-a-table-dataframe-in-pyspark-to-csv) – Ben.T Jun 25 '19 at 13:13

0 Answers