
Right now, I have to use `df.count() > 0` to check whether the DataFrame is empty, but that is inefficient. Is there a better way to do this?

Thanks.

PS: I want to check whether it's empty so that I only save the DataFrame if it's not empty.

The question "How to check if spark dataframe is empty" shows a way to do this in Scala; I am looking for a way in PySpark.

prudhvi Indana
  • Exactly the same way, up to a single pair of parentheses: `df.rdd.isEmpty()` – zero323 Jan 29 '18 at 21:28
  • 1
    thank you very much, I was trying ```df.head(1).isEmpty``` and was worried that it was not working. thank you very much @user6910411 – prudhvi Indana Jan 29 '18 at 21:33
  • Also see [this comment](https://stackoverflow.com/questions/32707620/how-to-check-if-spark-dataframe-is-empty#comment81790320_43383369) for another approach. – zero323 Jan 29 '18 at 21:34
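
To pull the comments together, here is a minimal PySpark sketch of both approaches (the session setup, schema, and output path are illustrative, not from the original post; both checks avoid the full scan that `df.count()` triggers):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("empty-check").getOrCreate()

# An empty DataFrame with an explicit schema, just for demonstration.
df = spark.createDataFrame([], "id INT, value STRING")

# Option 1: ask the underlying RDD. This can short-circuit after finding
# a single row, whereas df.count() scans every partition.
print(df.rdd.isEmpty())       # True

# Option 2: fetch at most one row; an empty list means an empty DataFrame.
print(len(df.head(1)) == 0)   # True

# Per the PS in the question: only save when the DataFrame is non-empty.
if not df.rdd.isEmpty():
    df.write.mode("overwrite").parquet("/tmp/output")  # hypothetical path
```

Note that `df.head(1).isEmpty` from the Scala question fails in PySpark because `head(1)` returns a plain Python list, which has no `isEmpty` method; use `len(...) == 0` instead. Spark 3.3 and later also provide `df.isEmpty()` directly on the DataFrame.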

0 Answers