if (df.count() == 0) {
    System.out.println("df is an empty dataframe");
}
The above is how I currently check whether a DataFrame is empty, without hitting a null pointer exception.
Is there a better way to do this in Spark? I am worried that if df grows to millions of records, the count() call will take a long time to execute, since it has to scan every record.
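For reference, here is an alternative I was considering (a sketch only, not benchmarked): instead of counting everything, fetch at most one row. The isEmpty() variant assumes Spark 2.4 or later, where Dataset gained that method.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

// df is an existing Dataset<Row>.
// Pull at most one row instead of counting all records.
if (df.takeAsList(1).isEmpty()) {
    System.out.println("df is an empty dataframe");
}

// On Spark 2.4+, Dataset exposes isEmpty() directly.
if (df.isEmpty()) {
    System.out.println("df is an empty dataframe");
}

Would either of these actually be cheaper than count() on a large DataFrame?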