I have the following code:
val count = spark.read.parquet("data.parquet").select("foo").where("foo > 3").count
I'm interested in whether Spark is able to push the filter down and read from the Parquet file only the values satisfying the where
condition. Can we avoid a full scan in this case?
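For reference, this is how I would inspect the physical plan to check for pushdown (a sketch, assuming a SparkSession named spark is already available and data.parquet exists):

```scala
// Print the physical plan; if pushdown happens, the Parquet scan node
// should list the predicate under PushedFilters, e.g. [GreaterThan(foo,3)]
spark.read.parquet("data.parquet")
  .select("foo")
  .where("foo > 3")
  .explain()
```

Is the presence of the filter in PushedFilters enough to conclude that only matching row groups are read?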