Just wondering if anyone has come across this warning:
18/01/10 19:52:56 WARN SharedInMemoryCache: Evicting cached table partition metadata from memory due to size constraints
(spark.sql.hive.filesourcePartitionFileCacheSize = 262144000 bytes). This may impact query planning performance
I see this a lot when loading large DataFrames with many partitions from S3 into Spark.
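For context, the reads look roughly like this (the bucket, path, and partition column below are placeholders, not my actual job):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("partitioned-read").getOrCreate()

    // Placeholder bucket/path; the real table has many dt= partitions on S3.
    val df = spark.read.parquet("s3a://some-bucket/events/")
    df.where("dt = '2018-01-10'").count()  // planning lists files per partition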
It never actually causes the job to fail; I'm just wondering what that config property does and how to tune it properly.
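From the message I assume the cache size can be raised when building the session, something like the sketch below, though the 1 GB value is just a guess on my part, not a recommendation:

    import org.apache.spark.sql.SparkSession

    // Default is 262144000 bytes (~250 MB); 1073741824 (1 GB) is an arbitrary
    // example value, to be sized against available driver memory.
    val spark = SparkSession.builder()
      .appName("bigger-partition-file-cache")
      .config("spark.sql.hive.filesourcePartitionFileCacheSize", 1073741824L)
      .getOrCreate()

(Presumably the same thing can be passed to spark-submit via --conf spark.sql.hive.filesourcePartitionFileCacheSize=1073741824.) Is raising it like this the right approach?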
Thanks