When a new PySpark application is started, it creates a nice web UI with tabs for Jobs, Stages, Executors, etc. If I go to the Executors tab, I can see the full list of executors and some information about each one, such as the number of cores, storage memory used vs. total, etc.
My question is whether I can somehow access the same information (or at least part of it) programmatically from within the application itself, e.g. with something like spark.sparkContext.<function_name_to_get_info_about_executors>()?
I've found a workaround that makes a URL request against the same REST endpoints the web UI uses, but I think I may be missing a simpler solution.
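For reference, here is a minimal sketch of that REST-based workaround, assuming an existing SparkSession named `spark` and that the web UI is enabled and reachable from the driver (Spark's monitoring REST API is served on the same port as the UI):

```python
import requests

sc = spark.sparkContext

# uiWebUrl is e.g. "http://driver-host:4040"; it is None if the UI is disabled
base_url = sc.uiWebUrl
app_id = sc.applicationId

# /api/v1/applications/<app-id>/executors returns one summary per executor
# (plus the driver), including totalCores, memoryUsed and maxMemory
executors = requests.get(f"{base_url}/api/v1/applications/{app_id}/executors").json()
for e in executors:
    print(e["id"], e["totalCores"], e["memoryUsed"], e["maxMemory"])
```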
I'm using Spark 3.0.0.