My question is about memory management and garbage collection (GC) in Spark internals.
If I create an RDD, how long will it live in my executor's memory?
```python
from pyspark.sql import SparkSession

# Program starts
spark = SparkSession.builder.appName("").master("yarn").getOrCreate()
df = spark.range(10)
df.show()
# other operations
# Program ends
```
- Will it be deleted automatically once my execution finishes? If yes, is there any way to delete it manually during program execution?
- How and when is garbage collection invoked in Spark? Can we implement a custom GC, as in a plain Java program, and use it in Spark?
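To clarify the second question: I know the executor JVM's collector can be *selected and tuned* through `spark.executor.extraJavaOptions` / `spark.driver.extraJavaOptions`, roughly like the sketch below (`my_app.py` is a placeholder script name). What I'm unsure about is whether anything beyond choosing among the standard JVM collectors is possible.

```shell
spark-submit \
  --conf "spark.executor.extraJavaOptions=-XX:+UseG1GC -verbose:gc" \
  --conf "spark.driver.extraJavaOptions=-XX:+UseG1GC" \
  my_app.py
```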