DataFrames in Apache Spark can use off-heap memory for storing data. What's the main purpose of using off-heap memory? My current understanding is that it's beneficial for storing large objects (mutable or immutable objects?) so that we don't need a larger Java heap. A large Java heap slows down the application because of how the Java garbage collector works.
That's my understanding so far. Can someone please help me put the pieces together?
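For context, from what I've read, off-heap storage in Spark is opt-in and is turned on via configuration, something like the following (a minimal sketch using Spark's standard memory settings; the `2g` size is just an example I picked):

```
# spark-defaults.conf (sketch; 2g is an arbitrary example size)
spark.memory.offHeap.enabled   true
spark.memory.offHeap.size      2g
```

Is it correct that with this enabled, Spark keeps DataFrame data in memory the garbage collector never scans, which is where the GC benefit comes from?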