Let's say I have a DataFrame called `spark_df` in a Notebook called Notebook1, and I want to transfer it to a Notebook called Notebook2. Obviously I can't do `from Notebook1.ipynb import spark_df`, and I can't convert it to CSV because (1) it's too big and (2) I need a more direct approach.
I need to move it to another Notebook because, after I finish processing in Notebook1 and try to do anything further, the kernel dies. So how can I get `spark_df` into Notebook2 without converting it to CSV?
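For clarity, here is a minimal reproduction of the failing attempt. A `.ipynb` notebook is not an importable Python module, so the import raises `ModuleNotFoundError` (assuming no installed package is actually named `Notebook1`):

```python
# What I tried: importing a variable from a notebook file as if it
# were a Python module. This fails because "Notebook1" is a .ipynb
# notebook, not a package on the Python path.
try:
    from Notebook1.ipynb import spark_df
    import_worked = True
except ModuleNotFoundError:
    import_worked = False

print("import worked:", import_worked)
```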