I need a way to save all objects (or at least DataFrames) to one place, outside of the working directory. Since Python keeps objects in memory rather than on disk, I'm looking for a way to export all objects from the current session. The format can be pickle or anything else, as long as it can be read back into a different Python session.
2 Answers
If you want to save a DataFrame to another directory, try:
df.to_csv('foldername/filename.csv')
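One caveat: `to_csv` does not create missing directories, so the target folder must exist first. A minimal sketch (the folder and file names here are just placeholders):

```python
import os
import pandas as pd

df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

# to_csv raises OSError if the directory does not exist, so create it first
os.makedirs("foldername", exist_ok=True)
df.to_csv("foldername/filename.csv", index=False)

# read it back, e.g. from another session
df2 = pd.read_csv("foldername/filename.csv")
```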

– thehand0
If I understood your need correctly, you want to back up your session.
If that is the case, here are three solutions.
The first solution uses `pickle`:
import pickle

def is_picklable(obj):
    try:
        pickle.dumps(obj)
    except Exception:
        return False
    return True

bk = {}
for k in dir():
    obj = globals()[k]
    if is_picklable(obj):
        try:
            bk.update({k: obj})
        except TypeError:
            pass

# to save the session
with open('./your_bk.pkl', 'wb') as f:
    pickle.dump(bk, f)

# to load the session
with open('./your_bk.pkl', 'rb') as f:
    bk_restore = pickle.load(f)
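To turn the restored dict back into regular variables in the new session, you can merge it into `globals()`. A self-contained sketch (the variables `x` and `y` are just placeholders for whatever your session contains):

```python
import pickle

# stand-ins for a session's variables, saved as in the snippet above
x, y = 42, "hello"
bk = {"x": x, "y": y}
with open("your_bk.pkl", "wb") as f:
    pickle.dump(bk, f)

# in a fresh session: load the backup and re-inject the names
with open("your_bk.pkl", "rb") as f:
    bk_restore = pickle.load(f)
globals().update(bk_restore)
```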
The second solution uses `dill`. Note that it may fail if your workspace contains unpicklable objects:
import dill

# to save session
dill.dump_session('./your_bk_dill.pkl')

# to restore session
dill.load_session('./your_bk_dill.pkl')
The third option uses the `shelve` module:
import shelve

bk = shelve.open('./your_bk_shelve.pkl', 'n')
for k in dir():
    try:
        bk[k] = globals()[k]
    except Exception:
        pass
bk.close()

# to restore
bk_restore = shelve.open('./your_bk_shelve.pkl')
for k in bk_restore:
    globals()[k] = bk_restore[k]
bk_restore.close()
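One advantage of `shelve`, also noted in the comments below, is that entries are read from disk only when accessed, so you can pull out a single variable without loading the whole session. A minimal sketch (the keys and filename are illustrative):

```python
import shelve

# save a couple of variables to a shelf
data = list(range(1000))
label = "experiment-1"
with shelve.open("your_bk_shelve") as bk:
    bk["data"] = data
    bk["label"] = label

# later: open the shelf and read only what you need;
# the other entries stay on disk until accessed
with shelve.open("your_bk_shelve") as bk_restore:
    restored_label = bk_restore["label"]
```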
Give these a try and let us know how it goes.
Credits: the second and third solutions are adapted, almost verbatim, from the two links below. I changed the error handling, as the original answers raise an error when trying to pickle a module.
dill solution
shelve solution

– antoine
- I checked the `dill` and `shelve` modules. `dill` suits me best, as `shelve` doesn't dump DataFrame objects. – kemot25 Jan 17 '21 at 17:45
- Thanks for your feedback. `dill` is indeed more versatile. In my hands, a small toy DataFrame can be pickled, though. Maybe your DataFrame contains something more complex? – antoine Jan 18 '21 at 08:36
- Also, a good point of `shelve` is that it is quite memory-efficient (it does not load all variables back at once if they are not needed). If you have time, here is my [small post](https://antrg.medium.com/python-for-datascientist-quick-backup-for-everything-6d201a7e935d) on the subject. – antoine Jan 18 '21 at 08:39
- I take back my words. I cannot back up the session, as `dill.dump_session` returns `TypeError: can't pickle _gdbm.gdbm objects`. The session was restored from the `shelve` backup instead. One disadvantage of `shelve` is that you cannot compress it: [Reduce Python shelve size](https://stackoverflow.com/q/53990501/14261102) – kemot25 Jan 18 '21 at 10:25