99

I have a class that serves players in a game: it creates them, among other things.

I need to save these player objects in a file to use them later. I've tried the pickle module, but I don't know how to save multiple objects and then load them again. Is there a way to do that, or should I use another structure, such as a list, to save and load my objects?

Is there a better way?

hamidfzm

8 Answers

155

Two additions to Tim Peters' accepted answer.

First, you need not store the number of items you pickled separately if you stop loading when you hit the end of the file:

import pickle

def loadall(filename):
    with open(filename, "rb") as f:
        while True:
            try:
                yield pickle.load(f)
            except EOFError:
                break

items = loadall(myfilename)

This assumes the file contains only pickles; if there's anything else in it, the generator will try to treat that as pickles too, which could be dangerous.

Second, this way, you do not get a list but rather a generator. This will load only one item into memory at a time, which is useful if the dumped data is very large -- one possible reason why you may have wanted to pickle multiple items separately in the first place. You can still iterate over items with a for loop as if it were a list.
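For symmetry, the writing side can be wrapped the same way. This is a sketch; `dumpall` is a hypothetical helper name, not part of the `pickle` module:

```python
import pickle

def dumpall(filename, items):
    # Write each item as its own pickle -- the inverse of loadall().
    with open(filename, "wb") as f:
        for item in items:
            pickle.dump(item, f)

def loadall(filename):
    # Same generator as above, repeated so this sketch is self-contained.
    with open(filename, "rb") as f:
        while True:
            try:
                yield pickle.load(f)
            except EOFError:
                break
```

Usage would then be `dumpall("players.pkl", players)` on one side and `for p in loadall("players.pkl"): ...` on the other.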

user2357112
Lutz Prechelt
  • This should be the top answer – Kristopher Wagner Aug 24 '16 at 06:05
  • Just be aware that calling `loadall(myfilename)` does not actually load the data or read from the file until you iterate over the result. If you want to load the items immediately, use something like `list(loadall(myfilename))` or a `for` loop. – drevicko Aug 17 '17 at 10:13
  • Will this approach not leave the file handle open until the generator happens to be garbage collected, leading to potential locking issues? To solve this, should we put the `yield` *outside* the `with open()` block? Granted this leads to unnecessary reads to iterate through the pickle file, but I think I'd prefer this to dangling file handles. Unless we are sure this method will always be called quickly to EOF, and we close the file when the end of the file is reached. (But if we're bothering to yield individual elements it is probably because we don't need to unpickle all objects in a file.) – Chris May 24 '18 at 09:49
  • @Chris: If the iterator is used to its end, the `with open` will terminate and properly close the file. If it may not be used to its end, we will often not care about the open file. If it may not be used to its end _and_ we don't like the open file, then, yes, the above construction is not the best way to go. – Lutz Prechelt May 25 '18 at 14:49
  • IMO, we don't need a generator. Loading 1 item at a time is done **by `pickle.load`**, not by the generator, isn't it? As Chris and Lutz mentioned, the `loadall` method is supposed to be used until EOF because of **closing**, but if that's the case, why do we use a generator in the first place? :) – starriet Sep 16 '21 at 11:22
  • @starriet: because it saves the caller the hassle of opening and closing a file themselves and so allows for simple and idiomatic code at the place where the pickle items are used. – Lutz Prechelt Sep 16 '21 at 11:34
  • @starriet - also because some APIs require that a generator be passed as an argument. – Dan Nissenbaum Nov 04 '21 at 04:29
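If the dangling file handle discussed in the comments is a concern, an eager variant (a hypothetical name, not a standard function) reads everything into a list and guarantees the file is closed before returning, trading memory for deterministic cleanup:

```python
import pickle

def loadall_eager(filename):
    # Read every pickle in the file into a list. The `with` block exits
    # before the function returns, so no file handle outlives the call.
    items = []
    with open(filename, "rb") as f:
        while True:
            try:
                items.append(pickle.load(f))
            except EOFError:
                break
    return items
```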
108

Using a list, tuple, or dict is by far the most common way to do this:

import pickle
PIK = "pickle.dat"

data = ["A", "b", "C", "d"]
with open(PIK, "wb") as f:
    pickle.dump(data, f)
with open(PIK, "rb") as f:
    print(pickle.load(f))

That prints:

['A', 'b', 'C', 'd']

However, a pickle file can contain any number of pickles. Here's code that produces the same output, though note that it's harder to write and to understand:

with open(PIK, "wb") as f:
    pickle.dump(len(data), f)
    for value in data:
        pickle.dump(value, f)
data2 = []
with open(PIK, "rb") as f:
    for _ in range(pickle.load(f)):
        data2.append(pickle.load(f))
print(data2)

If you do this, you're responsible for knowing how many pickles are in the file you write out. The code above does that by pickling the list's length first.

Tim Peters
  • Thanks, I had your idea, but I thought multiple list objects might cause memory issues, so I decided to save each player in a separate file. Do you think a list of pickled objects may cause memory problems? – hamidfzm Dec 22 '13 at 10:54
  • Don't have enough info. How many players? How big is each player's pickle? How much RAM is available? If you have a great many players, it would be best to incorporate a database and store pickles in that (instead of inventing your own database, one painful step at a time). – Tim Peters Dec 22 '13 at 15:06
  • Why do all pickle examples always use binary mode? Binary file writing is one frontier my work has not yet broached whatsoever... I know nothing about it or why anyone uses it anywhere. – temporary_user_name Feb 02 '14 at 09:42
  • @Aerovistae binary mode is used because Windows will mess with end-of-line characters in text mode. – compie Oct 07 '16 at 11:31
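In Python 3 the binary-mode requirement is enforced outright, beyond the line-ending issue mentioned above: pickle produces bytes, so dumping to a text-mode file raises a `TypeError` on any platform. A quick demonstration:

```python
import os
import pickle
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.pkl")

# In text mode, the file object rejects pickle's byte output.
try:
    with open(path, "w") as f:
        pickle.dump({"a": 1}, f)
except TypeError as err:
    print("text mode failed:", err)

# In binary mode it works as expected.
with open(path, "wb") as f:
    pickle.dump({"a": 1}, f)
with open(path, "rb") as f:
    print(pickle.load(f))
```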
29

Try this:

import pickle

obj_1 = ['test_1', {'ability', 'mobility'}]
obj_2 = ['test_2', {'ability', 'mobility'}]
obj_3 = ['test_3', {'ability', 'mobility'}]

with open('test.pkl', 'wb') as file:
    pickle.dump(obj_1, file)
    pickle.dump(obj_2, file)
    pickle.dump(obj_3, file)

with open('test.pkl', 'rb') as file:
    obj_1 = pickle.load(file)
    obj_2 = pickle.load(file)
    obj_3 = pickle.load(file)

print(obj_1)
print(obj_2)
print(obj_3)
Kos
N.S
12

If you're dumping it iteratively, you'd have to read it iteratively as well.

You can run a loop (as the accepted answer shows) to keep unpickling rows until you reach the end-of-file (at which point an EOFError is raised).

import pickle

data = []
with open("data.pickle", "rb") as f:
    while True:
        try:
            data.append(pickle.load(f))
        except EOFError:
            break

Minimal Verifiable Example

import pickle

# Dumping step
data = [{'a': 1}, {'b': 2}]
with open('test.pkl', 'wb') as f:
    for d in data:
        pickle.dump(d, f)

# Loading step
data2 = []
with open('test.pkl', 'rb') as f:
    while True:
        try:
            data2.append(pickle.load(f))
        except EOFError:
            break

data2
# [{'a': 1}, {'b': 2}]

data == data2
# True

Of course, this is under the assumption that your objects have to be pickled individually. You can also store your data as a single list of objects, then use a single pickle/unpickle call (no need for loops).

data = [{'a':1}, {'b':2}]  # list of dicts as an example
with open('test.pkl', 'wb') as f:
    pickle.dump(data, f)

with open('test.pkl', 'rb') as f:
    data2 = pickle.load(f)

data2
# [{'a': 1}, {'b': 2}]
cs95
8

Here's an object-oriented demo that uses pickle to store and restore one or multiple objects:

import pickle

class Worker(object):

    def __init__(self, name, addr):
        self.name = name
        self.addr = addr

    def __str__(self):
        return '[<Worker> name:%s addr:%s]' % (self.name, self.addr)

# dump one item
with open('testfile.bin', 'wb') as f:
    w1 = Worker('tom1', 'China')
    pickle.dump(w1, f)

# load one item
with open('testfile.bin', 'rb') as f:
    w1_restore = pickle.load(f)
print('item: %s' % w1_restore)

# dump multiple items
with open('testfile.bin', 'wb') as f:
    w1 = Worker('tom2', 'China')
    w2 = Worker('tom3', 'China')
    pickle.dump([w1, w2], f)

# load multiple items
with open('testfile.bin', 'rb') as f:
    w_list = pickle.load(f)

for w in w_list:
    print('item-list: %s' % w)

output:

item: [<Worker> name:tom1 addr:China]
item-list: [<Worker> name:tom2 addr:China]
item-list: [<Worker> name:tom3 addr:China]
Lyfing
0

It's easy if you use klepto, which gives you the ability to transparently store objects in files or databases. It uses a dict API, and allows you to dump and/or load specific entries from an archive (in the case below, serialized objects stored one entry per file in a directory called scores).

>>> import klepto
>>> scores = klepto.archives.dir_archive('scores', serialized=True)
>>> scores['Guido'] = 69 
>>> scores['Fernando'] = 42
>>> scores['Polly'] = 101
>>> scores.dump()
>>> # access the archive, and load only one 
>>> results = klepto.archives.dir_archive('scores', serialized=True)
>>> results.load('Polly')
>>> results
dir_archive('scores', {'Polly': 101}, cached=True)
>>> results['Polly']
101
>>> # load all the scores
>>> results.load()
>>> results['Guido']
69
>>>
Mike McKerns
0

Here is how to dump two (or more) dictionaries using pickle, and read them back:

import pickle

dict_1 = {1: 'one', 2: 'two'}
dict_2 = {1: {1: 'one'}, 2: {2: 'two'}}

with open('data_file1.pkl', 'wb') as F:
    pickle.dump(dict_1, F)
    pickle.dump(dict_2, F)

=========================================

import pickle

with open('data_file1.pkl', 'rb') as F:
    G = pickle.load(F)
    print(G)
    H = pickle.load(F)
    print(H)
0

Suppose we have saved objects of an Employee class in a file. Here is the code to read all the objects back from the file, one by one:

import pickle

with open(filename, 'rb') as a:
    while True:
        try:
            e = pickle.load(a)
            e.ShowRecord()
        except EOFError:
            break
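To make the snippet above runnable end to end, here is a minimal sketch; the `Employee` class with its `ShowRecord` method is a hypothetical stand-in, since the original class isn't shown:

```python
import pickle

class Employee:
    """Minimal stand-in for the Employee class assumed above."""
    def __init__(self, name, salary):
        self.name = name
        self.salary = salary

    def ShowRecord(self):
        print('%s earns %d' % (self.name, self.salary))

filename = 'employees.pkl'

# Write a few records, each as its own pickle.
with open(filename, 'wb') as f:
    for emp in (Employee('Ann', 50000), Employee('Bob', 60000)):
        pickle.dump(emp, f)

# Read them back until end-of-file, exactly as in the answer above.
with open(filename, 'rb') as a:
    while True:
        try:
            e = pickle.load(a)
            e.ShowRecord()
        except EOFError:
            break
```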
Osman Khalid