72

I'm trying to store 5000 data elements in an array. These 5000 elements are stored in an existing file (so it's not empty).

But I'm getting an error.

IN:

import io
import os
import pickle

def array():

    name = 'puntos.df4'

    m = open(name, 'rb')
    v = []  # list to hold the unpickled elements

    # position the file 5000 bytes before its end
    m.seek(-5000, io.SEEK_END)
    fp = m.tell()
    sz = os.path.getsize(name)

    while fp < sz:
        pt = pickle.load(m)
        v.append(pt)

    m.close()
    return v

OUT:

line 23, in array
pt = pickle.load(m)
_pickle.UnpicklingError: invalid load key, ''.
Xcecution
  • Maybe I'm missing something, but it looks like you're assuming each value has a size of a single byte; why do you think this is guaranteed? And why are you trying to unpickle individual values manually? Was the file created using the pickle module? – yurib Oct 10 '15 at 02:56
  • Right, I didn't notice, but if I remove the "m.seek(-5000, io.SEEK_END)" part I get an EOFError. I thought that solved it, but now that you mention it I'm more confused. Should I edit the question? – Xcecution Oct 10 '15 at 03:02
  • Oh, and yes, the file was created using the dump() function from the pickle module – Xcecution Oct 10 '15 at 03:14

11 Answers

28

Pickling is recursive, not sequential. To pickle a list, pickle starts with the enclosing list, then pickles the first element, diving into that element and pickling its dependencies and sub-elements until it is fully serialized. It then moves on to the next element of the list, and so on, until it finishes the last element and the enclosing list is serialized. In short, it's hard to treat a recursive pickle as sequential, except in some special cases. It's better to use a smarter pattern in your dump if you want to load in a special way.

The most common pattern is to pickle everything with a single dump to a file -- but then you have to load everything at once with a single load. However, if you open a file handle and do multiple dump calls (e.g. one for each element of the list, or a tuple of selected elements), then your load will mirror that: you open the file handle and do multiple load calls until you have all the list elements and can reconstruct the list (see the sketch below). It's still not easy to selectively load only certain list elements, however. To do that, you'd probably have to store your list elements as a dict (with the index of the element or chunk as the key) using a package like klepto, which can transparently break up a pickled dict into several files and enables easy loading of specific elements.
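
A minimal sketch of that multiple-dump / multiple-load pattern, assuming the data was written one element per dump call; the filename and dummy points below are placeholders, not the asker's actual data:

import pickle

points = [[i, i * 2] for i in range(5000)]  # stand-in for the real 5000 elements

# write: one dump call per element
with open('puntos_multi.pkl', 'wb') as out:
    for pt in points:
        pickle.dump(pt, out)

# read: keep calling load until the file is exhausted
loaded = []
with open('puntos_multi.pkl', 'rb') as inp:
    while True:
        try:
            loaded.append(pickle.load(inp))
        except EOFError:
            break

assert loaded == points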

Saving and loading multiple objects in pickle file?

Mike McKerns
25

This may not be relevant to your specific issue, but I had a similar problem when the pickle archive had been created using gzip.

For example, if a compressed pickle archive is made like this:

import gzip, pickle
with gzip.open('test.pklz', 'wb') as ofp:
    pickle.dump([1,2,3], ofp)

Then trying to open it with a plain open() throws this error:

with open('test.pklz', 'rb') as ifp:
    print(pickle.load(ifp))

Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
_pickle.UnpicklingError: invalid load key, ''.

But if the pickle file is opened using gzip, all is harmonious:

with gzip.open('test.pklz', 'rb') as ifp:
    print(pickle.load(ifp))

[1, 2, 3]
mishaF
  • Hmmm. Can you look at [this](https://stackoverflow.com/questions/64220956/javascript-blob-to-download-a-binary-file-creating-corrupted-files?noredirect=1#comment113563699_64220956) question? There is a similar error, but the circumstances are different. – Mooncrater Oct 06 '20 at 08:24
25

I solved my issue with the following steps (a quick diagnostic check is sketched after the list):

  • Remove the cloned project
  • Install git lfs: sudo apt-get install git-lfs
  • Set up git lfs for your user account: git lfs install
  • Clone the project again.
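
If you're not sure whether this applies to you, here is a hedged diagnostic sketch: a file fetched without Git LFS is a small text pointer rather than the real binary, so peeking at its first bytes tells you. The path 'model.pkl' is illustrative:

# check whether the file is a Git LFS pointer instead of the real binary
with open('model.pkl', 'rb') as f:
    head = f.read(64)

if head.startswith(b'version https://git-lfs.github.com'):
    print("This is a Git LFS pointer, not the real file - run 'git lfs pull'.")
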
Ahmad AlMughrabi
  • Just to add a comment: I downloaded a large file from GitHub and encountered the error message "_pickle.UnpicklingError: invalid load key". Later I found that the large file was broken; I had to download it again and make sure the `sha256sum` matched. – Ray Mar 28 '22 at 00:56
  • Very good observation, thanks for sharing! – Ahmad AlMughrabi Mar 28 '22 at 11:07
  • True for machine learning models being downloaded from HuggingFace or other resources. – analytical_prat Dec 05 '22 at 13:44
  • Thanks for the tip about `git lfs`! In my case, `sudo apt-get install git-lfs`, `git lfs install`, and `git lfs pull` was enough. – Daniel Tchoń Aug 29 '23 at 04:51
13

If you transferred these files via disk or by other means, it is likely they were not saved properly.

foladev
  • This happened to me. I was sharing memory between two processes and used pickle to continuously read/write data from/to that memory. Unfortunately, I had forgotten to use a lock, so I ended up with a race condition where `pickle.loads()` sometimes failed because the data was corrupted (i.e. it was read while it was being overwritten by the other process). – balu Sep 16 '20 at 16:51
4

I received a similar error while loading a pickled sklearn model. The problem was that the pickle had been created via sklearn.externals.joblib and I was trying to load it via the standard pickle library. Using joblib to load it solved my problem.
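
A minimal sketch of the difference, assuming the model was saved with joblib (possibly compressed); the classifier and the filename 'model_v1.joblib' are illustrative, not from the original answer:

import joblib  # modern replacement for sklearn.externals.joblib
from sklearn.linear_model import LogisticRegression

clf = LogisticRegression().fit([[0.0], [1.0]], [0, 1])
joblib.dump(clf, 'model_v1.joblib', compress=3)  # compressed joblib archive

# pickle.load() on this file can raise UnpicklingError; joblib.load() handles it
clf2 = joblib.load('model_v1.joblib')
print(clf2.predict([[0.5]]))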

isilpekel
3

I am not completely sure what you're trying to achieve by seeking to a specific offset and attempting to load individual values manually; the typical usage of the pickle module is:

import pickle

# save data to a file
with open('myfile.pickle', 'wb') as fout:
    pickle.dump([1, 2, 3], fout)

# read data from a file
with open('myfile.pickle', 'rb') as fin:
    print(pickle.load(fin))

# output
>> [1, 2, 3]

If you dumped a list, you'll load a list; there's no need to load each item individually.

You're saying that you got an error even before you added the seek to the -5000 offset, so maybe the file you're trying to read is corrupted.

If you have access to the original data, I suggest you try saving it to a new file and reading it as in the example.

yurib
  • The file contains 5000 lists. I was trying to store each list in a separate element of the array. – Xcecution Oct 10 '15 at 03:31
  • Hi, I managed to solve the problem. I'm not sure how, but I just removed the `fp` variable and put `while m.tell() < sz:` instead of `while fp < sz:`. Thank you anyway :), and if you know the reason this "solution" works, I would be thankful if you could explain it to me. – Xcecution Oct 10 '15 at 03:59
1

I had a similar error, but in a different context, when I uploaded a *.p file to Google Drive. I tried to use it later in a Google Colab session and got this error:

    1 with open("/tmp/train.p", mode='rb') as training_data:
----> 2     train = pickle.load(training_data)
UnpicklingError: invalid load key, '<'.

I solved it by compressing the file, uploading it, and then unzipping it in the session. It looks like the pickle file is not transferred correctly when you upload/download it directly, so it gets corrupted.
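
A sketch of that workaround using Python's zipfile module; the /tmp/train.p path matches the traceback above, while 'train.zip' and the exact steps are assumptions, not part of the original answer:

import zipfile, pickle

# locally, before uploading to Drive: wrap the pickle in a zip archive
with zipfile.ZipFile('train.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.write('train.p')

# in the Colab session, after downloading train.zip: extract, then unpickle
with zipfile.ZipFile('train.zip') as zf:
    zf.extract('train.p', path='/tmp')

with open('/tmp/train.p', 'rb') as f:
    train = pickle.load(f)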

carloslme
  • I also had this issue. It turned out that the file that was meant to be downloaded from Dropbox didn't exist anymore, and that `<` is part of the HTML page downloaded instead, saying something like "Sorry, this file has been deleted". – fepegar Dec 08 '20 at 19:21
0

I just encountered this issue; it was caused by a bad pickle file (it had not been fully copied).

My solution: check whether the pickle file is intact or corrupted.
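
One way to do that check, sketched under the assumption that you can still compare against the original copy; the helper name and the path are illustrative:

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # hash the file in chunks so large pickles don't need to fit in memory
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        while True:
            block = f.read(chunk_size)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

# compare against the checksum of the original file on the source machine
print(sha256_of('puntos.df4'))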

0

In my case, I ran into this issue because multiple processes were trying to read from the same pickle file. The first of them actually creates the pickle (a write operation), and some quick threads start reading from it too soon. Just by retrying the read when catching these two errors, EOFError and UnpicklingError, I no longer see them.
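
A minimal retry sketch of that approach; the function name, path handling, attempt count, and delay are assumptions, not part of the original answer:

import pickle
import time

def load_with_retry(path, attempts=5, delay=0.1):
    for i in range(attempts):
        try:
            with open(path, 'rb') as f:
                return pickle.load(f)
        except (EOFError, pickle.UnpicklingError):
            if i == attempts - 1:
                raise          # give up after the last attempt
            time.sleep(delay)  # the writer may still be flushing the file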

Bostone
-1

Pickling error - _pickle.UnpicklingError: invalid load key, '<'.
This kind of error occurs when the weights file is incomplete, or there is some other problem with the weights/pickle file, because of which unpickling the weights fails.

-2
  1. Close the opened file:

     filepath = 'model_v1.pkl'
     with open(filepath, 'rb') as f:
         p = cPickle.Unpickler(f)
         model = p.load()

  2. If step 1 doesn't work, restart the session.

DSBLR