Unpickling of pre 0.15 gzip-compressed objects not possible #10966
See http://pandas.pydata.org/pandas-docs/stable/io.html#io-pickle. You would need to show what you are actually pickling and which versions you are using. This is backwards compatible back to 0.11, so not sure what the problem is.
You should be able to write the unzipped pickle to a file and use …
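A minimal sketch of that workaround (file names are hypothetical, and the setup step stands in for the reporter's pre-0.15 file): decompress the gzip stream to a plain file, then hand it to `pd.read_pickle`, which applies pandas' pickle-compatibility fixups.

```python
import gzip
import pickle
import shutil
import tempfile

import pandas as pd

# Setup so the sketch runs standalone: a gzip-compressed pickled
# tuple of DataFrames (stands in for the old file; name hypothetical).
with gzip.open("data.pkl.gz", "wb") as f:
    pickle.dump((pd.DataFrame({"a": [1, 2]}),), f)

# Workaround step 1: write the decompressed pickle to a plain file.
with gzip.open("data.pkl.gz", "rb") as src, \
        tempfile.NamedTemporaryFile(delete=False, suffix=".pkl") as dst:
    shutil.copyfileobj(src, dst)

# Workaround step 2: let pandas load the now-uncompressed pickle.
data = pd.read_pickle(dst.name)
```

The point of routing through `pd.read_pickle` rather than plain `pickle.load` is that pandas can rewrite old internal class paths during unpickling.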
@jreback: Python 2.7.10; pandas 0.14.1 before the update, pandas 0.16.2 after. The main problem is that compatibility breaks if gzip compression is used. The example with just …

@shoyer You are totally right, it was just a quick hack. Unzipping manually and reading the file afterwards does work, and is apparently the only solution. Overall not a huge problem (and another reminder that one should not use pickle as data storage), but because of this combination you need quite a bit of code and error handling to work around it when updating. The optimal solution would be if …
The issue is this one: #5924
Well, I have zero experience in that area, but it could be a good start. I will have a look if I find some time!
gr8! contribution docs are here
I pickled and gzip-compressed DataFrames in a tuple pre-0.15.
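The original snippet is not preserved here; a hypothetical reconstruction of the setup (names and file path are assumptions) would look like:

```python
import gzip
import pickle

import pandas as pd

# Hypothetical reconstruction: a tuple of DataFrames, pickled and
# written through gzip (file name assumed for illustration).
frames = (pd.DataFrame({"x": [1, 2, 3]}), pd.DataFrame({"y": [4.0, 5.0]}))
with gzip.open("frames.pkl.gz", "wb") as f:
    pickle.dump(frames, f)
```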
Pre-0.15, I would load them again like this:
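The loading snippet is also missing from this copy; a plausible sketch (with a setup step added so it runs standalone, and hypothetical file names) is plain `pickle.load` on the gzip stream, which worked as long as the pickled class layout matched the installed pandas:

```python
import gzip
import pickle

import pandas as pd

# Setup so the sketch runs standalone (file name hypothetical).
with gzip.open("frames.pkl.gz", "wb") as f:
    pickle.dump((pd.DataFrame({"x": [1, 2]}),), f)

# Hypothetical reconstruction of the pre-0.15 loading code.
with gzip.open("frames.pkl.gz", "rb") as f:
    frames = pickle.load(f)
```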
Now, after updating pandas, I can't load them back anymore:
If I try `pd.read_pickle` directly, it can't decompress the file properly:
Reading the decompressed file does not work either:
Is there any solution for this? It seems like a serious backwards-compatibility issue if you have to downgrade pandas again to get the data back when there is no backup in another format.