Encounter "Memory Error" when converting imagenet dataset #18
Comments
I ran into the same problem as well: the 224×224 file was dumped okay at 7.5 GB, but the 299×299 pkl file was empty (0 B).
I saw the same issue. I separated the 224 and 299 dump-processing loops and cleared variables that were no longer used. It still dies in dump_pickle, which must be making another copy.
Same problem on a Windows PC with 24 GB RAM, Python 3.6.6, and torch 1.1.0. I finished my job by following convert.py; thanks to @jnorwood. This new convert.py takes about 16 GB of memory. Code:
```python
import os
imagenet_urls = [
d = misc.load_pickle(args.in_file)
''' convert val224.pkl '''
''' convert val229.pkl '''
if not os.path.exists(args.out_root):
```
Result: Loading pickle object from val224_compressed.pkl
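The approach the commenters describe, converting one target size per pass and freeing intermediates before dumping, can be sketched as below. This is an illustrative outline, not the repo's actual convert.py: the dict key names and file names are assumptions, and `pickle.dump` is used in place of the repo's `dump_pickle` helper because writing straight to a file avoids the extra in-memory bytes copy that `pickle.dumps` would create.

```python
import gc
import pickle

def convert_one_size(in_file, out_file, key):
    """Convert a single image size per pass so only one decoded
    copy of the dataset lives in memory at a time.

    `key` selects which entry of the input dict to convert; the
    field names here are hypothetical, not the repo's real schema.
    """
    with open(in_file, "rb") as f:
        data = pickle.load(f)   # full compressed dict
    subset = data[key]          # keep only the size we need
    del data                    # drop the rest before dumping
    gc.collect()
    with open(out_file, "wb") as f:
        # dump() streams directly to the file handle, so no second
        # serialized copy is built in RAM; protocol 4 also handles
        # objects larger than 4 GB
        pickle.dump(subset, f, protocol=4)
    del subset
    gc.collect()

# one pass per target size instead of both inside one loop:
# convert_one_size("val224_compressed.pkl", "val224.pkl", "data224")
# convert_one_size("val224_compressed.pkl", "val299.pkl", "data299")
```

Running the two sizes as separate passes roughly halves the peak memory compared with holding both decoded arrays at once, which matches the commenters' observation that splitting the loops helps.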
Thanks @jnorwood, fixed. Please check.
Hi,
When I was trying to use the AlexNet model, I first followed your instructions to download val224_compressed.pkl and ran the command "python convert.py".
But the conversion always fails with the error message "Memory Error".
I am curious how to deal with this issue, since I think the memory of the machine I used, 64 GB, is big enough.
Thanks!