Can't read in 1GB data #2839
Comments
can you give me a backtrace with gdb?
then enter |
Thanks for helping me. This is what I got:

This GDB was configured as "x86_64-apple-darwin"...
Reading symbols for shared libraries .
warning: Could not find object file "/Users/builder/work/Python-2.7.3/libpython2.7.a(acceler.o)" - no debug information available for "Parser/acceler.c".
(I got a lot of warnings like these)
.. done
(gdb) r
Program received signal SIGTRAP, Trace/breakpoint trap.
Can you please try a (free) packaged python distro like EPDFree or Anaconda CE to see if it's a problem with your build of Python? You can pass |
Did you have a chance to look any further into this? |
carrier lost. |
I have a similar issue with a 1.67 GB file. I've tried engine='python', but it eats all the RAM (16 GB). I left it for 10 minutes and it didn't finish. |
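One common workaround for files of this size is to stream the CSV in fixed-size chunks instead of parsing it in one shot, so peak memory stays bounded by the chunk size. A minimal sketch (the small in-memory CSV below stands in for a multi-GB file on disk):

```python
import io
import pandas as pd

# A tiny in-memory CSV stands in for a multi-gigabyte file on disk.
csv_data = io.StringIO("a,b\n" + "\n".join(f"{i},{i * 2}" for i in range(10)))

total_rows = 0
# chunksize makes read_csv return an iterator of DataFrames instead of
# one huge frame; aggregate incrementally to keep memory bounded.
for chunk in pd.read_csv(csv_data, chunksize=4):
    total_rows += len(chunk)

print(total_rows)  # 10
```

Each chunk is an ordinary DataFrame, so per-chunk filtering or aggregation works the same as on a full frame.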
Any update on this? |
this is a long closed issue |
I load a 1.4 GB JSON file, parse it into a list of dicts, and create a large DataFrame. I can save it to CSV by |
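The list-of-dicts-to-CSV round trip described above can be sketched as follows (the record fields and the in-memory buffer are illustrative, not from the original post):

```python
import io
import pandas as pd

# Sketch: records parsed from a large JSON file, as a list of dicts.
records = [{"id": 1, "value": "x"}, {"id": 2, "value": "y"}]

# pd.DataFrame accepts a list of dicts directly, one row per dict.
df = pd.DataFrame(records)

# A StringIO buffer stands in for a real output path like "out.csv".
buf = io.StringIO()
df.to_csv(buf, index=False)

print(buf.getvalue())
```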
Hello. I have a 1.2 GB CSV (created using MATLAB) and now I want to load it using pandas. If I do not specify |
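Specifying column dtypes up front can cut memory use substantially, since pandas otherwise infers types and typically defaults numeric columns to 64-bit. A hedged sketch (column names and dtypes are assumptions for illustration):

```python
import io
import numpy as np
import pandas as pd

# A tiny in-memory CSV stands in for the real 1.2 GB file.
csv_data = io.StringIO("x,y\n1,2.5\n3,4.5\n")

# Explicit dtypes skip inference and can halve memory for numeric columns
# (32-bit instead of the inferred 64-bit defaults).
df = pd.read_csv(csv_data, dtype={"x": np.int32, "y": np.float32})

print(df.dtypes["x"], df.dtypes["y"])  # int32 float32
```

Combining `dtype` with `usecols` (to drop columns you don't need) shrinks the footprint further.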
I am trying to repeat the code in:
http://wesmckinney.com/blog/?p=635
But I can't read in the data. I have more than 4 GB of memory available (8 GB in total), but I still hit the following problem:
In [2]: data = pd.read_csv("P00000001-ALL.csv", index_col=False)
Python(9081,0xacb01a28) malloc: *** mmap(size=24051712) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
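When a single `read_csv` call fails to allocate like this, a workaround that often helps is reading the file in pieces and concatenating them, so no one allocation has to hold both the raw text and the full parsed frame. A sketch, assuming illustrative column names (a small in-memory CSV stands in for "P00000001-ALL.csv"):

```python
import io
import pandas as pd

# In-memory stand-in for the large campaign-contributions CSV;
# the column names here are illustrative.
csv_data = io.StringIO("cand_nm,contb_receipt_amt\nObama,100\nRomney,250\nObama,50\n")

# Parse in chunks, then stitch the pieces back into one DataFrame.
pieces = pd.read_csv(csv_data, index_col=False, chunksize=2)
data = pd.concat(pieces, ignore_index=True)

print(len(data))  # 3
```

Note that `pd.concat` still needs enough memory for the final frame; if even that fails, aggregating per chunk (as in the chunked-reading sketch above) avoids materializing the whole dataset at once.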