Maximum call stack size exceeded when decoding values #110
Comments
Did you find any solution?
I had this issue too and gave up after some troubleshooting. Instead, I was able to convert my large parquet file to JSON using this Rust project: https://github.com/jupiter/parquet2json
I also had this issue when using `repeated: true` with a large amount of data. The problem is inside the RLE decoding and the reader. Changing the code to use a safer array copy fixed the issue. The patched function begins: `exports.arrayCopy = function(dest, src) {`
I'm having this issue while trying to read a 1.7 MB file. @jgold21, can you say a little more about how you fixed it? I can't see how to use your code in rle.js, but that's probably a problem with my comprehension rather than your JavaScript :-D
I'm having the same issue as well, with a file of 13,049 rows, reading only one of the columns. The workaround from @jgold21 doesn't seem to apply; there is no such function in the codebase anymore.
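For anyone trying to pin this down: the "Maximum call stack size exceeded" error is the symptom you get when a large array is bulk-copied via spread or `Function.prototype.apply`, because every element becomes a call argument. This standalone demonstration (not parquetjs code, just the suspected pattern) shows the failing copy next to a loop-based copy that handles the same data:

```javascript
// Copying via spread places every element on the call stack as an
// argument; V8 rejects this for very large arrays with a RangeError.
function copyWithSpread(dest, src) {
  dest.push(...src);
}

// An index loop moves the same data without growing the call stack.
function copyWithLoop(dest, src) {
  for (let i = 0; i < src.length; i++) dest.push(src[i]);
}

const src = new Array(1000000).fill(0);

try {
  copyWithSpread([], src); // throws for arrays this large
  console.log('spread copy succeeded');
} catch (e) {
  console.log('spread copy failed:', e.message);
}

const dest = [];
copyWithLoop(dest, src);
console.log('loop copy length:', dest.length);
```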
Hi, I'm trying to read a local file, approximately 1.8 GB with 18,790,733 rows, SNAPPY compression. Executing the following code in Node 12
prints the row count, but throws this error on
cursor.next()
Would the file size or row count be too large for this to be processed? Alternatively, is there a way to stream the file and read/decode one row at a time?
Thanks in advance,