Invalid rows being written in transaction_*.csv files #28
Earlier I ran export_all.sh and am now seeing rows that have too many columns. My guess is that the files aren't being handled properly, resulting in erroneous rows being written.
An example of an invalid row I'm seeing:
It looks like it should be:
It seems that transaction
0x39a53f3f5d5c973134eb9eaecfed1a9d629ca5050bcd34e3ef249a242e539ac0
got cut off and the next row started writing too early.

Comments

Thanks for reporting this. I'll try to reproduce it:

Thanks for checking it out, I think your idea in number 4 may actually be it.

Looks like that fixed it. Thanks for the help!
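A quick way to spot rows like the one reported above is to compare each row's column count against the header's. This is a minimal sketch, not part of the project itself; the `find_invalid_rows` helper and the example filename are hypothetical.

```python
import csv


def find_invalid_rows(path):
    """Return (line_number, column_count) for each row whose column
    count differs from the header's.

    A transaction row cut off mid-write, with the next row's fields
    appended to it, shows up as a column-count mismatch.
    """
    invalid = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        expected = len(header)
        # Line 1 is the header, so data rows start at line 2.
        for lineno, row in enumerate(reader, start=2):
            if len(row) != expected:
                invalid.append((lineno, len(row)))
    return invalid
```

Running it over one of the exported files, e.g. `find_invalid_rows("transactions_00000000_00099999.csv")` (filename illustrative), lists the offending line numbers so they can be matched against the export logs.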