How do I fetch COPY output result? #332
Comments
There's no direct API for that at the moment. Care to elaborate on your use case so we can understand what you want to achieve and discuss how to make it possible?
@mp911de My use case is data replication, so it's possible that a single COPY query will return hundreds of GB of data. As a result, I need to parse the response in a streaming fashion.
Data replication as in logical decoding? How would you consume the data?
I read it as: he just wants to consume the COPY stream. Data replication is orthogonal to the request.
Dave is correct. I want to consume a very long stream of binary data. Typically, COPY output is either CSV or BINARY, so my client code will take care of parsing it. What's also important is knowing when the stream has been fully consumed; for that I need to see the command-completion message. If I were to modify the existing code with minimal intrusion, I'd probably expose the raw messages alongside the mapped rows.
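A minimal sketch of the client-side parsing described above, assuming CSV-format COPY output: CopyData payloads arrive as arbitrarily sized byte chunks, rows may span chunk boundaries, and completion is signalled separately. The class and method names here are hypothetical, not part of the driver:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Hypothetical streaming assembler: feed it CopyData payloads as they
// arrive; it reassembles newline-delimited CSV rows across chunk
// boundaries without buffering the whole stream.
public class CopyCsvAssembler {
    private final StringBuilder pending = new StringBuilder();
    private final List<String> rows = new ArrayList<>();

    // Feed one chunk (one CopyData payload). Naive: assumes chunk
    // boundaries do not split multi-byte UTF-8 characters.
    public void onChunk(byte[] chunk) {
        pending.append(new String(chunk, StandardCharsets.UTF_8));
        int nl;
        while ((nl = pending.indexOf("\n")) >= 0) {
            rows.add(pending.substring(0, nl));
            pending.delete(0, nl + 1);
        }
    }

    // Called once the server signals completion of the COPY stream.
    public List<String> onComplete() {
        if (pending.length() > 0) {          // trailing row without newline
            rows.add(pending.toString());
            pending.setLength(0);
        }
        return rows;
    }

    public static void main(String[] args) {
        CopyCsvAssembler a = new CopyCsvAssembler();
        a.onChunk("1,al".getBytes(StandardCharsets.UTF_8));  // row split across chunks
        a.onChunk("ice\n2,bob\n3,car".getBytes(StandardCharsets.UTF_8));
        a.onChunk("ol\n".getBytes(StandardCharsets.UTF_8));
        System.out.println(a.onComplete());  // [1,alice, 2,bob, 3,carol]
    }
}
```

The point of the sketch is that parsing needs only a small rolling buffer, so the same shape works for hundreds of GB of output.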
Is that something we could map onto the existing API?
Since I'm not terribly familiar with the internals here, I can't say much more.
Thank you, Mark. I'm afraid returning the response in one piece won't work for my case; it has to be streamed.
Sure, happy to continue the discussion over the actual code. Note that the driver is built with the assumption that it decodes an entire frame (message) at a time.
No, COPY is a stream; it could be TBs, as mentioned.
Does the server send multiple CopyData messages, or one huge one?
Good question; I actually haven't looked. You are probably correct, though: it probably does send multiple frames.
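For reference, the PostgreSQL wire protocol does frame COPY OUT into many messages: each backend message is a type byte, an Int32 length (counting itself but not the type byte), then the payload; the server sends one CopyData (`d`) per chunk of data, then CopyDone (`c`), then CommandComplete (`C`). A small decoder sketch over an in-memory buffer (class and method names are mine, for illustration only):

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// Sketch of PostgreSQL backend-message framing during COPY OUT.
public class CopyFrameReader {
    // Collects the payloads of CopyData ('d') messages, stopping at
    // CopyDone ('c') — i.e., when the stream is fully consumed.
    public static List<byte[]> readCopyData(ByteBuffer buf) {
        List<byte[]> payloads = new ArrayList<>();
        while (buf.hasRemaining()) {
            byte type = buf.get();
            int len = buf.getInt();          // includes the 4 length bytes
            byte[] payload = new byte[len - 4];
            buf.get(payload);
            if (type == 'd') {
                payloads.add(payload);
            } else if (type == 'c') {        // CopyDone
                break;
            }
        }
        return payloads;
    }

    // Helper to build a wire-format message for the demo below.
    static ByteBuffer msg(char type, byte[] payload) {
        ByteBuffer b = ByteBuffer.allocate(1 + 4 + payload.length);
        b.put((byte) type).putInt(4 + payload.length).put(payload);
        b.flip();
        return b;
    }

    public static void main(String[] args) {
        ByteBuffer wire = ByteBuffer.allocate(64)
                .put(msg('d', "1,foo\n".getBytes()))
                .put(msg('d', "2,bar\n".getBytes()))
                .put(msg('c', new byte[0]));
        wire.flip();
        System.out.println(readCopyData(wire).size());  // 2
    }
}
```

Because each CopyData is an independent, length-prefixed frame, a driver can hand payloads to the client one at a time instead of accumulating the whole result.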
Is there an official way of getting the COPY output result? I see that `CopyData` messages are coming in `PostgresqlResult.messages`, but there is no way for me to get at them because `PostgresqlResult.map` is hardcoded to return regular rows only. Thank you.