
Batch Data Ingestion Using pg_ivm #98

Open
salilsaifuddin opened this issue Aug 26, 2024 · 1 comment
Labels
question Further information is requested

Comments

@salilsaifuddin

salilsaifuddin commented Aug 26, 2024

Is it possible to ingest data as a batch using pg_ivm? I am looking for a way to handle large volumes of data efficiently and would like to know whether pg_ivm supports batch processing.

@ibhaskar2, @Jamal-B, @yugo-n, @tatsuo-ishii, @hanefi

Jamal-B pushed a commit to mediapart/pg_ivm that referenced this issue Aug 26, 2024
@yugo-n
Collaborator

yugo-n commented Oct 14, 2024

pg_ivm cannot perform incremental maintenance efficiently for large writes to an underlying table, because the overhead of immediate maintenance would be large in this case.

If you would like to insert a large amount of data into an underlying table, I recommend disabling immediate maintenance by calling the refresh_immv() function with with_data = false before inserting the data. After the load, call refresh_immv() with with_data = true to refresh the view data and re-enable immediate maintenance. Whether the refresh takes more or less time than immediate maintenance would have depends on the size of the data and the view definition, and for now it is hard to predict a priori.
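A minimal sketch of the workflow described above, assuming a hypothetical underlying table `orders`, an IMMV named `orders_summary` previously created with `create_immv()`, and an example CSV path:

```sql
-- Disable immediate maintenance; the IMMV's contents are discarded
-- and writes to the underlying table are no longer maintained per-row.
SELECT refresh_immv('orders_summary', false);

-- Bulk-load the data without per-row maintenance overhead.
-- ('/path/to/orders.csv' is a placeholder for your data source.)
COPY orders FROM '/path/to/orders.csv' WITH (FORMAT csv);

-- Recompute the view from scratch and re-enable immediate maintenance.
SELECT refresh_immv('orders_summary', true);
```

Note that while immediate maintenance is disabled, the IMMV is empty (as with a materialized view created WITH NO DATA), so queries against it during the load window will not see the old contents.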

@yugo-n yugo-n added the question Further information is requested label Oct 14, 2024