Batch Data Ingestion Using pg_ivm #98
pg_ivm cannot perform incremental maintenance efficiently for a large write to an underlying table, because the overhead of immediate maintenance would be large in that case. If you would like to insert a large amount of data into an underlying table, I recommend disabling immediate maintenance by calling the refresh_immv() function with with_data = false before inserting the data. Afterwards, call refresh_immv() with with_data = true to refresh the view data and re-enable immediate maintenance. Whether the refresh takes more or less time than immediate maintenance would have depends on the size of the data and the view definition, and for now it would be hard to predict this a priori.
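The batch-load workflow described above can be sketched as follows. This is a minimal sketch, assuming an IMMV named `mv` already created over an underlying table `t` (both names, and the CSV path, are hypothetical placeholders):

```sql
-- 1. Disable immediate maintenance and discard the view data.
SELECT refresh_immv('mv', false);

-- 2. Bulk-load into the underlying table while maintenance is off;
--    no per-row view maintenance overhead is incurred here.
COPY t FROM '/path/to/data.csv' WITH (FORMAT csv);

-- 3. Recompute the view contents and re-enable immediate maintenance.
SELECT refresh_immv('mv', true);
```

Running the load between the two `refresh_immv()` calls trades per-row incremental maintenance for one full recomputation at the end; as noted above, which of the two is faster depends on the data size and the view definition.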
Is it possible to ingest data as a batch using pg_ivm? I am looking for a way to efficiently handle large volumes of data and would like to know if pg_ivm supports batch processing.
@ibhaskar2 ,@Jamal-B, @yugo-n , @tatsuo-ishii , @hanefi