Feature description
I am new to dlt and I have tested loading data from Postgres into Snowflake. dlt creates a staging schema, which is apparently used to "deduplicate and merge data with the destination". Would it be possible to add an option to drop the staging schema after the load operation? Currently dlt only allows truncating the tables (resources) in the staging schema.
Are you a dlt user?
Yes, I use it for fun.
Use case
No response
Proposed solution
No response
Related issues
No response
@AhmetSamilCicek We keep the staging dataset so that subsequent loads run faster; recreating all the tables takes time. If you nevertheless want to drop the staging dataset, you have an authenticated sql_client available after the load: call pipeline.sql_client() and drop the dataset with an explicit DDL statement.
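A minimal sketch of what that could look like, assuming the default staging naming convention ("<dataset_name>_staging") and a pipeline that has already completed a load; the helper function name is my own, and you should adjust the schema name if you configured a custom staging dataset:

```python
def drop_staging_sql(dataset_name: str) -> str:
    # dlt names the staging dataset "<dataset_name>_staging" by default
    return f"DROP SCHEMA IF EXISTS {dataset_name}_staging CASCADE"


# After pipeline.run(...) has finished, reuse the pipeline's
# authenticated client to issue the DDL against Snowflake, e.g.:
#
# with pipeline.sql_client() as client:
#     client.execute_sql(drop_staging_sql(pipeline.dataset_name))
```

Note that dropping the staging schema means the next merge load has to recreate it, which is exactly the overhead the current behavior avoids.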