I thought I could regenerate the warehouse from the data lake just by deleting the destination table and running dbt again, but today I hit this limitation:
HIVE_TOO_MANY_OPEN_PARTITIONS: Exceeded limit of 100 open writers for partitions/buckets.
Looking at the documentation here, this seems to be a known issue.
Since I'm partitioning by date, which is a sensible choice for my data, how do people here ingest data older than ~3 months (i.e. more than 100 daily partitions) during the first, non-incremental run of dbt + Athena?
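One workaround people use is to keep the model incremental and bound every run, including the first, to a date window small enough to stay under the limit, then invoke dbt repeatedly to walk through history. A minimal sketch, where the `lake.raw_events` source, the `event_date` column, and the `start_date`/`end_date` vars are all made-up names for illustration:

```sql
-- models/events.sql
{{
  config(
    materialized='incremental',
    partitioned_by=['event_date']
  )
}}

select *
from {{ source('lake', 'raw_events') }}
-- bound every run to the window passed via --vars, so no single
-- Athena query ever writes more than ~90 daily partitions
where event_date >= date '{{ var("start_date") }}'
  and event_date <  date '{{ var("end_date") }}'
{% if is_incremental() %}
  -- routine runs additionally skip dates that are already loaded
  and event_date > (select max(event_date) from {{ this }})
{% endif %}
```

Backfilling then becomes a series of runs, each covering roughly one quarter:

```sh
dbt run --select events --vars '{start_date: "2022-01-01", end_date: "2022-04-01"}'
dbt run --select events --vars '{start_date: "2022-04-01", end_date: "2022-07-01"}'
```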
Hi @nicor88, thanks for pointing me in the right direction. Do you know of any code snippet with these modifications for Athena? It looks a bit intimidating, since it involves lots of adapter-specific macro constructs.
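Not the adapter-internal change @nicor88 was referring to, but the same batching idea can be sketched as a standalone run-operation macro: loop over fixed-size date windows and issue one `insert into` per window, so each Athena query stays under the 100 open partition writers. This is an untested sketch and the table/column names (`analytics.events`, `lake.raw_events`, `event_date`) are placeholders:

```sql
-- macros/backfill_in_chunks.sql
{% macro backfill_in_chunks(start_date, end_date, chunk_days=90) %}
  {% set dt = modules.datetime %}
  {% set start = dt.datetime.strptime(start_date, '%Y-%m-%d').date() %}
  {% set end = dt.datetime.strptime(end_date, '%Y-%m-%d').date() %}
  {# ceiling division: number of windows needed to cover [start, end) #}
  {% set n_chunks = ((end - start).days + chunk_days - 1) // chunk_days %}
  {% for i in range(n_chunks) %}
    {% set lo = start + dt.timedelta(days=i * chunk_days) %}
    {% set hi = [start + dt.timedelta(days=(i + 1) * chunk_days), end] | min %}
    {% set insert_sql %}
      insert into analytics.events  -- placeholder target table
      select *
      from lake.raw_events          -- placeholder source table
      where event_date >= date '{{ lo }}'
        and event_date <  date '{{ hi }}'
    {% endset %}
    {% do log('backfilling ' ~ lo ~ ' .. ' ~ hi, info=true) %}
    {% do run_query(insert_sql) %}
  {% endfor %}
{% endmacro %}
```

Invoked once the table exists (e.g. after a first bounded run):

```sh
dbt run-operation backfill_in_chunks --args '{start_date: "2022-01-01", end_date: "2023-01-01"}'
```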