I have searched the existing issues, and I could not find an existing issue for this bug
Current Behavior
We're observing a high failure rate after upgrading our dbt pipelines from version 1.5 to Versionless. The pipelines fail with this error:
Redshift adapter: Redshift error: could not open relation with OID 155466714
The failing query is:
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'analytics'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'analytics'
My understanding of querying information_schema is that Redshift returns this error when a relation is dropped while the query is running, and dbt creates and drops transient tables (the __dbt_tmp and __dbt_backup relations) as it runs.
It's important to note that we have 17 pipelines running simultaneously; most run every 10 minutes, and some every 2 minutes.
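For intuition, the failure mode resembles iterating a Python dict while another session mutates it. This is only an analogy for the catalog-scan race, not a claim about how Redshift implements information_schema:

```python
# Loose analogy: the information_schema scan is an iterator over
# relations, and a concurrent `dbt run` drops one of them mid-scan.
catalog = {"simple_model": "BASE TABLE", "simple_model__dbt_tmp": "BASE TABLE"}

scan = iter(catalog)                   # the catalog query starts
next(scan)                             # reads the first relation
del catalog["simple_model__dbt_tmp"]   # another session drops the tmp table

try:
    next(scan)                         # the scan now sees an inconsistent catalog
    failed = False
except RuntimeError:                   # "dictionary changed size during iteration"
    failed = True
```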
Expected Behavior
dbt commands should not fail while querying information_schema.
I would expect dbt to retry the query as it builds the catalog.
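A retry along these lines is what I have in mind. This is only a sketch, not the adapter's actual API: `run_with_retry` and its `execute` parameter are hypothetical names standing in for whatever callable the adapter uses to run catalog SQL, and the matched error fragment is taken from the failure above:

```python
import time

# Error fragments treated as transient; the OID message is the one
# observed above, minus the changing OID number.
RETRYABLE_FRAGMENTS = ("could not open relation with OID",)

def run_with_retry(execute, sql, retries=3, backoff=1.0):
    """Run a catalog query, retrying when Redshift reports a relation
    that vanished mid-query. `execute` is a hypothetical stand-in for
    the adapter's query method: it takes SQL and raises on failure."""
    for attempt in range(retries + 1):
        try:
            return execute(sql)
        except Exception as exc:
            transient = any(frag in str(exc) for frag in RETRYABLE_FRAGMENTS)
            if not transient or attempt == retries:
                raise
            time.sleep(backoff * (2 ** attempt))  # simple exponential backoff
```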
Steps To Reproduce
Create a simple table model, `simple_model`:
{{ config(materialized='table') }}

select 1 as dim
Open the first terminal and run model materialization in a loop:
while dbt --debug run -s models/simple_model.sql; do echo "trying again"; done
Open the second terminal and run docs generation in a loop:
while dbt docs generate -s models/simple_model.sql; do echo "again"; done
After a while one of the commands will stop with this error:
12:30:18 Building catalog
12:30:23 Encountered an error while generating catalog: Database Error
could not open relation with OID 155700137
12:30:24 dbt encountered 1 failure while writing the catalog
Relevant log output
Attaching logs from both terminals in the reproduction example, captured at the time of failure.
Logs from dbt --debug run -s models/simple_model.sql:
12:39:56 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a08e99940>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a07579e20>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a07579dc0>]}
12:39:56 Running with dbt=1.8.4
12:39:56 running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'log_cache_events': 'False', 'write_json': 'True', 'partial_parse': 'True', 'cache_selected_only': 'False', 'profiles_dir': '/home/jakub/.dbt', 'debug': 'True', 'fail_fast': 'False', 'log_path': '/home/jakub/dev/src/warehouse/src/dbt/src/analytics/logs', 'version_check': 'True', 'warn_error': 'None', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'empty': 'False', 'log_format': 'default', 'invocation_command': 'dbt --debug run -s models/simple_model.sql', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'introspect': 'True', 'target_path': 'None', 'static_parser': 'True', 'send_anonymous_usage_stats': 'True'}
12:39:56 Redshift adapter: Setting redshift_connector to ERROR
12:39:56 Redshift adapter: Setting redshift_connector.core to ERROR
12:39:56 Sending event: {'category': 'dbt', 'action': 'project_id', 'label': '75bb7eaa-53c2-4886-a297-0c01023e3a25', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a08c61370>]}
12:39:56 Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': '75bb7eaa-53c2-4886-a297-0c01023e3a25', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a06951c70>]}
12:39:56 Registered adapter: redshift=1.8.1
12:39:56 checksum: 54398663eddc8e4ac172fdab397e09b2aa6984947a84e6ec7983096c3fae7b7a, vars: {}, profile: , target: , version: 1.8.4
12:39:57 Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
12:39:57 Partial parsing enabled, no changes found, skipping parsing
12:39:57 Sending event: {'category': 'dbt', 'action': 'load_project', 'label': '75bb7eaa-53c2-4886-a297-0c01023e3a25', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a04980130>]}
12:39:58 Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': '75bb7eaa-53c2-4886-a297-0c01023e3a25', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a037604c0>]}
12:39:58 Found 154 models, 4 snapshots, 1074 data tests, 1 seed, 1 operation, 77 sources, 905 macros
12:39:58 Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '75bb7eaa-53c2-4886-a297-0c01023e3a25', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a037604c0>]}
12:39:58
12:39:58 Acquiring new redshift connection 'master'
12:39:58 Acquiring new redshift connection 'list_warehouse'
12:39:58 Using redshift connection "list_warehouse"
12:39:58 On list_warehouse: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse"} */
select distinct nspname from pg_namespace
12:39:58 Opening a new connection, currently in state init
12:39:58 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:39:58 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:00 SQL status: SUCCESS in 2.211 seconds
12:40:00 On list_warehouse: Close
12:40:00 Re-using an available connection from the pool (formerly list_warehouse, now list_warehouse_snapshots)
12:40:00 Acquiring new redshift connection 'list_warehouse_analytics_jakub_stuff'
12:40:00 Acquiring new redshift connection 'list_warehouse_analytics_jakub'
12:40:00 Using redshift connection "list_warehouse_snapshots"
12:40:00 Using redshift connection "list_warehouse_analytics_jakub_stuff"
12:40:00 Using redshift connection "list_warehouse_analytics_jakub"
12:40:00 On list_warehouse_snapshots: BEGIN
12:40:00 On list_warehouse_analytics_jakub_stuff: BEGIN
12:40:00 On list_warehouse_analytics_jakub: BEGIN
12:40:00 Opening a new connection, currently in state closed
12:40:00 Opening a new connection, currently in state init
12:40:00 Opening a new connection, currently in state init
12:40:00 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:00 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:00 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:00 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:00 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:00 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:02 SQL status: SUCCESS in 1.864 seconds
12:40:02 Using redshift connection "list_warehouse_analytics_jakub_stuff"
12:40:02 On list_warehouse_analytics_jakub_stuff: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_analytics_jakub_stuff"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'analytics_jakub_stuff'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'analytics_jakub_stuff'
12:40:02 SQL status: SUCCESS in 1.917 seconds
12:40:02 Using redshift connection "list_warehouse_analytics_jakub"
12:40:02 On list_warehouse_analytics_jakub: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_analytics_jakub"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'analytics_jakub'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'analytics_jakub'
12:40:02 SQL status: SUCCESS in 1.921 seconds
12:40:02 Using redshift connection "list_warehouse_snapshots"
12:40:02 On list_warehouse_snapshots: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_snapshots"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'snapshots'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'snapshots'
12:40:02 SQL status: SUCCESS in 0.414 seconds
12:40:02 On list_warehouse_analytics_jakub_stuff: ROLLBACK
12:40:02 SQL status: SUCCESS in 0.451 seconds
12:40:02 On list_warehouse_snapshots: ROLLBACK
12:40:02 SQL status: SUCCESS in 0.459 seconds
12:40:02 On list_warehouse_analytics_jakub: ROLLBACK
12:40:03 On list_warehouse_analytics_jakub_stuff: Close
12:40:03 On list_warehouse_snapshots: Close
12:40:03 On list_warehouse_analytics_jakub: Close
12:40:03 Using redshift connection "master"
12:40:03 On master: BEGIN
12:40:03 Opening a new connection, currently in state init
12:40:03 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:03 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:05 SQL status: SUCCESS in 1.861 seconds
12:40:05 Using redshift connection "master"
12:40:05 On master: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "master"} */
with
relation as (
select
pg_class.oid as relation_id,
pg_class.relname as relation_name,
pg_class.relnamespace as schema_id,
pg_namespace.nspname as schema_name,
pg_class.relkind as relation_type
from pg_class
join pg_namespace
on pg_class.relnamespace = pg_namespace.oid
where pg_namespace.nspname != 'information_schema'
and pg_namespace.nspname not like 'pg\_%'
),
dependency as (
select distinct
coalesce(pg_rewrite.ev_class, pg_depend.objid) as dep_relation_id,
pg_depend.refobjid as ref_relation_id,
pg_depend.refclassid as ref_class_id
from pg_depend
left join pg_rewrite
on pg_depend.objid = pg_rewrite.oid
where coalesce(pg_rewrite.ev_class, pg_depend.objid) != pg_depend.refobjid
)
select distinct
dep.schema_name as dependent_schema,
dep.relation_name as dependent_name,
ref.schema_name as referenced_schema,
ref.relation_name as referenced_name
from dependency
join relation ref
on dependency.ref_relation_id = ref.relation_id
join relation dep
on dependency.dep_relation_id = dep.relation_id
12:40:05 SQL status: SUCCESS in 0.535 seconds
12:40:05 Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '75bb7eaa-53c2-4886-a297-0c01023e3a25', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a03760310>]}
12:40:05 On master: ROLLBACK
12:40:06 Using redshift connection "master"
12:40:06 On master: BEGIN
12:40:06 SQL status: SUCCESS in 0.183 seconds
12:40:06 On master: COMMIT
12:40:06 Using redshift connection "master"
12:40:06 On master: COMMIT
12:40:06 SQL status: SUCCESS in 0.365 seconds
12:40:06 On master: Close
12:40:06 Concurrency: 8 threads (target='dev')
12:40:06
12:40:06 Began running node model.analytics.simple_model
12:40:06 1 of 1 START sql table model analytics_jakub.simple_model ...................... [RUN]
12:40:06 Re-using an available connection from the pool (formerly list_warehouse_analytics_jakub, now model.analytics.simple_model)
12:40:06 Began compiling node model.analytics.simple_model
12:40:06 Writing injected SQL for node "model.analytics.simple_model"
12:40:06 Began executing node model.analytics.simple_model
12:40:06 Writing runtime sql for node "model.analytics.simple_model"
12:40:06 Using redshift connection "model.analytics.simple_model"
12:40:06 On model.analytics.simple_model: BEGIN
12:40:06 Opening a new connection, currently in state closed
12:40:06 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:06 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:08 SQL status: SUCCESS in 1.907 seconds
12:40:08 Using redshift connection "model.analytics.simple_model"
12:40:08 On model.analytics.simple_model: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "node_id": "model.analytics.simple_model"} */
create table
"warehouse"."analytics_jakub"."simple_model__dbt_tmp"
as (
select 1 as dim
);
12:40:11 SQL status: SUCCESS in 2.365 seconds
12:40:11 Using redshift connection "model.analytics.simple_model"
12:40:11 On model.analytics.simple_model: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "node_id": "model.analytics.simple_model"} */
alter table "warehouse"."analytics_jakub"."simple_model" rename to "simple_model__dbt_backup"
12:40:12 SQL status: SUCCESS in 1.291 seconds
12:40:12 Using redshift connection "model.analytics.simple_model"
12:40:12 On model.analytics.simple_model: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "node_id": "model.analytics.simple_model"} */
alter table "warehouse"."analytics_jakub"."simple_model__dbt_tmp" rename to "simple_model"
12:40:13 SQL status: SUCCESS in 0.795 seconds
12:40:13 On model.analytics.simple_model: COMMIT
12:40:13 Using redshift connection "model.analytics.simple_model"
12:40:13 On model.analytics.simple_model: COMMIT
12:40:14 SQL status: SUCCESS in 1.662 seconds
12:40:14 Using redshift connection "model.analytics.simple_model"
12:40:14 On model.analytics.simple_model: BEGIN
12:40:15 SQL status: SUCCESS in 0.400 seconds
12:40:15 Applying DROP to: "warehouse"."analytics_jakub"."simple_model__dbt_backup"
12:40:15 Using redshift connection "model.analytics.simple_model"
12:40:15 On model.analytics.simple_model: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "node_id": "model.analytics.simple_model"} */
drop table if exists "warehouse"."analytics_jakub"."simple_model__dbt_backup" cascade
12:40:15 SQL status: SUCCESS in 0.399 seconds
12:40:15 On model.analytics.simple_model: COMMIT
12:40:15 Using redshift connection "model.analytics.simple_model"
12:40:15 On model.analytics.simple_model: COMMIT
12:40:17 SQL status: SUCCESS in 2.124 seconds
12:40:17 Using redshift connection "model.analytics.simple_model"
12:40:17 On model.analytics.simple_model: BEGIN
12:40:17 SQL status: SUCCESS in 0.181 seconds
12:40:17 On model.analytics.simple_model: ROLLBACK
12:40:18 On model.analytics.simple_model: Close
12:40:18 Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '75bb7eaa-53c2-4886-a297-0c01023e3a25', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a0485a1c0>]}
12:40:18 1 of 1 OK created sql table model analytics_jakub.simple_model ................. [SUCCESS in 11.62s]
12:40:18 Finished running node model.analytics.simple_model
12:40:18 Using redshift connection "master"
12:40:18 On master: BEGIN
12:40:18 Opening a new connection, currently in state closed
12:40:18 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:18 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:20 SQL status: SUCCESS in 1.909 seconds
12:40:20 On master: COMMIT
12:40:20 Using redshift connection "master"
12:40:20 On master: COMMIT
12:40:20 SQL status: SUCCESS in 0.364 seconds
12:40:20
12:40:20 Running 1 on-run-end hook
12:40:20 Writing injected SQL for node "operation.analytics.analytics-on-run-end-0"
12:40:20 1 of 1 START hook: analytics.on-run-end.0 ...................................... [RUN]
12:40:20 1 of 1 OK hook: analytics.on-run-end.0 ......................................... [OK in 0.00s]
12:40:20
12:40:20 On master: Close
12:40:20 Connection 'master' was properly closed.
12:40:20 Connection 'list_warehouse_snapshots' was properly closed.
12:40:20 Connection 'list_warehouse_analytics_jakub_stuff' was properly closed.
12:40:20 Connection 'model.analytics.simple_model' was properly closed.
12:40:20
12:40:20 Finished running 1 table model, 1 project hook in 0 hours 0 minutes and 22.32 seconds (22.32s).
12:40:20 Command end result
12:40:21
12:40:21 Completed successfully
12:40:21
12:40:21 Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1
12:40:21 Resource report: {"command_name": "run", "command_success": true, "command_wall_clock_time": 24.687197, "process_user_time": 3.904314, "process_kernel_time": 0.1623, "process_mem_max_rss": "151160", "process_out_blocks": "24568", "process_in_blocks": "0"}
12:40:21 Command `dbt run` succeeded at 14:40:21.073347 after 24.69 seconds
12:40:21 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a08e99940>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a075aee80>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7f4a06956580>]}
12:40:21 Flushing usage events
trying again
12:40:23 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa54604ca60>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa54472fd60>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa54472fdf0>]}
12:40:23 Running with dbt=1.8.4
12:40:23 running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'write_json': 'True', 'log_cache_events': 'False', 'partial_parse': 'True', 'cache_selected_only': 'False', 'profiles_dir': '/home/jakub/.dbt', 'fail_fast': 'False', 'version_check': 'True', 'log_path': '/home/jakub/dev/src/warehouse/src/dbt/src/analytics/logs', 'debug': 'True', 'warn_error': 'None', 'use_colors': 'True', 'use_experimental_parser': 'False', 'no_print': 'None', 'quiet': 'False', 'empty': 'False', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'static_parser': 'True', 'log_format': 'default', 'invocation_command': 'dbt --debug run -s models/simple_model.sql', 'target_path': 'None', 'introspect': 'True', 'send_anonymous_usage_stats': 'True'}
12:40:23 Redshift adapter: Setting redshift_connector to ERROR
12:40:23 Redshift adapter: Setting redshift_connector.core to ERROR
12:40:23 Sending event: {'category': 'dbt', 'action': 'project_id', 'label': '6739be15-36d4-4293-84fb-5b075f5137a7', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa5554330d0>]}
12:40:23 Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': '6739be15-36d4-4293-84fb-5b075f5137a7', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa546b46b50>]}
12:40:23 Registered adapter: redshift=1.8.1
12:40:23 checksum: 54398663eddc8e4ac172fdab397e09b2aa6984947a84e6ec7983096c3fae7b7a, vars: {}, profile: , target: , version: 1.8.4
12:40:24 Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
12:40:24 Partial parsing enabled, no changes found, skipping parsing
12:40:24 Sending event: {'category': 'dbt', 'action': 'load_project', 'label': '6739be15-36d4-4293-84fb-5b075f5137a7', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa5419ff130>]}
12:40:25 Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': '6739be15-36d4-4293-84fb-5b075f5137a7', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa54357c2b0>]}
12:40:25 Found 154 models, 4 snapshots, 1074 data tests, 1 seed, 1 operation, 77 sources, 905 macros
12:40:25 Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '6739be15-36d4-4293-84fb-5b075f5137a7', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa54357c2b0>]}
12:40:25
12:40:25 Acquiring new redshift connection 'master'
12:40:25 Acquiring new redshift connection 'list_warehouse'
12:40:25 Using redshift connection "list_warehouse"
12:40:25 On list_warehouse: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse"} */
select distinct nspname from pg_namespace
12:40:25 Opening a new connection, currently in state init
12:40:25 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:25 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:27 SQL status: SUCCESS in 2.212 seconds
12:40:27 On list_warehouse: Close
12:40:27 Re-using an available connection from the pool (formerly list_warehouse, now list_warehouse_analytics_jakub_stuff)
12:40:27 Acquiring new redshift connection 'list_warehouse_analytics_jakub'
12:40:27 Acquiring new redshift connection 'list_warehouse_snapshots'
12:40:27 Using redshift connection "list_warehouse_analytics_jakub_stuff"
12:40:27 Using redshift connection "list_warehouse_analytics_jakub"
12:40:27 Using redshift connection "list_warehouse_snapshots"
12:40:27 On list_warehouse_analytics_jakub_stuff: BEGIN
12:40:27 On list_warehouse_analytics_jakub: BEGIN
12:40:27 On list_warehouse_snapshots: BEGIN
12:40:27 Opening a new connection, currently in state closed
12:40:27 Opening a new connection, currently in state init
12:40:27 Opening a new connection, currently in state init
12:40:27 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:27 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:27 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:27 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:27 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:27 Redshift adapter: Connecting to redshift with username/password based auth...
^C12:40:29 SQL status: SUCCESS in 1.897 seconds
12:40:29 Using redshift connection "list_warehouse_analytics_jakub_stuff"
12:40:29 On list_warehouse_analytics_jakub_stuff: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_analytics_jakub_stuff"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'analytics_jakub_stuff'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'analytics_jakub_stuff'
12:40:29 SQL status: SUCCESS in 1.902 seconds
12:40:29 Using redshift connection "list_warehouse_snapshots"
12:40:29 On list_warehouse_snapshots: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_snapshots"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'snapshots'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'snapshots'
12:40:29 SQL status: SUCCESS in 1.929 seconds
12:40:29 Using redshift connection "list_warehouse_analytics_jakub"
12:40:29 On list_warehouse_analytics_jakub: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_analytics_jakub"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'analytics_jakub'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'analytics_jakub'
12:40:29 SQL status: SUCCESS in 0.433 seconds
12:40:29 On list_warehouse_analytics_jakub: ROLLBACK
12:40:30 On list_warehouse_analytics_jakub: Close
12:40:30 SQL status: SUCCESS in 0.837 seconds
12:40:30 On list_warehouse_snapshots: ROLLBACK
12:40:30 SQL status: SUCCESS in 0.854 seconds
12:40:30 On list_warehouse_analytics_jakub_stuff: ROLLBACK
12:40:30 On list_warehouse_snapshots: Close
12:40:30 On list_warehouse_analytics_jakub_stuff: Close
12:40:30 Connection 'master' was properly closed.
12:40:30 Connection 'list_warehouse_analytics_jakub_stuff' was properly closed.
12:40:30 Connection 'list_warehouse_analytics_jakub' was properly closed.
12:40:30 Connection 'list_warehouse_snapshots' was properly closed.
12:40:30
12:40:30 Finished running in 0 hours 0 minutes and 5.40 seconds (5.40s).
12:40:30 Encountered an error:
12:40:30 Traceback (most recent call last):
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/cli/requires.py", line 138, in wrapper
result, success = func(*args, **kwargs)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/cli/requires.py", line 101, in wrapper
return func(*args, **kwargs)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/cli/requires.py", line 218, in wrapper
return func(*args, **kwargs)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/cli/requires.py", line 247, in wrapper
return func(*args, **kwargs)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/cli/requires.py", line 294, in wrapper
return func(*args, **kwargs)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/cli/requires.py", line 332, in wrapper
return func(*args, **kwargs)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/cli/main.py", line 569, in run
results = task.run()
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/task/runnable.py", line 526, in run
result = self.execute_with_hooks(selected_uids)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/task/runnable.py", line 486, in execute_with_hooks
self.before_run(adapter, selected_uids)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/task/run.py", line 460, in before_run
self.populate_adapter_cache(adapter, required_schemas)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/task/runnable.py", line 464, in populate_adapter_cache
adapter.set_relations_cache(cachable_nodes)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/adapters/base/impl.py", line 529, in set_relations_cache
self._relations_cache_for_schemas(relation_configs, required_schemas)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/adapters/redshift/impl.py", line 167, in _relations_cache_for_schemas
super()._relations_cache_for_schemas(manifest, cache_schemas)
File "/home/jakub/dev/src/warehouse/src/dbt/src/analytics/.venv/lib/python3.9/site-packages/dbt/adapters/base/impl.py", line 502, in _relations_cache_for_schemas
for future in as_completed(futures):
File "/home/jakub/.pyenv/versions/3.9.15/lib/python3.9/concurrent/futures/_base.py", line 245, in as_completed
waiter.event.wait(wait_timeout)
File "/home/jakub/.pyenv/versions/3.9.15/lib/python3.9/threading.py", line 581, in wait
signaled = self._cond.wait(timeout)
File "/home/jakub/.pyenv/versions/3.9.15/lib/python3.9/threading.py", line 312, in wait
waiter.acquire()
KeyboardInterrupt
12:40:30 Resource report: {"command_name": "run", "command_wall_clock_time": 7.313435, "process_user_time": 3.114096, "process_kernel_time": 0.134273, "process_mem_max_rss": "138696", "process_out_blocks": "14704", "command_success": false, "process_in_blocks": "0"}
12:40:30 Command `dbt run` failed at 14:40:30.593596 after 7.31 seconds
12:40:30 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa54604ca60>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa5408e9520>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fa5408e0790>]}
12:40:30 Flushing usage events
Logs from dbt --debug docs generate -s models/simple_model.sql:
12:40:10 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'start', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd5bb49a00>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd5a22cd30>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd5a22cd90>]}
12:40:10 Running with dbt=1.8.4
12:40:10 running dbt with arguments {'printer_width': '80', 'indirect_selection': 'eager', 'log_cache_events': 'False', 'write_json': 'True', 'partial_parse': 'True', 'cache_selected_only': 'False', 'profiles_dir': '/home/jakub/.dbt', 'version_check': 'True', 'fail_fast': 'False', 'log_path': '/home/jakub/dev/src/warehouse/src/dbt/src/analytics/logs', 'warn_error': 'None', 'debug': 'True', 'use_colors': 'True', 'use_experimental_parser': 'False', 'empty': 'None', 'quiet': 'False', 'no_print': 'None', 'log_format': 'default', 'invocation_command': 'dbt --debug docs generate -s models/simple_model.sql', 'static_parser': 'True', 'introspect': 'True', 'target_path': 'None', 'warn_error_options': 'WarnErrorOptions(include=[], exclude=[])', 'send_anonymous_usage_stats': 'True'}
12:40:10 Redshift adapter: Setting redshift_connector to ERROR
12:40:10 Redshift adapter: Setting redshift_connector.core to ERROR
12:40:11 Sending event: {'category': 'dbt', 'action': 'project_id', 'label': '725cf8f0-6c1e-44e3-99ca-6ab888f13e57', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd59587640>]}
12:40:11 Sending event: {'category': 'dbt', 'action': 'adapter_info', 'label': '725cf8f0-6c1e-44e3-99ca-6ab888f13e57', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd59c26d00>]}
12:40:11 Registered adapter: redshift=1.8.1
12:40:11 checksum: 54398663eddc8e4ac172fdab397e09b2aa6984947a84e6ec7983096c3fae7b7a, vars: {}, profile: , target: , version: 1.8.4
12:40:11 Partial parsing enabled: 0 files deleted, 0 files added, 0 files changed.
12:40:11 Partial parsing enabled, no changes found, skipping parsing
12:40:11 Sending event: {'category': 'dbt', 'action': 'load_project', 'label': '725cf8f0-6c1e-44e3-99ca-6ab888f13e57', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd57586eb0>]}
12:40:11 Sending event: {'category': 'dbt', 'action': 'resource_counts', 'label': '725cf8f0-6c1e-44e3-99ca-6ab888f13e57', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd565ae520>]}
12:40:11 Found 154 models, 4 snapshots, 1074 data tests, 1 seed, 1 operation, 77 sources, 905 macros
12:40:11 Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '725cf8f0-6c1e-44e3-99ca-6ab888f13e57', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd565ae790>]}
12:40:12
12:40:12 Acquiring new redshift connection 'master'
12:40:12 Acquiring new redshift connection 'list_warehouse_analytics_jakub_stuff'
12:40:12 Acquiring new redshift connection 'list_warehouse_snapshots'
12:40:12 Acquiring new redshift connection 'list_warehouse_analytics_jakub'
12:40:12 Using redshift connection "list_warehouse_analytics_jakub_stuff"
12:40:12 Using redshift connection "list_warehouse_snapshots"
12:40:12 Using redshift connection "list_warehouse_analytics_jakub"
12:40:12 On list_warehouse_analytics_jakub_stuff: BEGIN
12:40:12 On list_warehouse_snapshots: BEGIN
12:40:12 On list_warehouse_analytics_jakub: BEGIN
12:40:12 Opening a new connection, currently in state init
12:40:12 Opening a new connection, currently in state init
12:40:12 Opening a new connection, currently in state init
12:40:12 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:12 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:12 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:12 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:12 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:12 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:14 SQL status: SUCCESS in 2.066 seconds
12:40:14 Using redshift connection "list_warehouse_snapshots"
12:40:14 On list_warehouse_snapshots: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_snapshots"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'snapshots'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'snapshots'
12:40:14 SQL status: SUCCESS in 2.101 seconds
12:40:14 Using redshift connection "list_warehouse_analytics_jakub"
12:40:14 On list_warehouse_analytics_jakub: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_analytics_jakub"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'analytics_jakub'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'analytics_jakub'
12:40:14 SQL status: SUCCESS in 2.127 seconds
12:40:14 Using redshift connection "list_warehouse_analytics_jakub_stuff"
12:40:14 On list_warehouse_analytics_jakub_stuff: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "list_warehouse_analytics_jakub_stuff"} */
select
table_catalog as database,
table_name as name,
table_schema as schema,
'table' as type
from information_schema.tables
where table_schema ilike 'analytics_jakub_stuff'
and table_type = 'BASE TABLE'
union all
select
table_catalog as database,
table_name as name,
table_schema as schema,
case
when view_definition ilike '%create materialized view%'
then 'materialized_view'
else 'view'
end as type
from information_schema.views
where table_schema ilike 'analytics_jakub_stuff'
12:40:14 SQL status: SUCCESS in 0.619 seconds
12:40:14 On list_warehouse_snapshots: ROLLBACK
12:40:15 SQL status: SUCCESS in 0.761 seconds
12:40:15 On list_warehouse_analytics_jakub: ROLLBACK
12:40:15 SQL status: SUCCESS in 0.830 seconds
12:40:15 On list_warehouse_analytics_jakub_stuff: ROLLBACK
12:40:15 On list_warehouse_snapshots: Close
12:40:15 On list_warehouse_analytics_jakub: Close
12:40:15 On list_warehouse_analytics_jakub_stuff: Close
12:40:15 Using redshift connection "master"
12:40:15 On master: BEGIN
12:40:15 Opening a new connection, currently in state init
12:40:15 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:15 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:17 SQL status: SUCCESS in 1.935 seconds
12:40:17 Using redshift connection "master"
12:40:17 On master: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "master"} */
with
relation as (
select
pg_class.oid as relation_id,
pg_class.relname as relation_name,
pg_class.relnamespace as schema_id,
pg_namespace.nspname as schema_name,
pg_class.relkind as relation_type
from pg_class
join pg_namespace
on pg_class.relnamespace = pg_namespace.oid
where pg_namespace.nspname != 'information_schema'
and pg_namespace.nspname not like 'pg\_%'
),
dependency as (
select distinct
coalesce(pg_rewrite.ev_class, pg_depend.objid) as dep_relation_id,
pg_depend.refobjid as ref_relation_id,
pg_depend.refclassid as ref_class_id
from pg_depend
left join pg_rewrite
on pg_depend.objid = pg_rewrite.oid
where coalesce(pg_rewrite.ev_class, pg_depend.objid) != pg_depend.refobjid
)
select distinct
dep.schema_name as dependent_schema,
dep.relation_name as dependent_name,
ref.schema_name as referenced_schema,
ref.relation_name as referenced_name
from dependency
join relation ref
on dependency.ref_relation_id = ref.relation_id
join relation dep
on dependency.dep_relation_id = dep.relation_id
12:40:18 SQL status: SUCCESS in 1.030 seconds
12:40:18 Sending event: {'category': 'dbt', 'action': 'runnable_timing', 'label': '725cf8f0-6c1e-44e3-99ca-6ab888f13e57', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd565ae790>]}
12:40:18 On master: ROLLBACK
12:40:18 On master: Close
12:40:18 Concurrency: 8 threads (target='dev')
12:40:18
12:40:18 Began running node model.analytics.simple_model
12:40:18 Re-using an available connection from the pool (formerly list_warehouse_analytics_jakub_stuff, now model.analytics.simple_model)
12:40:18 Began compiling node model.analytics.simple_model
12:40:18 Writing injected SQL for node "model.analytics.simple_model"
12:40:18 Began executing node model.analytics.simple_model
12:40:18 Finished running node model.analytics.simple_model
12:40:18 Connection 'master' was properly closed.
12:40:18 Connection 'model.analytics.simple_model' was properly closed.
12:40:18 Connection 'list_warehouse_snapshots' was properly closed.
12:40:18 Connection 'list_warehouse_analytics_jakub' was properly closed.
12:40:18 Command end result
12:40:19 Compiled node 'simple_model' is:
select 1 as dim
12:40:19 Acquiring new redshift connection 'generate_catalog'
12:40:19 Building catalog
12:40:19 Acquiring new redshift connection 'warehouse.information_schema'
12:40:19 Using redshift connection "warehouse.information_schema"
12:40:19 On warehouse.information_schema: BEGIN
12:40:19 Opening a new connection, currently in state init
12:40:19 Redshift adapter: Establishing connection using ssl with `sslmode` set to 'prefer'. To connect without ssl, set `sslmode` to 'disable'.
12:40:19 Redshift adapter: Connecting to redshift with username/password based auth...
12:40:21 SQL status: SUCCESS in 1.875 seconds
12:40:21 Using redshift connection "warehouse.information_schema"
12:40:21 On warehouse.information_schema: /* {"app": "dbt", "dbt_version": "1.8.4", "profile_name": "analytics", "target_name": "dev", "connection_name": "warehouse.information_schema"} */
with
late_binding as (
select
table_schema,
table_name,
'LATE BINDING VIEW'::varchar as table_type,
null::text as table_comment,
column_name,
column_index,
column_type,
null::text as column_comment
from pg_get_late_binding_view_cols()
cols(
table_schema name,
table_name name,
column_name name,
column_type varchar,
column_index int
)
where ((
upper(table_schema) = upper('analytics_jakub')
and upper(table_name) = upper('simple_model')
))
),
early_binding as (
select
sch.nspname as table_schema,
tbl.relname as table_name,
case
when tbl.relkind = 'v' and mat_views.table_name is not null then 'MATERIALIZED VIEW'
when tbl.relkind = 'v' then 'VIEW'
else 'BASE TABLE'
end as table_type,
tbl_desc.description as table_comment,
col.attname as column_name,
col.attnum as column_index,
pg_catalog.format_type(col.atttypid, col.atttypmod) as column_type,
col_desc.description as column_comment
from pg_catalog.pg_namespace sch
join pg_catalog.pg_class tbl
on tbl.relnamespace = sch.oid
join pg_catalog.pg_attribute col
on col.attrelid = tbl.oid
left outer join pg_catalog.pg_description tbl_desc
on tbl_desc.objoid = tbl.oid
and tbl_desc.objsubid = 0
left outer join pg_catalog.pg_description col_desc
on col_desc.objoid = tbl.oid
and col_desc.objsubid = col.attnum
left outer join information_schema.views mat_views
on mat_views.table_schema = sch.nspname
and mat_views.table_name = tbl.relname
and mat_views.view_definition ilike '%create materialized view%'
and mat_views.table_catalog = 'warehouse'
where tbl.relkind in ('r', 'v', 'f', 'p')
and col.attnum > 0
and not col.attisdropped
and ((
upper(sch.nspname) = upper('analytics_jakub')
and upper(tbl.relname) = upper('simple_model')
))
),
unioned as (select * from early_binding union all select * from late_binding),
table_owners as (
select
schemaname as table_schema,
tablename as table_name,
tableowner as table_owner
from pg_tables
union all
select
schemaname as table_schema,
viewname as table_name,
viewowner as table_owner
from pg_views
)
select 'warehouse' as table_database, *
from unioned
join table_owners using (table_schema, table_name)
order by "column_index"
12:40:25 Redshift adapter: Redshift error: could not open relation with OID 155701979
12:40:25 On warehouse.information_schema: ROLLBACK
12:40:26 Redshift adapter: Error running SQL: macro get_catalog_relations
12:40:26 Redshift adapter: Rolling back transaction.
12:40:26 On warehouse.information_schema: Close
12:40:26 Encountered an error while generating catalog: Database Error
could not open relation with OID 155701979
12:40:26 dbt encountered 1 failure while writing the catalog
12:40:26 Catalog written to /home/jakub/dev/src/warehouse/src/dbt/src/analytics/target/catalog.json
12:40:26 Resource report: {"command_name": "generate", "command_wall_clock_time": 15.7936, "process_user_time": 3.748757, "process_kernel_time": 0.189511, "process_mem_max_rss": "146840", "process_out_blocks": "27440", "command_success": false, "process_in_blocks": "0"}
12:40:26 Command `dbt docs generate` failed at 14:40:26.534113 after 15.80 seconds
12:40:26 Connection 'generate_catalog' was properly closed.
12:40:26 Connection 'warehouse.information_schema' was properly closed.
12:40:26 Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd5bb49a00>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd5c569e80>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x7fcd5b7bdbe0>]}
12:40:26 Flushing usage events
Environment
Local environment:
- OS: RHEL 9.4
- Python: 3.9.15
- dbt-core: 1.8.4
- dbt-redshift: 1.8.1
DBT Cloud:
- Deployment type: Production
- dbt versions: 1.6, 1.7, and Versionless
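As additional context, the retry behavior requested under Expected Behavior could look something like the sketch below. This is a hypothetical wrapper, not dbt's actual internals: the `run_with_retry` helper, the `query_fn` callable, and the error-matching pattern are all assumptions based on the error text in the logs above ("could not open relation with OID ...").

```python
import re
import time

# Redshift reports this when a relation referenced by the catalog query is
# dropped mid-query (e.g. by a concurrent `dbt run`). Pattern is an
# assumption based on the error text observed in the logs.
TRANSIENT_PATTERN = re.compile(r"could not open relation with OID \d+")

def run_with_retry(query_fn, max_attempts=3, delay_seconds=1.0):
    """Run query_fn, retrying when Redshift reports a dropped relation.

    query_fn is any zero-argument callable that executes the catalog query
    and returns its result; it stands in for whatever the adapter actually
    calls when building the catalog.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return query_fn()
        except Exception as exc:
            transient = TRANSIENT_PATTERN.search(str(exc)) is not None
            if not transient or attempt == max_attempts:
                # Non-transient errors, or exhausted retries: re-raise.
                raise
            time.sleep(delay_seconds)
```

Under this approach, the `docs generate` catalog query in the logs above would be re-issued a couple of times before the command fails, which would mask the race against concurrent `dbt run` invocations dropping transient tables.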
Additional Context
pip freeze: