Error building models as views with no schema binding #26

Open
davehowell opened this issue Jan 17, 2020 · 4 comments

@davehowell

In my dbt_project.yml I specify that all views should be created without schema binding, i.e. "late-binding views", which is recommended on Redshift to avoid dependency errors.

models:
  # Use late-binding views to avoid blocking referenced tables
  bind: false

This is because, with schema binding, a view that depends on a table-materialized model blocks dbt from dropping and recreating that table, so the run errors.
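
For context, this is roughly the DDL dbt renders for a Redshift view model when bind: false is set (the schema, view, and table names below are illustrative, not from this package):

    -- Hypothetical rendered DDL with bind: false; the trailing clause makes
    -- this a late-binding view, so it no longer blocks dropping and
    -- recreating the tables it references.
    create view analytics.my_view as (
        select * from analytics.my_upstream_table
    ) with no schema binding;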

The problem here is that the views in https://github.com/fishtown-analytics/redshift/tree/master/models/views do not schema-qualify the relations referenced in the underlying ephemeral models. Late-binding views require fully qualified references, and system tables like stl_wlm_query live in the pg_catalog schema, so fixing this is as simple as qualifying those references, e.g. pg_catalog.stl_wlm_query.
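
Sketched against a hypothetical ephemeral model (column list shortened), the proposed fix is just the schema prefix:

    -- Unqualified reference: fails inside a view created with no schema
    -- binding, because late-binding views require fully qualified names.
    select query, service_class, total_queue_time
    from stl_wlm_query;

    -- Qualified with the catalog schema, as proposed:
    select query, service_class, total_queue_time
    from pg_catalog.stl_wlm_query;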

This is similar to the issue described in https://stackoverflow.com/a/54447584/1335793.

@drewbanin
Contributor

Thanks @davehowell - are you able to send through a PR for this fix?

@davehowell
Author

@drewbanin
Sure. The README doesn't have any info for developers; is there something you can point me to?
So far I've discovered that I need a redshift_package profile, so I added one to my profiles.yml pointing at my Redshift cluster.

If I dbt run the package as-is, it builds the views successfully.
If I modify dbt_project.yml with a global bind: false, it reproduces the failures.

I noticed that some of the macros already include the pg_catalog. schema prefix, but it's inconsistent. I don't see any tests for those macros, so I'm not sure I should touch them.

@clrcrl
Contributor

clrcrl commented Jan 22, 2020

This Discourse article outlines the best workflow for contributing to packages: https://discourse.getdbt.com/t/contributing-to-a-dbt-package/657

@davehowell
Author

davehowell commented Jan 23, 2020

PR added.

I also looked at the macros that are missing schema qualification. Some are referenced in the view models and so are indirectly tested that way, but many have no existing tests. There is no embedded "integration_tests" project in this package as there is in dbt-utils, and I think it would need one to have any confidence in macro changes, so I haven't touched them.

Some other macros have AWS dependencies; unload, for example, requires an S3 bucket and an IAM role (or an access key/secret key), so testing it is harder without getting CI to run it on Fishtown's infrastructure.
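
For reference, the statement such a macro ultimately issues looks roughly like this (the bucket and role ARN are placeholders), which is why it can't run without real AWS resources:

    -- Sketch of a Redshift UNLOAD; the S3 bucket and IAM role below are
    -- placeholders that a test environment would have to provide.
    unload ('select * from analytics.some_table')
    to 's3://example-bucket/exports/some_table_'
    iam_role 'arn:aws:iam::123456789012:role/example-redshift-role';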

I might look at adding a barebones integration_tests project later, but this works for now.
