Implement relations api #727
Conversation
…implement-relations-api
dbt/adapters/redshift/impl.py (outdated diff)
```diff
-def drop(cls, profile, schema, relation, relation_type, model_name=None):
+def drop_relation(cls, profile, relation, model_name=None):
     """
     In Redshift, DROP TABLE cannot be used inside a transaction.
```
nbd, but `drop table ...` *can* be used inside of a transaction; it cannot be used inside a transaction if executed against an external table (i.e. Spectrum).

This drop lock exists to prevent two `drop ... cascade`s from running at the same time. If `view_x` depends on tables `model_a` and `model_b`, and the following two statements run concurrently, Redshift will fire a `table was dropped by a concurrent transaction` error:

```sql
-- via model_a.sql
drop table model_a cascade;
-- via model_b.sql
drop table model_b cascade;
```

Presumably because dropping `model_a` caused `view_x` to be dropped while `model_b` was trying to drop it.
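The drop lock discussed above can be sketched as a process-wide mutex around cascading drops. This is an illustrative reconstruction under stated assumptions, not dbt's actual implementation; the lock name and helper function are hypothetical:

```python
import threading

# Hypothetical process-wide lock serializing `drop ... cascade` statements,
# so two models cannot concurrently drop a shared dependent view.
DROP_LOCK = threading.Lock()

def drop_relation_cascade(cursor, relation_name):
    """Serialize cascading drops to avoid Redshift's
    'table was dropped by a concurrent transaction' error."""
    with DROP_LOCK:
        cursor.execute(
            'drop table if exists {} cascade'.format(relation_name))
```

With the lock held, the second model's `drop ... cascade` only starts after the first has fully committed, so the dependent view is never dropped out from under it.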
yes, good memory. i knew this wasn't quite right as i was writing it, i'll fix it. thanks
…cs/dbt into quote-config
```python
logging.getLogger('urllib3').setLevel(logging.INFO)
logging.getLogger('google').setLevel(logging.INFO)
logging.getLogger('snowflake.connector').setLevel(logging.INFO)
logging.getLogger('parsedatetime').setLevel(logging.INFO)
```
@drewbanin i'm confused as to how these got in here...?
i tried to quiet down the verbosity of logs in the `logs/dbt.log` file. Tristan saw debug logs in his console -- I wonder if that's related?
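One way debug output can still reach the console is if only logger levels, not handler levels, are set. A minimal sketch of the usual pattern, keeping DEBUG in the log file while the console stays at INFO; the handler setup here is an assumption for illustration, not dbt's actual logging config:

```python
import logging

logger = logging.getLogger('dbt')
logger.setLevel(logging.DEBUG)   # the logger itself passes everything through

# Console handler: only INFO and above reach the user's terminal
console = logging.StreamHandler()
console.setLevel(logging.INFO)
logger.addHandler(console)

# Quiet chatty third-party libraries everywhere (file and console)
for noisy in ('urllib3', 'google', 'snowflake.connector', 'parsedatetime'):
    logging.getLogger(noisy).setLevel(logging.INFO)
```

The key distinction: a logger's level gates what enters the logging system, while each handler's level gates what that destination actually emits.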
@cmcarthur there's a lot in here, but everything looked reasonable to me. I think we can go ahead and merge this, then stress test it like crazy in development. Sounds like some folks in the community are willing to help us test too :)
```makefile
	@echo "Changed test files:"
	@echo "${changed_tests}"
	@docker-compose run test /usr/src/app/test/runner.sh ${changed_tests}
```
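The Makefile target above assumes a `changed_tests` variable has already been computed. A hypothetical sketch of how such a list might be selected from a set of changed paths (the paths and filtering rule are illustrative, not the project's actual logic):

```shell
# Illustrative input: a space-separated list of changed files
changed="dbt/utils.py test/unit/test_graph.py test/integration/001_simple_copy_test/test_simple_copy.py"

# Keep only paths under test/
changed_tests=""
for f in $changed; do
  case "$f" in
    test/*) changed_tests="$changed_tests $f" ;;
  esac
done

echo "Changed test files:"
echo "$changed_tests"
```

In practice the input list would typically come from something like `git diff --name-only` against the base branch, rather than a hard-coded string.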
this is a good change :)
```python
def approximate_relation_match(target, relation):
    raise_compiler_error(
        'When searching for a relation, dbt found an approximate match. '
        'Instead of guessing \nwhich relation to use, dbt will move on. '
```
lol good change
…o model-aliasing

```
$ git merge kickstarter/feature/model-aliasing
CONFLICT (content): Merge conflict in dbt/utils.py
CONFLICT (modify/delete): dbt/include/global_project/macros/materializations/table.sql deleted in HEAD and modified in kickstarter/feature/model-aliasing. Version kickstarter/feature/model-aliasing of dbt/include/global_project/macros/materializations/table.sql left in tree.
CONFLICT (modify/delete): dbt/include/global_project/macros/materializations/bigquery.sql deleted in HEAD and modified in kickstarter/feature/model-aliasing. Version kickstarter/feature/model-aliasing of dbt/include/global_project/macros/materializations/bigquery.sql left in tree.
```

1. dbt/utils.py

Some major changes are being introduced in 0.10.1: Implement relations api (dbt-labs#727) dbt-labs@5344f54#diff-196bbfafed32edaf1554550f65111f87

The Relation class was extracted into:
- ./dbt/api/object.py: `class APIObject(dict)`
- ./dbt/adapters/default/relation.py: `class DefaultRelation(APIObject)`
- ./dbt/adapters/bigquery/relation.py: `class BigQueryRelation(DefaultRelation)`
- ./dbt/adapters/snowflake/relation.py: `class SnowflakeRelation(DefaultRelation)`

Changed `node.get('name')` to `node.get('alias')` in:
- ./dbt/adapters/default/relation.py
- ./dbt/adapters/bigquery/relation.py
- ./dbt/adapters/snowflake/relation.py

2. dbt/include/global_project/macros/materializations/table.sql

This was renamed to ./dbt/include/global_project/macros/materializations/table/table.sql

3. dbt/include/global_project/macros/materializations/bigquery.sql

This was split into ./dbt/include/global_project/macros/materializations/table/bigquery_table.sql and ./dbt/include/global_project/macros/materializations/view/bigquery_view.sql

4. other instances of model['name']

The following files also mention `model['name']` and probably need to change as well:
- ./dbt/include/global_project/macros/materializations/archive/archive.sql
- ./dbt/include/global_project/macros/materializations/seed/bigquery.sql
- ./dbt/include/global_project/macros/materializations/seed/seed.sql

Added commentary to ./dbt/exceptions.py

5. further changes

Reverted `model.get('alias')` to `model.get('name')` in:
- `print_test_result_line` in ./dbt/ui/printer.py (since in this context schema is NOT being used)

Changed `model.get('name')` to `model.get('alias')` in:
- `print_seed_result_line` in ./dbt/ui/printer.py (since in this context schema is also being used)

Changed `node.get('name')` to `node.get('alias')` in:
- `_node_context` in ./dbt/node_runners.py (since in this context schema is also being used)
- `call_get_missing_columns` in ./dbt/node_runners.py (since in this context schema is also being used)
- `call_already_exists` in ./dbt/node_runners.py (since in this context schema is also being used)

6. linting

Import lines must be under 80 characters: https://www.python.org/dev/peps/pep-0328/
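The Relation extraction described in item 1 can be sketched as the following class hierarchy. Only the class names, bases, and file paths come from the commit message above; the bodies are illustrative stubs:

```python
class APIObject(dict):
    """Base class for dict-backed, schema-validated API objects
    (./dbt/api/object.py)."""
    pass

class DefaultRelation(APIObject):
    """Adapter-agnostic relation: database/schema/identifier plus
    quoting rules (./dbt/adapters/default/relation.py)."""
    pass

class BigQueryRelation(DefaultRelation):
    """BigQuery-specific overrides
    (./dbt/adapters/bigquery/relation.py)."""
    pass

class SnowflakeRelation(DefaultRelation):
    """Snowflake-specific overrides
    (./dbt/adapters/snowflake/relation.py)."""
    pass
```

Subclassing `dict` lets each relation serialize naturally while adapter subclasses override only warehouse-specific behavior such as quoting.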
automatic commit by git-black, original commits: 5344f54
WIP