Upgrade 0.18.0 release plus tests #105

Merged · 3 commits · Sep 16, 2020
2 changes: 1 addition & 1 deletion .bumpversion-dbt.cfg
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 0.18.0rc1
+current_version = 0.18.0
 parse = (?P<major>\d+)
     \.(?P<minor>\d+)
     \.(?P<patch>\d+)
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 0.18.0rc1
+current_version = 0.18.0
 parse = (?P<major>\d+)
     \.(?P<minor>\d+)
     \.(?P<patch>\d+)
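
The parse value in both bumpversion configs is a Python regex with named groups that bumpversion uses to split current_version into bump-able parts; the diff shows only the first three groups. A minimal sketch of how that pattern reads the new version string (reassembled from the visible groups only — the full config likely also defines an optional prerelease group for suffixes such as "rc1"):

    import re

    # Only the three named groups visible in the diff; the full pattern
    # presumably adds an optional prerelease group -- this is an assumption.
    pattern = re.compile(r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)")

    print(pattern.match("0.18.0").groupdict())
    # {'major': '0', 'minor': '18', 'patch': '0'}

    # "0.18.0rc1" still matches on its numeric prefix; capturing the "rc1"
    # suffix would require that extra optional group.
    print(pattern.match("0.18.0rc1").groupdict())
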
17 changes: 6 additions & 11 deletions .circleci/config.yml
@@ -2,21 +2,19 @@ version: 2.1

 jobs:
   unit:
+    environment:
+      DBT_INVOCATION_ENV: circle
     docker:
-      - image: fishtownanalytics/test-container:6
-        environment:
-          DBT_INVOCATION_ENV: circle
+      - image: fishtownanalytics/test-container:9
     steps:
       - checkout
       - run: tox -e flake8,unit
 
   integration-spark2:
+    environment:
+      DBT_INVOCATION_ENV: circle
     docker:
-
-      - image: fishtownanalytics/test-container:6
-        environment:
-          DBT_INVOCATION_ENV: circle
-
+      - image: fishtownanalytics/test-container:9
       - image: godatadriven/spark:2
         environment:
           WAIT_FOR: localhost:5432
@@ -48,9 +46,6 @@ jobs:
           name: Run integration tests
           command: tox -e integration-spark-thrift
           no_output_timeout: 1h
-          environment:
-            DBT_PROFILES_DIR: /home/dbt_test_user/project/test/integration/
-
       - store_artifacts:
           path: ./logs

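
Two related cleanups in this CI config: DBT_INVOCATION_ENV moves from the container image entry up to the job-level environment (and is passed through tox via the new passenv entries in tox.ini below), and the step-level DBT_PROFILES_DIR is dropped, since the checked-in test/integration/profiles.yml is deleted in this PR and each new .dbtspec file carries its own target definition instead.
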
2 changes: 1 addition & 1 deletion dbt/adapters/spark/__version__.py
@@ -1 +1 @@
-version = "0.18.0rc1"
+version = "0.18.0"
12 changes: 4 additions & 8 deletions dev_requirements.txt
@@ -1,19 +1,15 @@
 freezegun==0.3.9
-pytest==4.4.0
+pytest==6.0.2
 mock>=1.3.0
 flake8>=3.5.0
 pytz==2017.2
 bumpversion==0.5.3
-tox==2.5.0
+tox==3.2.0
 ipdb
-pytest-xdist>=1.28.0,<2
+pytest-xdist>=2.1.0,<3
 flaky>=3.5.3,<4
 
 # Test requirements
-behave==1.2.6
-parse==1.8.4
-parse-type==0.4.2
-PyHamcrest==1.9.0
-six>=1.14.0
+pytest-dbt-adapter==0.2.0
 sasl==0.2.1
 thrift_sasl==0.4.1
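
Net effect of this file: the behave-based test stack (behave, parse, parse-type, PyHamcrest, six) is replaced by pytest-dbt-adapter, which drives the new .dbtspec files added below, and pytest, tox, and pytest-xdist are bumped to versions compatible with it.
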
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,3 +1,3 @@
-dbt-core==0.18.0rc1
+dbt-core==0.18.0
 PyHive[hive]>=0.6.0,<0.7.0
 thrift>=0.11.0,<0.12.0
4 changes: 2 additions & 2 deletions setup.py
@@ -28,9 +28,9 @@ def _dbt_spark_version():
 package_version = _dbt_spark_version()
 description = """The SparkSQL plugin for dbt (data build tool)"""
 
-dbt_version = '0.18.0rc1'
+dbt_version = '0.18.0'
 # the package version should be the dbt version, with maybe some things on the
-# ends of it. (0.18.0rc1 vs 0.18.0rc1a1, 0.18.0rc1.1, ...)
+# ends of it. (0.18.0 vs 0.18.0a1, 0.18.0.1, ...)
 if not package_version.startswith(dbt_version):
     raise ValueError(
         f'Invalid setup.py: package_version={package_version} must start with '
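
The guard after the comment is a plain prefix check. A self-contained sketch of the same logic (check_versions is a hypothetical wrapper; in setup.py the two values come from _dbt_spark_version() and the literal above):

    def check_versions(package_version: str, dbt_version: str) -> None:
        # the package version should be the dbt version, with maybe some
        # things on the ends of it (0.18.0 vs 0.18.0a1, 0.18.0.1, ...)
        if not package_version.startswith(dbt_version):
            raise ValueError(
                f'Invalid setup.py: package_version={package_version} '
                f'must start with dbt_version={dbt_version}'
            )

    check_versions('0.18.0', '0.18.0')    # passes: exact match
    check_versions('0.18.0.1', '0.18.0')  # passes: suffixes are allowed
    # Note '0.18.0rc1' would also pass against '0.18.0' -- startswith
    # tests a prefix, not equality.
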
25 changes: 0 additions & 25 deletions test/integration/profiles.yml

This file was deleted.

43 changes: 43 additions & 0 deletions test/integration/spark-databricks.dbtspec
@@ -0,0 +1,43 @@
+target:
+  type: spark
+  host: "{{ env_var('DBT_DATABRICKS_HOST_NAME') }}"
+  cluster: "{{ env_var('DBT_DATABRICKS_CLUSTER_NAME') }}"
+  token: "{{ env_var('DBT_DATABRICKS_TOKEN') }}"
+  method: http
+  port: 443
+  schema: "analytics_{{ var('_dbt_random_suffix') }}"
+  connect_retries: 5
+  connect_timeout: 60
+projects:
+  - overrides: incremental
+    paths:
+      "models/incremental.sql":
+        materialized: incremental
+        body: "select * from {{ source('raw', 'seed') }}"
+    facts:
+      base:
+        rowcount: 10
+      extended:
+        rowcount: 20
+  - overrides: snapshot_strategy_check_cols
+    dbt_project_yml: &file_format_delta
+      # we're going to UPDATE the seed tables as part of testing, so we must make them delta format
+      seeds:
+        dbt_test_project:
+          file_format: delta
+      snapshots:
+        dbt_test_project:
+          file_format: delta
+  - overrides: snapshot_strategy_timestamp
+    dbt_project_yml: *file_format_delta
+sequences:
+  test_dbt_empty: empty
+  test_dbt_base: base
+  test_dbt_ephemeral: ephemeral
+  test_dbt_incremental: incremental
+  test_dbt_snapshot_strategy_timestamp: snapshot_strategy_timestamp
+  test_dbt_snapshot_strategy_check_cols: snapshot_strategy_check_cols
+  test_dbt_data_test: data_test
+  test_dbt_ephemeral_data_tests: data_test_ephemeral_models
+  test_dbt_schema_test: schema_test
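
The &file_format_delta / *file_format_delta pair is ordinary YAML anchor/alias reuse: the snapshot_strategy_timestamp override receives the same delta-format project config without repeating it. A minimal PyYAML demonstration of the mechanism (trimmed to the seeds block for brevity):

    import textwrap
    import yaml

    doc = textwrap.dedent("""
        - overrides: snapshot_strategy_check_cols
          dbt_project_yml: &file_format_delta
            seeds:
              dbt_test_project:
                file_format: delta
        - overrides: snapshot_strategy_timestamp
          dbt_project_yml: *file_format_delta
    """)

    projects = yaml.safe_load(doc)
    # The alias expands to the anchored mapping, so both overrides share it.
    assert projects[0]["dbt_project_yml"] == projects[1]["dbt_project_yml"]
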

33 changes: 33 additions & 0 deletions test/integration/spark.dbtspec
@@ -0,0 +1,33 @@
+target:
+  type: spark
+  host: localhost
+  user: dbt
+  method: thrift
+  port: 10000
+  connect_retries: 5
+  connect_timeout: 60
+  schema: "analytics_{{ var('_dbt_random_suffix') }}"
+projects:
+  - overrides: incremental
+    paths:
+      "models/incremental.sql":
+        materialized: incremental
+        body: "select * from {{ source('raw', 'seed') }}"
+    facts:
+      base:
+        rowcount: 10
+      extended:
+        rowcount: 20
+sequences:
+  test_dbt_empty: empty
+  test_dbt_base: base
+  test_dbt_ephemeral: ephemeral
+  test_dbt_incremental: incremental
+  # snapshots require delta format
+  # test_dbt_snapshot_strategy_timestamp: snapshot_strategy_timestamp
+  # test_dbt_snapshot_strategy_check_cols: snapshot_strategy_check_cols
+  test_dbt_data_test: data_test
+  test_dbt_schema_test: schema_test
+  # the local cluster currently tests on spark 2.x, which does not support this
+  # if we upgrade it to 3.x, we can enable this test
+  # test_dbt_ephemeral_data_tests: data_test_ephemeral_models
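
Both spec files render the target schema as "analytics_{{ var('_dbt_random_suffix') }}", so every run lands in a fresh schema; the _dbt_random_suffix var is injected by the pytest-dbt-adapter harness. A rough sketch of that rendering (the suffix generation and the var lookup below are illustrative assumptions, not the harness's actual code):

    import random
    import string

    from jinja2 import Template  # dbt renders profile fields with Jinja

    # Hypothetical stand-in for the random suffix the test harness injects.
    suffix = "".join(random.choices(string.ascii_lowercase, k=10))
    test_vars = {"_dbt_random_suffix": suffix}

    schema = Template("analytics_{{ var('_dbt_random_suffix') }}").render(
        var=lambda name: test_vars[name]
    )
    print(schema)  # e.g. analytics_kqzjxwvbnm -- unique per test run
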
14 changes: 7 additions & 7 deletions tox.ini
@@ -6,31 +6,31 @@ envlist = unit, flake8, integration-spark-thrift
 [testenv:flake8]
 basepython = python3
 commands = /bin/bash -c '$(which flake8) --select=E,W,F --ignore=W504 dbt/'
+passenv = DBT_INVOCATION_ENV
 deps =
     -r{toxinidir}/dev_requirements.txt
 
 [testenv:unit]
 basepython = python3
 commands = /bin/bash -c '{envpython} -m pytest -v {posargs} test/unit'
+passenv = DBT_INVOCATION_ENV
 deps =
     -r{toxinidir}/requirements.txt
     -r{toxinidir}/dev_requirements.txt
 
-[testenv:integration-spark-http]
+[testenv:integration-spark-databricks]
 basepython = python3
-changedir = dbt-integration-tests
-commands = /bin/bash -c '{envpython} -m behave -f progress3 --stop -D profile_name=spark-http'
-passenv = SPARK_HOST SPARK_CLUSTER SPARK_TOKEN DBT_PROFILES_DIR
+commands = /bin/bash -c '{envpython} -m pytest test/integration/spark-databricks.dbtspec'
+passenv = DBT_DATABRICKS_HOST_NAME DBT_DATABRICKS_CLUSTER_NAME DBT_DATABRICKS_TOKEN DBT_INVOCATION_ENV
 deps =
     -r{toxinidir}/requirements.txt
     -r{toxinidir}/dev_requirements.txt
+    -e.
 
 [testenv:integration-spark-thrift]
 basepython = python3
-changedir = dbt-integration-tests
-commands = /bin/bash -c '{envpython} -m behave -f progress3 --stop -D profile_name=spark-thrift'
-passenv = DBT_PROFILES_DIR
+commands = /bin/bash -c '{envpython} -m pytest test/integration/spark.dbtspec'
+passenv = DBT_INVOCATION_ENV
 deps =
     -r{toxinidir}/requirements.txt
     -r{toxinidir}/dev_requirements.txt
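
With these changes the integration suites are plain tox environments: tox -e integration-spark-thrift runs test/integration/spark.dbtspec against the local Thrift server, while tox -e integration-spark-databricks runs the Databricks spec and expects DBT_DATABRICKS_HOST_NAME, DBT_DATABRICKS_CLUSTER_NAME, and DBT_DATABRICKS_TOKEN in the environment (passed through via passenv). The new -e. dep installs dbt-spark itself into the test environment so the spec runner can load the plugin.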