Add target flag #13

Merged (10 commits) on Jun 1, 2022
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -1,3 +1,4 @@
+- Add target flag ([issue](https://github.com/godatadriven/pytest-dbt-core/issues/11), [PR](https://github.com/godatadriven/pytest-dbt-core/pull/13))
 - Delete session module [is included in dbt-spark](https://github.com/dbt-labs/dbt-spark/issues/272)
 - Add Github templates

10 changes: 6 additions & 4 deletions setup.cfg
@@ -21,15 +21,16 @@ classifiers =
 packages = find:
 package_dir = =src
 install_requires =
-    dbt-spark>=1.0.0,<2.0.0
-    pyspark>=3.0.0,<4.0.0
+    dbt-core>=1.0.0,<2.0.0
 python_requires = >=3.6

 [options.packages.find]
 where = src

 [options.extras_require]
 test =
+    dbt-spark[ODBC]>=1.0.0,<2.0.0
+    pyspark>=3.0.0,<4.0.0
     pre-commit>=2.14.1
     pytest>=6.2.5
     pytest-spark>=0.6.0
@@ -57,10 +58,11 @@ warn_redundant_casts = True
 addopts = --cov=src
     --cov-report=xml:pytest-coverage.xml
     --junitxml=pytest-output.xml
-    --doctest-glob="README.md"
+    --doctest-glob=README.md
     --doctest-modules
     --ignore=scripts/
-    --dbt-project-dir="./tests/dbt_project"
+    --dbt-project-dir=./tests/dbt_project
+    --dbt-target=test
 spark_options =
     spark.app.name: dbt-core
     spark.executor.instances: 1
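A note on the dependency move above, with a hedged example: a plain install now only requires dbt-core, while dbt-spark[ODBC] and pyspark live in the test extra used when developing the plugin. Installing them locally could look like this (the editable-install form is an assumption, not something stated in the diff):

    pip install -e ".[test]"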
12 changes: 10 additions & 2 deletions src/pytest_dbt_core/fixtures.py
@@ -33,9 +33,14 @@ class Args:
     dbt is written as command line tool, therefore the entrypoints of dbt expect
     (parsed) arguments. To reuse dbt's entrypoints we mock the (minimally)
     arguments here.
+
+    Source
+    ------
+    See argparse `add_argument` statements in `dbt.main`.
     """

     project_dir: str
+    target: str | None


 @pytest.fixture
@@ -53,8 +58,11 @@ def config(request: SubRequest) -> RuntimeConfig:
     RuntimeConfig
         The runtime config.
     """
-    project_dir = request.config.getoption("--dbt-project-dir")
-    config = RuntimeConfig.from_args(Args(project_dir=project_dir))
+    args = Args(
+        project_dir=request.config.getoption("--dbt-project-dir"),
+        target=request.config.getoption("--dbt-target"),
+    )
+    config = RuntimeConfig.from_args(args)
     return config
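For illustration only, not something this PR adds: a minimal sketch of a test that consumes the resulting config fixture. It assumes the plugin is installed and that the resolved RuntimeConfig exposes target_name, as dbt's Profile does.

    from dbt.config.runtime import RuntimeConfig


    def test_config_uses_cli_target(config: RuntimeConfig) -> None:
        # With --dbt-target=test passed on the command line (or via addopts),
        # the profile should resolve to the "test" output from profiles.yml.
        assert config.target_name == "test"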
7 changes: 6 additions & 1 deletion src/pytest_dbt_core/plugin.py
@@ -16,7 +16,7 @@

 def pytest_addoption(parser: Parser) -> None:
     """
-    Add pytest option.
+    Add pytest options.

     Parameters
     ----------
@@ -29,3 +29,8 @@ def pytest_addoption(parser: Parser) -> None:
         type="string",
         default=os.getcwd(),
     )
+    parser.addoption(
+        "--dbt-target",
+        help="Which target to load for the given profile",
+        type="string",
+    )
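For context, a hedged usage sketch rather than part of the change itself: the new option sits next to the existing project-dir flag, so running the bundled project could look like the line below. When --dbt-target is omitted, getoption returns None and dbt falls back to the default target declared in profiles.yml.

    pytest --dbt-project-dir=./tests/dbt_project --dbt-target=test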
12 changes: 11 additions & 1 deletion tests/dbt_project/profiles.yml
@@ -1,6 +1,16 @@
 dbt_project:
-  target: test
+  target: dev
   outputs:
+    dev:
+      type: spark
+      method: odbc
+      schema: cor
+      host: https://adb-123456789.00.azuredatabricks.net
+      port: 443
+      organization: "123456789"
+      cluster: 1234-567890-12ab3c4d
+      token: dapi1abc2345de6f78g901h234ij5klm6789-1
+      driver: /Library/simba/spark/lib/libsparkodbc_sbu.dylib
     test:
       type: spark
       method: session
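A hedged aside, not part of the diff: the test output keeps the in-process session method, while the new dev output shows what an ODBC connection to Databricks would look like (its host, cluster, and token values appear to be placeholders). Outside pytest, the same targets would be selected with dbt's own flags, for example:

    dbt run --project-dir tests/dbt_project --target dev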
5 changes: 1 addition & 4 deletions tests/dbt_project/tests/test_fetch_single_statement.py
@@ -1,15 +1,12 @@
 import pytest
 from dbt.clients.jinja import MacroGenerator
-from pyspark.sql import SparkSession


 @pytest.mark.parametrize(
     "macro_generator",
     ["macro.dbt_project.fetch_single_statement"],
     indirect=True,
 )
-def test_create_table(
-    spark_session: SparkSession, macro_generator: MacroGenerator
-) -> None:
+def test_create_table(macro_generator: MacroGenerator) -> None:
     out = macro_generator("SELECT 1")
     assert out == 1