Feature/dbt build #18

Open · wants to merge 18 commits into base branch main
10 changes: 5 additions & 5 deletions .github/workflows/dbt_deploy.yml
@@ -27,11 +27,11 @@ jobs:
# TODO: update your GitHub secrets to include AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
- name: Grab production manifest from S3
run: |
aws s3 cp s3://vhol-datafold-dbt-prod-manifest/manifest.json ./manifest.json
aws s3 cp s3://vhol-prod-manifest-demo/manifest.json ./manifest.json
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_REGION: us-west-2
AWS_REGION: us-east-2

- name: dbt build
run: dbt build --target prod --select state:modified+ --defer --state ./ --exclude config.materialized:snapshot --full-refresh --profiles-dir ./
@@ -46,7 +46,7 @@ jobs:
- name: submit artifacts to datafold
run: |
set -ex
datafold dbt upload --ci-config-id 345 --run-type ${DATAFOLD_RUN_TYPE} --commit-sha ${GIT_SHA}
datafold dbt upload --ci-config-id 421 --run-type ${DATAFOLD_RUN_TYPE} --commit-sha ${GIT_SHA}
env: # TODO: update your GitHub secrets to include DATAFOLD_APIKEY
DATAFOLD_APIKEY: ${{ secrets.DATAFOLD_APIKEY }}
DATAFOLD_RUN_TYPE: "${{ 'production' }}"
@@ -57,8 +57,8 @@ jobs:
# TODO: update your GitHub secrets to include AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
- name: Upload manifest to S3
run: |
aws s3 cp target/manifest.json s3://vhol-datafold-dbt-prod-manifest/manifest.json
aws s3 cp target/manifest.json s3://vhol-prod-manifest-demo/manifest.json
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_REGION: us-west-2
AWS_REGION: us-east-2
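For reference, the deploy job's command sequence collapses into the plain shell sketch below, using the updated bucket, region, and CI config ID from this diff. It assumes AWS credentials and DATAFOLD_APIKEY are already exported; deriving GIT_SHA from `git rev-parse HEAD` stands in for whatever the workflow takes from the GitHub context.

```bash
# Local sketch of the dbt_deploy.yml steps; assumes AWS credentials and
# DATAFOLD_APIKEY are already exported in the shell.
export AWS_REGION=us-east-2

# Pull the last known-good production manifest so dbt can defer to it
aws s3 cp s3://vhol-prod-manifest-demo/manifest.json ./manifest.json

# Rebuild only modified models and their children, deferring everything else
dbt build --target prod --select state:modified+ --defer --state ./ \
  --exclude config.materialized:snapshot --full-refresh --profiles-dir ./

# Report the run to Datafold as a production run (GIT_SHA stands in for the
# commit SHA the workflow receives from GitHub)
GIT_SHA="$(git rev-parse HEAD)"
datafold dbt upload --ci-config-id 421 --run-type production --commit-sha "${GIT_SHA}"

# Publish the fresh manifest for future runs to defer against
aws s3 cp target/manifest.json s3://vhol-prod-manifest-demo/manifest.json
```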
4 changes: 2 additions & 2 deletions .github/workflows/dbt_staging.yml
@@ -40,7 +40,7 @@ jobs:
# TODO: update your GitHub secrets to include AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
- name: Grab production manifest from S3
run: |
aws s3 cp s3://vhol-datafold-dbt-prod-manifest/manifest.json ./manifest.json
aws s3 cp s3://vhol-prod-manifest-demo/manifest.json ./manifest.json
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
@@ -60,7 +60,7 @@ jobs:
- name: submit artifacts to datafold
run: |
set -ex
datafold dbt upload --ci-config-id 345 --run-type ${DATAFOLD_RUN_TYPE} --commit-sha ${GIT_SHA}
datafold dbt upload --ci-config-id 421 --run-type ${DATAFOLD_RUN_TYPE} --commit-sha ${GIT_SHA}
env: # TODO: update your GitHub secrets to include DATAFOLD_APIKEY
DATAFOLD_APIKEY: ${{ secrets.DATAFOLD_APIKEY }}
DATAFOLD_RUN_TYPE: "${{ 'pull_request' }}"
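The staging hunks only show the manifest download and the Datafold upload; the dbt invocation itself falls outside this diff. A rough sketch of the pull-request leg, assuming it mirrors the deploy job's state-based selection without the prod target or full refresh:

```bash
# PR-side sketch; the dbt flags here are assumed, not shown in this hunk.
aws s3 cp s3://vhol-prod-manifest-demo/manifest.json ./manifest.json

dbt build --select state:modified+ --defer --state ./ --profiles-dir ./

# Same upload as production, tagged as a pull_request run
datafold dbt upload --ci-config-id 421 --run-type pull_request --commit-sha "${GIT_SHA}"
```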
4 changes: 3 additions & 1 deletion .gitignore
@@ -162,4 +162,6 @@ cython_debug/

target/
dbt_packages/
logs/
logs/

.DS_Store
10 changes: 6 additions & 4 deletions README.md
@@ -15,7 +15,7 @@ You should be able to run this self-contained example, use the `yml` workflow fi

If you have questions unique to your tech stack, schedule a call with us at: [Datafold](https://www.datafold.com/)

## What does success look like?
## What does success look like??

- [ ] Slim CI pipeline that runs and tests only the models that have changed and their downstream models in your pull requests
- [ ] Automated Datafold data diffs in your pull requests if you have a Datafold account
@@ -49,13 +49,13 @@ source venv/bin/activate
"Sid": "ListObjectsInBucket",
"Effect": "Allow",
"Action": ["s3:ListBucket"],
"Resource": ["arn:aws:s3:::vhol-datafold-dbt-prod-manifest"] # TODO: replace with your own bucket name
"Resource": ["arn:aws:s3:::vhol-prod-manifest-demo"] # TODO: replace with your own bucket name
},
{
"Sid": "AllObjectActions",
"Effect": "Allow",
"Action": "s3:*Object",
"Resource": ["arn:aws:s3:::vhol-datafold-dbt-prod-manifest/*"] # TODO: replace with your own bucket name
"Resource": ["arn:aws:s3:::vhol-prod-manifest-demo/*"] # TODO: replace with your own bucket name
}
]
}
@@ -74,6 +74,8 @@ Please replace the placeholder with your own bucket name for the policy to work
</details>
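If the CI credentials belong to an IAM user, a policy like the one above can be attached with the AWS CLI. The user name, policy name, and file path below are placeholders, not part of this repository:

```bash
# Placeholder names throughout; save the JSON above as manifest-bucket-policy.json first.
aws iam put-user-policy \
  --user-name dbt-ci \
  --policy-name manifest-bucket-access \
  --policy-document file://manifest-bucket-policy.json
```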

3. Set up your GitHub secrets
3a. Install protobuf: `python3 -m pip install protobuf==4.25.3`
3b. Install the dbt-snowflake adapter

4. Run the below to build your dbt project on the `main` branch

@@ -88,7 +90,7 @@ dbt build --target prod
6. Run the below to switch branches and test your CI pipelines

```bash
git checkout -b feature/your-feature
git checkout -b feature/sql-change
```

7. Make some changes to your dbt models and push to your remote repository. You can copy and paste the examples here and run it locally: [example_dbt_changes](example_dbt_changes/)
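Steps 4, 6, and 7 above amount to roughly the following shell session; the commit message and push command are generic git usage rather than anything prescribed by the README:

```bash
dbt build --target prod                  # step 4: build the project from main

git checkout -b feature/sql-change       # step 6: branch off for a test change
# step 7: apply one of the snippets from example_dbt_changes/, then push:
git add models/
git commit -m "test: trigger slim CI and Datafold diff"
git push -u origin feature/sql-change    # opening the PR runs dbt_staging.yml
```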
2 changes: 1 addition & 1 deletion dbt_project.yml
@@ -29,7 +29,7 @@ vars:
data_diff:
prod_database: DEMO
prod_schema: CORE
datasource_id: 4932
datasource_id: 8350

# Configuring models
# Full documentation: https://docs.getdbt.com/docs/configuring-models
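The data_diff vars above point comparisons at the production environment (DEMO.CORE, Datafold datasource 8350). A minimal local sketch, assuming data-diff's dbt mode (pinned at 0.11.0 in requirements.txt) picks these values up from dbt_project.yml:

```bash
# Build a model into your dev schema, then diff it against DEMO.CORE.
# The model name is illustrative; --dbt is data-diff's dbt integration mode.
dbt run --select dim_orgs
data-diff --dbt
```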
2 changes: 1 addition & 1 deletion models/core/dim_orgs.sql
@@ -42,4 +42,4 @@ SELECT
, sub_price
FROM orgs
LEFT JOIN user_count USING (org_id)
LEFT JOIN subscriptions USING (org_id)
LEFT JOIN subscriptions USING (org_id)
8 changes: 4 additions & 4 deletions profiles.yml
@@ -7,9 +7,9 @@ vhol_demo:
account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
user: "{{ env_var('SNOWFLAKE_USER') | as_text }}"
password: "{{ env_var('SNOWFLAKE_PASSWORD') | as_text }}"
role: DEMO_ROLE
role: DATAFOLDROLE
database: DEMO
warehouse: INTEGRATION
warehouse: COMPUTE_WH
schema: "{{ env_var('SNOWFLAKE_SCHEMA') | as_text }}"
threads: 24

@@ -18,8 +18,8 @@ vhol_demo:
account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
user: "{{ env_var('SNOWFLAKE_USER') | as_text }}"
password: "{{ env_var('SNOWFLAKE_PASSWORD') | as_text }}"
role: DEMO_ROLE
role: DATAFOLDROLE
database: DEMO
warehouse: INTEGRATION
warehouse: COMPUTE_WH
schema: CORE
threads: 24
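Both targets now share the DATAFOLDROLE role and COMPUTE_WH warehouse and read the rest of the connection from environment variables. A quick local check might look like the sketch below; the exported values are placeholders, and the dev target name is assumed since it falls outside this hunk:

```bash
# Placeholder credentials; only the variable names come from profiles.yml.
export SNOWFLAKE_ACCOUNT="abc12345.us-east-2"
export SNOWFLAKE_USER="CI_USER"
export SNOWFLAKE_PASSWORD="********"
export SNOWFLAKE_SCHEMA="DEV_YOURNAME"

# "dev" is an assumed target name; only the prod target is confirmed by the workflows.
dbt debug --target dev --profiles-dir ./
```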
3 changes: 2 additions & 1 deletion requirements.txt
@@ -1,4 +1,5 @@
dbt-core==1.7.7
dbt-snowflake==1.7.1
data-diff==0.11.0
datafold-sdk==0.0.19
datafold-sdk==0.0.19
protobuf==4.25.3
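To reproduce the pinned environment locally, something like the following works; it follows the venv pattern the README already uses:

```bash
python3 -m venv venv
source venv/bin/activate
python3 -m pip install -r requirements.txt

dbt --version                                        # expect dbt-core 1.7.7 with dbt-snowflake 1.7.1
python3 -m pip show protobuf data-diff datafold-sdk  # confirm the pinned versions resolved
```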