
Fire skipped events at debug level #10244

Merged · 9 commits · Jun 6, 2024

Conversation

@nevdelap (Contributor) commented May 30, 2024

Resolves #8774, picking up from Scott Gigante's #9371. Scott hasn't had time to finish it off and suggested I take a look.

TODO

  • Set up my dev env to reproduce the issue using this clone installed locally, and see it fixed with the patch. ✔️
  • Learn about contributing to this project. ✔️
  • Resolve the issues described in the PR. ✔️

Problem

If --fail-fast is set, dbt prints hundreds of "Skipping due to fail_fast" lines, burying the actual cause of the error deep in the terminal scrollback (sometimes too deep to read).

Solution

This moves the "Skipping due to fail_fast" message to log level DEBUG. Alternatives would be to (a) drop it altogether or (b) aggregate the results into a single "Skipping tasks due to fail_fast" message. The chosen approach requires the least modification to the task results framework.
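The mechanics of the change can be sketched with plain Python logging rather than dbt's structured event framework (all names below are illustrative, not dbt's actual API): the per-node skip line moves to DEBUG, with option (b)'s aggregate summary shown alongside for comparison.

```python
import logging

logger = logging.getLogger("fail_fast_demo")

def report_skipped(node_names):
    """Demote the per-node skip message to DEBUG so the real error
    stays near the end of the output, and emit one aggregate INFO
    line (option (b) above). Illustrative sketch only -- dbt's real
    code fires structured events, not logging calls."""
    for name in node_names:
        logger.debug("SKIP %s (Skipping due to fail_fast)", name)
    if node_names:
        logger.info("Skipped %d nodes due to fail_fast", len(node_names))
```

At the default INFO level the hundreds of per-node lines disappear, while a DEBUG-level run (dbt's --debug flag) shows them again.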

Checklist

  • I have read the contributing guide and understand what's expected of me
  • I have run this code in development and it appears to resolve the stated issue
  • This PR includes tests, or tests are not required/relevant for this PR
  • This PR has no interface changes (e.g. macros, cli, logs, json artifacts, config files, adapter interface, etc) or this PR has already received feedback and approval from Product or DX
  • This PR includes type annotations for new and modified functions

cla-bot commented May 30, 2024

Thanks for your pull request, and welcome to our community! We require contributors to sign our Contributor License Agreement and we don't seem to have your signature on file. Check out this article for more information on why we have a CLA.

In order for us to review and merge your code, please submit the Individual Contributor License Agreement form linked above. If you have questions about the CLA, or if you believe you've received this message in error, please reach out through a comment on this PR.

CLA has not been signed by users: @nevdelap


@nevdelap changed the title from "Issue" to "Fire skipped events at debug level" May 30, 2024
@QMalcolm (Contributor):

@nevdelap Let me know if you need any guidance or want to pair to get this across the line 🙂

@nevdelap (Contributor Author) commented May 31, 2024

Thanks @QMalcolm. I expect I'll do some more on Monday. I see your link to where the tests are likely to go. If I have any trouble I'll ping you. 👍

CLA signed.

codecov bot commented Jun 3, 2024

Codecov Report

Attention: Patch coverage is 44.44444% with 5 lines in your changes missing coverage. Please review.

Project coverage is 63.22%. Comparing base (a677abd) to head (28ac6d7).
Report is 2 commits behind head on main.

Files                     | Patch % | Lines
core/dbt/task/printer.py  | 44.44%  | 5 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##             main   #10244       +/-   ##
===========================================
- Coverage   88.72%   63.22%   -25.51%     
===========================================
  Files         180      180               
  Lines       22463    22470        +7     
===========================================
- Hits        19931    14207     -5724     
- Misses       2532     8263     +5731     
Flag        | Coverage Δ
integration | ?
unit        | 63.22% <44.44%> (-0.02%) ⬇️

Flags with carried forward coverage won't be shown.

@QMalcolm (Contributor) commented Jun 3, 2024

Thanks for signing the CLA - @cla-bot check

@cla-bot cla-bot bot added the cla:yes label Jun 3, 2024

cla-bot bot commented Jun 3, 2024

The cla-bot has been summoned, and re-checked this pull request!

@QMalcolm (Contributor) commented Jun 3, 2024

For documentation purposes:

As mentioned on the PR by @scottgigante (#9371), what we need to do before merging this is add a test for the added functionality to ensure we don't regress in the future.

In the test(s) we should:

  • assert that "Skipping due to fail_fast" doesn't appear in the output if --fail-fast is specified but --debug is not
  • assert that "Skipping due to fail_fast" does appear in the output if both --fail-fast and --debug are specified

Here are some examples of tests that assert something exists in the run output. Additionally, a good place for the test would be test_fail_fast_run.py.

@nevdelap (Contributor Author) commented Jun 3, 2024

@QMalcolm I've added a couple of tests. Looking at how and where to put any tests in test_fail_fast_run.py, I think it doesn't invoke the CLI in a way that captures what ends up in stdout, because it uses dbtRunner rather than CliRunner, and so might need more involved changes to expose its output. So maybe the tests that I've added are sufficient? Let me know if you'd like me to look harder into making test_fail_fast_run.py able to see what it needs to see to test this. 🙏
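One way around the capture problem described above, assuming the runner ultimately writes its log lines to sys.stdout (which may not hold if dbt binds its output stream earlier), is contextlib.redirect_stdout. The run_and_capture helper and fake_runner below are hypothetical illustrations, not dbt test utilities:

```python
import contextlib
import io

def run_and_capture(fn, *args):
    """Invoke fn while capturing anything it prints to sys.stdout.
    Returns (fn's return value, the captured text)."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        result = fn(*args)
    return result, buf.getvalue()

def fake_runner():
    # Stand-in for a runner invocation that logs directly to stdout.
    print("1 of 2 OK created model_a")
    print("Skipping due to fail_fast")
    return True
```

This only works when the code under test resolves sys.stdout at write time; a stream bound before redirection (as a pre-configured logger handler might do) would escape capture, which is consistent with the difficulty described for dbtRunner.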

@nevdelap nevdelap marked this pull request as ready for review June 3, 2024 23:37
@nevdelap nevdelap requested a review from a team as a code owner June 3, 2024 23:37
@github-actions github-actions bot added the community This PR is from a community member label Jun 3, 2024
@nevdelap (Contributor Author) commented Jun 4, 2024

I see the Test Log Schema tests failed because they do something slightly different from running the test by itself, so I'll look at that. Done.

The integration tests against the different Python versions on different platforms failed due to dependency version differences unrelated to these changes.

Review comment on:

    # 1 snapshot
    assert "SKIP=1" in result.output
    # Skipping due to fail_fast is shown when --debug is specified.
    assert result.output.count("Skipping due to fail_fast") == 1
@nevdelap (Contributor Author) replied:
I'll change this to a plain 'in' test.

@QMalcolm (Contributor) left a review:
Thanks for doing this work, it looks great 🙂

@scottgigante (Contributor):

Thanks @nevdelap for picking up where I left off!

@scottgigante (Contributor):

Closes #9371

@QMalcolm (Contributor) commented Jun 6, 2024

Our codecov check is complaining, but its results don't really make sense. I'm going to move forward with getting this merged.

@QMalcolm QMalcolm merged commit 88ccc8a into dbt-labs:main Jun 6, 2024
56 of 57 checks passed
@nevdelap (Contributor Author) commented Jun 6, 2024

Thanks @scottgigante, thanks @QMalcolm!

Labels: cla:yes, community (This PR is from a community member)

Successfully merging this pull request may close these issues.

[CT-3185] Hundreds of: Skipping due to fail_fast