
CI: Notify when CircleCI build failures for master and rel branches only #4789

Merged
merged 1 commit into from
Nov 16, 2022

Conversation

michaeldiamant
Contributor

Suggests the following build failure notification policy modification: notify only when the release and master branches fail. (A hypothetical config sketch follows the rationale below.)

Rationale:

  • Merges to master sometimes fail the build. The failure is silent until the nightly build happens.
  • Prior to the PR, ad-hoc full builds triggered by a user result in a failure notification. The PR deliberately precludes custom branches from generating failure notifications.
    • If we wish to retain notifications here, I propose adding a secondary channel out-of-band to the PR.
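As background on the mechanism, here is a minimal hypothetical sketch of the kind of branch-restricted failure notification the PR proposes, using the CircleCI Slack orb's branch_pattern parameter. The orb version, job name, image, and branch regex are illustrative assumptions, not the PR's actual diff:

```yaml
# Illustrative sketch only -- not the actual .circleci/config.yml change in this PR.
version: 2.1

orbs:
  slack: circleci/slack@4.10.1   # CircleCI Slack orb providing the slack/notify step

jobs:
  build:                         # hypothetical job name
    docker:
      - image: cimg/go:1.17
    steps:
      - checkout
      - run: make build
      - slack/notify:
          event: fail                      # send a message only when this job fails
          branch_pattern: "master,rel/.*"  # suppress notifications from ad-hoc branches
          template: basic_fail_1           # stock failure-message template from the orb
```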

Notes:

@codecov

codecov bot commented Nov 14, 2022

Codecov Report

Merging #4789 (fc1e458) into master (1a13168) will decrease coverage by 0.02%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master    #4789      +/-   ##
==========================================
- Coverage   54.71%   54.69%   -0.03%     
==========================================
  Files         414      414              
  Lines       53548    53548              
==========================================
- Hits        29299    29286      -13     
- Misses      21825    21835      +10     
- Partials     2424     2427       +3     
Impacted Files                      Coverage Δ
crypto/merkletrie/trie.go           66.42% <0.00%> (-2.19%) ⬇️
network/wsPeer.go                   66.50% <0.00%> (-1.95%) ⬇️
crypto/merkletrie/node.go           91.62% <0.00%> (-1.87%) ⬇️
ledger/tracker.go                   77.87% <0.00%> (-0.86%) ⬇️
ledger/acctonline.go                78.12% <0.00%> (-0.53%) ⬇️
network/wsNetwork.go                65.62% <0.00%> (-0.19%) ⬇️
ledger/accountdb.go                 72.55% <0.00%> (-0.08%) ⬇️
data/transactions/verify/txn.go     76.19% <0.00%> (ø)
ledger/acctupdates.go               69.51% <0.00%> (+0.24%) ⬆️
catchup/service.go                  69.62% <0.00%> (+0.74%) ⬆️
... and 1 more


@michaeldiamant
Contributor Author

#4790 illustrates a notification-on-failure test. See the notification channel at ~5:15 PM ET for failure notifications from branch = notify_on_failure_test.

@michaeldiamant michaeldiamant changed the title Notify about build failures for master and rel branches only CI: Notify about build failures for master and rel branches only Nov 14, 2022
@michaeldiamant michaeldiamant changed the title CI: Notify about build failures for master and rel branches only CI: Notify about CircleCI build failures for master and rel branches only Nov 14, 2022
@michaeldiamant michaeldiamant changed the title CI: Notify about CircleCI build failures for master and rel branches only CI: Notify when CircleCI build failures for master and rel branches only Nov 14, 2022
@winder
Contributor

winder commented Nov 14, 2022

You're adding slack contexts to more jobs. Is that going to make it more difficult to run nightly tests against forks?

@michaeldiamant
Contributor Author

Is that going to make it more difficult to run nightly tests against forks?

@winder Without testing it, it's unclear to me. It depends on when the filter clause is invoked.

  • At worst, it means commenting out additional secrets.
  • At best, it means no change to the existing workflow.

From my perspective, the PR remains worth pursuing even in the worst-case scenario.
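As context for the "commenting out additional secrets" point above, here is a hypothetical sketch of a workflow job that carries a Slack context plus branch filters; the context name slack-secrets and the job name are assumptions, not the repository's actual configuration:

```yaml
# Hypothetical sketch: a fork without access to the Slack context would need to
# comment out (or remove) the `context` block for this job to run.
workflows:
  nightly:
    jobs:
      - test_nightly:            # assumed job name
          context:
            - slack-secrets      # assumed context holding SLACK_ACCESS_TOKEN, etc.
          filters:
            branches:
              only:
                - master
                - /rel\/.*/      # regex filter for release branches
```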

@gmalouf
Contributor

gmalouf commented Nov 15, 2022

You're adding slack contexts to more jobs. Is that going to make it more difficult to run nightly tests against forks?

@winder Wouldn't they simply not run outside of the master or rel branches?

@gmalouf
Contributor

gmalouf commented Nov 15, 2022

#4790 illustrates notification on failure test. See the notification channel at ~5:15PM ET for failure notifications from branch = notify_on_failure_test.

Looked spammy in the dev-alerts channel - has that been addressed/corrected? (As-is, it did not seem useful.)

@michaeldiamant
Contributor Author

Looked spammy

@gmalouf No - no attempt made to modify behavior.

  • It's not a change in behavior from the nightly build.
    • If a nightly build fails to compile, it generates multiple notifications. The likelihood of merging a PR with compilation issues feels slim (happened 0 times in ~10 months).
    • I agree if a build error happens that it's too noisy. Sadly, I don't see an alternative with the toolset CircleCI provides. I'd be happy to be proven wrong.
  • I think one way to improve the behavior is to move notifications to GHA, as discussed in the PR description; see the sketch below. I can ticket the concern, though I don't see a reason to hold here.
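As a rough illustration of the GHA alternative mentioned above (purely hypothetical; the workflow name, secret name, and payload are assumptions, not part of this PR), a single Slack message could be sent per CI workflow run instead of per job:

```yaml
# Hypothetical GitHub Actions sketch -- not part of this PR.
name: notify-ci-failure
on:
  workflow_run:
    workflows: ["CI"]            # assumed name of the main CI workflow
    branches: [master, "rel/**"]
    types: [completed]

jobs:
  notify:
    if: ${{ github.event.workflow_run.conclusion == 'failure' }}
    runs-on: ubuntu-latest
    steps:
      - uses: slackapi/slack-github-action@v1
        with:
          payload: '{"text": "CI failed on ${{ github.event.workflow_run.head_branch }}"}'
        env:
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK }}   # assumed secret name
          SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
```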

@cce
Contributor

cce commented Nov 16, 2022

If you only want one notification on failure, you should be able to follow this advice and make a fan-in job that requires all the other jobs: https://support.circleci.com/hc/en-us/articles/360047082992-How-to-send-a-slack-notification-at-end-of-workflow

However, if your workflow jobs don't run sequentially, you will need to create a new job that contains the Slack notification and have it require all the previous jobs.
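A rough sketch of the fan-in pattern the article describes (job names are hypothetical, and the Slack orb is assumed to be available):

```yaml
# Hypothetical fan-in sketch: one final job requires all the others and owns the
# Slack notification, so the workflow produces at most one message.
workflows:
  build_and_test:
    jobs:
      - build
      - unit_tests
      - integration_tests
      - notify_result:
          requires:              # fan-in: runs only after every listed job succeeds
            - build
            - unit_tests
            - integration_tests

jobs:
  notify_result:
    docker:
      - image: cimg/base:stable
    steps:
      - slack/notify:
          event: always
          template: basic_success_1
```

As noted in the follow-up comment, CircleCI skips a job when one of its required upstream jobs fails, which is why this pattern works cleanly only for success notifications.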

@michaeldiamant
Contributor Author

make a fan-in job that requires all the other jobs:

@cce I had considered that article while working on the PR and concluded the approach only works for success notifications. Admittedly, I didn't try the approach, but looking at a sample failed workflow (https://app.circleci.com/pipelines/github/algorand/go-algorand/10598/workflows/95522810-15f8-4f97-8dc2-9c4a8e421535), we see jobs downstream from failed jobs (e.g. arm64_test_verification) are not invoked.
