Add GitHub workflow for existing integration tests #1142
Conversation
Force-pushed from f499195 to 318a64d
@tanvigour @artursouza Given that this set of tests is sitting in a slightly broken state, Dapr should make a decision on whether to remove/replace them completely, or have them running in automation so that they are not quietly getting out of sync with the code.
Force-pushed from 318a64d to 7f386b3
artursouza left a comment
Very important step towards automating integration tests. Thanks.
docker-compose -f ./.github/infrastructure/docker-compose-mysql.yml -p mysql up -d
if: contains(matrix.component, 'mysql')

- name: Start postgresql
Is there a way to make this setup run based on each test? For example, using the test matrix to pick the setup script instead of hardcoding each one here.
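For illustration, a matrix-driven setup along those lines might look roughly like the sketch below; the setup_script field, script paths, and step layout are hypothetical and not taken from this PR or from conformance.yml:

```yaml
# Hypothetical sketch: each matrix entry carries its own setup script,
# so the workflow needs only one generic "start dependencies" step.
jobs:
  integration:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include:
          - component: mysql
            setup_script: .github/scripts/setup-mysql.sh      # assumed path
          - component: postgresql
            setup_script: .github/scripts/setup-postgresql.sh # assumed path
    steps:
      - uses: actions/checkout@v2
      - name: Start dependencies for ${{ matrix.component }}
        run: bash ${{ matrix.setup_script }}
      # ... followed by the steps that actually run the tests
```

With that shape, adding a new component would only mean adding a matrix entry and a setup script rather than another hardcoded step.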
Definitely possible, though this PR primarily illustrates the viability of having the existing tests running in automation cheaply by reusing conformance test assets, in the hope that they don't get even more broken as certification efforts move forward. I think the key questions for taking this PR are:
- Do we want to enable these existing tests in automation?
  - It's a small set of existing tests, and Dapr could decide that these should all be removed and rewritten, for example using the POC daprd-integrated framework Phil is developing.
- If keeping these tests:
  - Do we want to rename all of the tests (e.g. to *_functional_tests) to avoid confusion with other test suites Dapr wants to develop?
  - Do we want to encourage adding more tests to this workflow going forward?
    - This requires additional design in terms of what the goals are for this suite vs. conformance tests and tests run under other frameworks (e.g. E2E, daprd-integrated, ...).
- Based on the functional test gap analysis, I would suggest:
  - This may be a good category for component-specific functional tests requiring flexible configuration.
  - The necessary accommodations will not be limited to a single setup script. Some components will require workflow changes (e.g. provisioning multiple services, or provisioning multiple applications for a single test), so if we're designing abstractions over the tests, it would be useful to have the tests in place first and then refactor the workflow around them.
  - The necessary accommodations will likely also benefit the maintainability of the existing conformance workflow tests (e.g. the EventGrid tests and the extra provisioning of the ngrok service), so it would be useful to design in that broader context.
I like this strategy. However, @pkedy is making great progress on the integration test framework, so it is not clear how long we will keep this.
@pkedy & @tanvigour Let me know if we care to preserve the existing tests; otherwise we should take a different PR to delete the broken tests and replace them with new tests written under the integration test framework.
halspang left a comment
Looks good overall. I like using the existing stuff while we work towards the unified framework. Just a few quick comments.
- name: Specify components requiring secrets or certs
  id: cron-components
Is there a reason we name this cron-components? From what I understand, it runs at all times regardless, and this just has to do with secrets. Is that correct?
This naming is copied verbatim from conformance.yml. My understanding is that it distinguishes the components that run when triggered by a PR (pr-components) from those that only run on the cron trigger (cron-components, i.e. any test that uses the pre-provisioned Azure test resource pool triggered by line 11).
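A simplified sketch of the pattern being described (component names and step bodies are illustrative, not copied from conformance.yml):

```yaml
# Illustrative only: splitting components by trigger type.
- name: Specify components that can run on any PR
  id: pr-components
  run: |
    echo '::set-output name=pr-components::["state.mongodb","bindings.mqtt"]'

- name: Specify components requiring secrets or certs
  id: cron-components
  # Only the scheduled (cron) run is expected to reach the pre-provisioned Azure test resources.
  if: github.event_name == 'schedule' || github.event_name == 'repository_dispatch'
  run: |
    echo '::set-output name=cron-components::["state.azure.cosmosdb"]'
```

Downstream jobs can then build their test matrix from the union of the two outputs, so the cron-only components simply drop out of PR-triggered runs.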
	return receivedMessage, receivedMessage != nil, nil
}

func TestQueueWithTTL(t *testing.T) {
I wasn't able to get this to pass without changing the TTL; the timing seemed too tight. Also, the docs don't recommend a TTL under 5 seconds. When I changed the TTL locally to 5 seconds, I could get these to pass consistently.
Force-pushed from 7f386b3 to f47d3ae
Force-pushed from f47d3ae to d3876c8
Codecov Report
@@            Coverage Diff             @@
##           master    #1142      +/-   ##
==========================================
- Coverage   34.34%   34.33%   -0.01%
==========================================
  Files         140      140
  Lines       11993    11995       +2
==========================================
  Hits         4119     4119
- Misses       7450     7452       +2
  Partials      424      424
Continue to review full report at Codecov.
Closing in favor of #1204
Description
POC to illustrate running the existing integration tests in automation via a GitHub workflow. It reuses the assets already in place for conformance tests, with a couple of caveats:
Also fixes the existing integration tests:
Issue reference
Exploration for #939
Checklist
Please make sure you've completed the relevant tasks for this PR, out of the following list: