From bd434be7f107e9cdc47c10d8de2b4419d96cb855 Mon Sep 17 00:00:00 2001
From: enya-yx
Date: Thu, 19 Jan 2023 11:39:59 +0800
Subject: [PATCH] Add docs about registry server test and label

---
 docs/dev_guide/cloud_integration_testing.md | 5 +++--
 docs/dev_guide/new_contributor_guide.md     | 2 +-
 docs/dev_guide/test_coverage_guide.md       | 2 +-
 3 files changed, 5 insertions(+), 4 deletions(-)

diff --git a/docs/dev_guide/cloud_integration_testing.md b/docs/dev_guide/cloud_integration_testing.md
index ed558d6c2..d71008766 100644
--- a/docs/dev_guide/cloud_integration_testing.md
+++ b/docs/dev_guide/cloud_integration_testing.md
@@ -5,14 +5,15 @@
 parent: Developer Guides
 ---
 
 # Cloud Integration Test/CI Pipeline
-We use [GitHub Actions](https://github.com/feathr-ai/feathr/tree/main/.github/workflows) to do cloud integration test. Currently the integration test has 4 jobs:
+We use [GitHub Actions](https://github.com/feathr-ai/feathr/tree/main/.github/workflows) to run cloud integration tests. Currently the integration test has 5 jobs:
 - running `./gradlew test` to verify if the scala/spark related code has passed all the test
 - running `flake8` to lint python scripts and make sure there are no obvious syntax errors
 - running the built jar in databricks environment with end to end test to make sure it passed the end to end test
 - running the built jar in Azure Synapse environment with end to end test to make sure it passed the end to end test
+- running the end to end test cases for the registry server to make sure the related code passes all the tests
 
-The above 4 jobs will ran in parallel, and if any one of them fails, the integration test will fail. 
+The above 5 jobs will run in parallel, and if any one of them fails, the integration test will fail.
 
 ## Cloud Testing Pipelines
 
diff --git a/docs/dev_guide/new_contributor_guide.md b/docs/dev_guide/new_contributor_guide.md
index 223b7d91b..109f7bbc9 100644
--- a/docs/dev_guide/new_contributor_guide.md
+++ b/docs/dev_guide/new_contributor_guide.md
@@ -33,7 +33,7 @@ You can ping your questions in the community Slack channel and Feathr developers
 * "I am not sure how to setup the cluster and test my code. Can someone help me out?"
 
 ## My pull request(PR) requires testing against the database or cluster that I dont' have, how can I test?
-Develop your implementation locally first and use unit tests to ensure correctness. Later, you can ask PR reviewers to label them with `safe to test` so Github will kick off a integration test in our test cluster with your code. 
+Develop your implementation locally first and use unit tests to ensure correctness. Later, you can ask PR reviewers to label your PR with `safe to test` so GitHub will kick off an integration test in our test cluster with your code. If your PR contains any changes to the registry server, it should also be labeled with `registry test` so that the related registry test cases are triggered in the GitHub workflow.
 
 If you need more assistance regarding testing your code or development, reach out in our Slack channel.
 
diff --git a/docs/dev_guide/test_coverage_guide.md b/docs/dev_guide/test_coverage_guide.md
index 54ef20979..37dd5f14b 100644
--- a/docs/dev_guide/test_coverage_guide.md
+++ b/docs/dev_guide/test_coverage_guide.md
@@ -11,7 +11,7 @@ To maintain and improve codes quality of feathr, we expect test coverage ratio t
 ## How to conduct test coverage
 ### Through github workflows pipeline:
- We already added this coverage checking into our CI pipeline. For every pull request, push and scheduled jobs, github will check the coverage when runing 'pytest' automatically. You can find the result for 'azure_synapse', 'databricks' and 'local spark', respectively from each PR and commit. 
+ We already added this coverage check into our CI pipeline. For every pull request, push and scheduled job, GitHub will check the coverage automatically when running 'pytest'. You can find the results for 'azure_synapse', 'databricks', 'local spark' and 'registry_test', respectively, from each PR and commit.
 
 An example of test coverage result:
 
 ![test coverage example](./images/coverage_res.png)