feat: add a CLI tool to validate generation configuration #2691
Conversation
@@ -130,10 +130,6 @@ def prepare_repo(
     # use absolute path because docker requires absolute path
     # in volume name.
     absolute_library_path = str(Path(library_path).resolve())
-    if absolute_library_path in libraries:
We don't need this check because the Generation object is validated by the time it reaches this point.
library_name = library.get_library_name()
if library_name in seen_library_names:
    raise ValueError(
        f"{library.name_pretty} has the same library name with "
Can we make the error message more verbose and actionable? Something like:

f"Both {library.name_pretty} and {seen_library_names.get(library_name)} have the same library_name: {library_name}, please update one of the libraries to have a different library_name."
Another edge case is that two libraries may have the same name_pretty, but I think the error message is already good enough, so we may not have to consider it.
Done.
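For readers skimming the thread, here is a minimal, self-contained sketch of the duplicate-name check with the more verbose message agreed on above. `get_library_name()`, `name_pretty`, and `seen_library_names` come from the diff snippets in this conversation; the `LibraryConfig` stub and the `validate_unique_library_names` wrapper are illustrative assumptions, not the actual classes in the PR.

```python
from typing import Dict, List


class LibraryConfig:
    """Illustrative stand-in for the real library config model."""

    def __init__(self, api_shortname: str, name_pretty: str, library_name: str = None):
        self.api_shortname = api_shortname
        self.name_pretty = name_pretty
        self.library_name = library_name

    def get_library_name(self) -> str:
        # Assumption: fall back to api_shortname when library_name is not set.
        return self.library_name or self.api_shortname


def validate_unique_library_names(libraries: List[LibraryConfig]) -> None:
    # Map each library_name to the name_pretty that first claimed it so the
    # error message can name both offending libraries.
    seen_library_names: Dict[str, str] = {}
    for library in libraries:
        library_name = library.get_library_name()
        if library_name in seen_library_names:
            raise ValueError(
                f"Both {library.name_pretty} and "
                f"{seen_library_names.get(library_name)} have the same "
                f"library_name: {library_name}, please update one of the "
                f"libraries to use a different library_name."
            )
        seen_library_names[library_name] = library.name_pretty
```

Storing name_pretty as the dict value is what lets the message point at both entries that collide on the same library_name.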
@@ -130,7 +142,9 @@ def from_yaml(path_to_yaml: str) -> GenerationConfig:

 def __required(config: Dict, key: str):
     if key not in config:
-        raise ValueError(f"required key {key} not found in yaml")
+        raise ValueError(
+            f"required key {key} not found in {config} " f"when parsing yaml"
Is {config} going to print out the whole config? If yes, I don't think we want to do that. Something like
f"Required repo level config {key} not found"
should be good enough.
Separately, there are a few library level required fields, and I don't see any validation for them yet. They can be addressed in a separate PR.
I removed `config` from the error message.

> Separately, there are a few library level required fields, and I don't see any validation for them yet. They can be addressed in a separate PR.

Library level required fields are tested in utilities_unit_tests.py. I moved them to generation_config_unit_test.py.
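As a reference for the change described above, a minimal sketch of the required-key helper with `{config}` dropped from the message; the exact wording in the PR may differ from this sketch.

```python
from typing import Dict


def __required(config: Dict, key: str):
    # Keep the message concise rather than dumping the whole parsed config.
    if key not in config:
        raise ValueError(f"Required key {key} not found when parsing yaml")
```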
I see that library level config is validated in the same way as repo level config now. However, it makes the error message unclear for library level config. E.g., if `api-shortname` is missing for a specific library, we don't know which library is missing `api-shortname` from `required key {key} not found`.
How do we plan to use it? I think we can add it to the CI as a required check for every PR, as this should be a very lightweight check.
Yes, the check should finish quickly. I tested with generation_config.yaml in google-cloud-java and it finished in 0.3s.

I added how to use the check in the description.
Can the command just be `docker run ... python /src/cli/entry_point.py validate-generation-config`, without specifying the yaml location? I think it should be fine since we have a default location?
# be raised.
@parameterized.expand(
    [
        ("libraries", f"{test_config_dir}/config_without_libraries.yaml"),
I'm still not a fan of parameterized tests, but I know this test was pre-existing, and in this case the names of the test yamls are kind of self-explanatory, so it's fine.
I replaced this parameterized test with several individual tests.
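For illustration, one of those individual tests might look roughly like the sketch below; the import path and fixture directory are assumptions based on the snippets above, not copied from the PR.

```python
import unittest

# Assumed import path for the parser shown in the diffs above; adjust to
# wherever from_yaml actually lives in this repository.
from library_generation.model.generation_config import from_yaml

# Assumed to match the fixture directory used by the parameterized test.
test_config_dir = "resources/test-config"


class GenerationConfigTest(unittest.TestCase):
    def test_from_yaml_without_libraries_raises(self):
        # A config file that omits the required `libraries` key should fail.
        with self.assertRaisesRegex(ValueError, "libraries"):
            from_yaml(f"{test_config_dir}/config_without_libraries.yaml")


if __name__ == "__main__":
    unittest.main()
```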
Yes, if the yaml location is not specified, a default value is used.

I added a parameter for it.
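One way such a default could be wired up, assuming the CLI is built with Click (the actual entry_point.py may differ); the flag name is taken from the PR description, while the default path and import path are purely illustrative.

```python
import click

# Assumed import path; adjust to wherever from_yaml lives in this repository.
from library_generation.model.generation_config import from_yaml


@click.command()
@click.option(
    "--generation-config-path",
    default="generation_config.yaml",  # illustrative default location
    show_default=True,
    help="Path to the generation config yaml to validate.",
)
def validate_generation_config(generation_config_path: str) -> None:
    # Parsing the config is the validation: missing or duplicate entries
    # surface as a ValueError with the messages discussed in this thread.
    from_yaml(generation_config_path)
    click.echo(f"{generation_config_path} is valid.")
```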
The main concern is not that we don't know whether a config is at library or repo level; the concern is that we don't know which library is missing a config if it's a library level config.

When parsing the yaml, we don't know which parameter may be missing, so I think the best we can do is print the library dict and hope that some of its values let us find which library is missing the parameter.
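A rough sketch of that approach for library level keys, where the partial library dict is included in the message to help identify the offending entry; the helper names and the required key below are illustrative, not the PR's actual code.

```python
from typing import Dict, List


def __require_library_key(library: Dict, key: str) -> None:
    # When a library level key is missing we cannot reliably name the
    # library, so include the raw dict to help narrow it down.
    if key not in library:
        raise ValueError(f"Required key {key} not found in library: {library}")


def __validate_libraries(libraries: List[Dict]) -> None:
    # api_shortname is one of the library level required fields discussed
    # above; the full list lives in the config model.
    for library in libraries:
        __require_library_key(library, "api_shortname")
```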
LGTM other than one minor issue with the unit test. Separately, I think we should error out on unknown parameters, which I don't think we are doing currently.
I'll address this issue in a follow-up PR.
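Since erroring out on unknown parameters is deferred to a follow-up PR, the following is only a speculative sketch of one way it could be done; the set of known keys is illustrative and would need to come from the actual config model.

```python
from typing import Dict, Set

# Illustrative subset of recognized repo level keys.
KNOWN_REPO_KEYS: Set[str] = {
    "gapic_generator_version",
    "googleapis_commitish",
    "libraries",
}


def reject_unknown_keys(config: Dict) -> None:
    # Flag any top-level key that the config model does not define.
    unknown = set(config) - KNOWN_REPO_KEYS
    if unknown:
        raise ValueError(f"Unknown keys in generation config: {sorted(unknown)}")
```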
Quality Gate passed for 'gapic-generator-java-root'.
Please retry analysis of this Pull-Request directly on SonarCloud.
Quality Gate passed for 'java_showcase_integration_tests'.
🤖 I have created a release *beep* *boop*

---

<details><summary>2.40.0</summary>

## [2.40.0](v2.39.0...v2.40.0) (2024-05-02)

### Features

* [common-protos] add `Weight` to common types for Shopping APIs to be used for accounts bundle ([#2699](#2699)) ([5bb9770](5bb9770))
* add a CLI tool to validate generation configuration ([#2691](#2691)) ([f2ce524](f2ce524))
* Parser to consume the api-versioning value from proto ([#2630](#2630)) ([40711fd](40711fd))
* Update Gapic generator and Gax to emit api-versioning via header ([#2671](#2671)) ([e63d1b4](e63d1b4))

### Bug Fixes

* change folder prefix for adding headers ([#2688](#2688)) ([4e92be8](4e92be8))
* Log HttpJson's async thread pool core size ([#2697](#2697)) ([34b4bc3](34b4bc3))
* replace `cfg = "host"` with `cfg = "exec"` ([#2637](#2637)) ([6d673f3](6d673f3))
* Return resolved endpoint from StubSettings' Builder ([#2715](#2715)) ([32c9995](32c9995))

### Dependencies

* Make opentelemetry-api an optional dependency. ([#2681](#2681)) ([3967a19](3967a19))
* update dependency absl-py to v2.1.0 ([#2659](#2659)) ([cae6d79](cae6d79))
* update dependency gitpython to v3.1.43 ([#2656](#2656)) ([208bef4](208bef4))
* update dependency lxml to v5.2.1 ([#2661](#2661)) ([b95ad49](b95ad49))
* update dependency net.bytebuddy:byte-buddy to v1.14.14 ([#2703](#2703)) ([87069bc](87069bc))
* update dependency typing to v3.10.0.0 ([#2663](#2663)) ([7fb5653](7fb5653))
* update gapic-showcase to v0.33.0 ([#2653](#2653)) ([0a71cbf](0a71cbf))

### Documentation

* Add contributing guidelines to PR and issue templates ([#2682](#2682)) ([42526dc](42526dc))

</details>

---

This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please).

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
In this PR:

- Add a CLI tool to validate generation config. Downstream libraries, e.g., google-cloud-java, can write a workflow job like:

```
docker run ... python /src/cli/entry_point.py \
  validate-generation-config \
  --generation-config-path=path/to/generation_config.yaml
```