
Configure Pytests on Feature Flags #764

Merged 1 commit into main on Sep 18, 2023
Conversation

noah-paige
Contributor

Feature or Bugfix

  • Bugfix

Detail

  • Moved tests/modules/test_loader.py to tests/base/test_loader.py; the test_loader tests were previously being excluded by the ignore_module_tests_if_not_active() function in tests/conftest.py

  • Added @pytest.mark.skipif(...) decorators to skip tests when the corresponding module features are not enabled in config.json

    • Previously, some tests in test_dataset_location.py failed when modules.datasets.features.file_actions was set to False in config.json
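The skipif mechanism described above can be sketched as follows. This is a minimal illustration, not the PR's exact code: the helper name `is_feature_enabled` and the test name in the commented usage are hypothetical, and the config shape is assumed from the flag path `modules.datasets.features.file_actions`.

```python
# Sketch of a feature-flag-aware skip condition, assuming a config.json
# shaped like {"modules": {"datasets": {"features": {"file_actions": ...}}}}.
# The helper name is_feature_enabled is illustrative, not from the PR.
import json
from pathlib import Path


def is_feature_enabled(config_path: str, flag_path: str) -> bool:
    """Walk a dotted flag path (e.g. 'modules.datasets.features.file_actions')
    through the JSON config and return its value as a bool, defaulting to
    False when any key along the path is missing."""
    node = json.loads(Path(config_path).read_text())
    for key in flag_path.split("."):
        if not isinstance(node, dict) or key not in node:
            return False
        node = node[key]
    return bool(node)


# A test can then be skipped when the feature is off (test name is hypothetical):
#
# import pytest
#
# @pytest.mark.skipif(
#     not is_feature_enabled("config.json", "modules.datasets.features.file_actions"),
#     reason="modules.datasets.features.file_actions is disabled in config.json",
# )
# def test_some_file_action():
#     ...
```

Evaluating the flag at import time like this is what lets pytest mark the tests as skipped (rather than failed) in the collection report when the feature is disabled.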

Relates

Security

Please answer the questions below briefly where applicable, or write N/A. Based on the OWASP Top 10.

N/A

- Does this PR introduce or modify any input fields or queries - this includes
fetching data from storage outside the application (e.g. a database, an S3 bucket)?
  - Is the input sanitized?
  - What precautions are you taking before deserializing the data you consume?
  - Is injection prevented by parametrizing queries?
  - Have you ensured no `eval` or similar functions are used?
- Does this PR introduce any functionality or component that requires authorization?
  - How have you ensured it respects the existing AuthN/AuthZ mechanisms?
  - Are you logging failed auth attempts?
- Are you using or adding any cryptographic features?
  - Do you use standard, proven implementations?
  - Are the used keys controlled by the customer? Where are they stored?
- Are you introducing any new policies/roles/users?
  - Have you used the least-privilege principle? How?

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

@noah-paige
Contributor Author

noah-paige commented Sep 14, 2023

Tested in AWS Deployment:

  • test_loader.py tests are run and pass in the IntegrationTests step of CodePipeline:
    [screenshot: test_loader.py tests passing in the IntegrationTests step]

  • When modules.datasets.features.file_actions is set to False in config.json

    • Tests related to the feature are skipped
    • CodePipeline deploys successfully with the feature disabled
      [screenshot: CodePipeline deployment succeeding with the feature disabled]
  • When modules.datasets.features.file_actions is set to True in config.json

    • Tests related to the feature are run and pass
    • CodePipeline deploys successfully with the feature enabled
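For reference, the flag being toggled in these runs would sit in config.json roughly as follows. The surrounding keys are assumed from the dotted path modules.datasets.features.file_actions; "active" reflects the reviewer's note that the datasets module itself stays enabled.

```json
{
  "modules": {
    "datasets": {
      "active": true,
      "features": {
        "file_actions": false
      }
    }
  }
}
```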

@noah-paige noah-paige self-assigned this Sep 14, 2023
@dlpzx dlpzx requested a review from nikpodsh September 18, 2023 06:35
@dlpzx (Contributor) left a comment

Looks good. I am doing a final test in AWS. It works as expected with datasets active but file features disabled:
[screenshot: deployment with datasets active and file features disabled]

@noah-paige noah-paige merged commit 74da2ad into main Sep 18, 2023
8 checks passed
@dlpzx dlpzx deleted the fix/feature-flags-pytest branch November 8, 2023 08:37