
CI/CD Improvements to Incorporate Testing and Automation Checks #342

Closed
howieavp76 opened this issue Apr 24, 2019 · 11 comments
Assignees
Labels
LoE: Medium Scope: CI/CD Enhancements to the project's Continuous Integration and Continuous Delivery pipeline. User Story

Comments

@howieavp76

User Story:

As an OSCAL content creator, I wish to have new XML, JSON, and schema artifacts validated via automated testing prior to being checked into the master branch.

Goals:

The primary goals are listed below:

  • Prior to checking code into the master branch, the CI/CD pipeline will run the automated test suites and ensure content passes before allowing a pull request to be approved.

  • Validate XML and JSON files (catalogs and profiles) against their schemas.

  • Leverage freely available open source tools to conduct the testing.

Dependencies:

  • Assumes that the XML and JSON files have valid content and that the schemas are up to date. Substantial work may be required to update existing content before the tests pass. If the content cannot be updated this sprint, the CI/CD checks may need to be informational rather than blocking for merge requests.
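If the checks must start out informational rather than blocking, most CI systems support a job that reports results without failing the merge. As one illustrative sketch, assuming GitLab CI (the job name and wrapper script below are invented; other CI systems offer equivalent "non-required" status checks):

```yaml
# Illustrative fragment only; the job name, script path, and choice of
# GitLab CI are assumptions, not part of this issue.
validate-content:
  stage: test
  script:
    - bash ./run-validation.sh   # hypothetical wrapper around the test suites
  allow_failure: true            # report results without blocking the merge
```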

Acceptance Criteria:

  • Test scripts can be called from the CI/CD pipeline, with results parsed into Pass/Fail information before the PR is merged.

  • Increased test coverage, as defined in the Testing Requirements Document, with documented test results.

  • A published README file with instructions and the open source tooling required to support the testing.
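The acceptance criteria above imply a small driver the pipeline can invoke, with an exit code that gates the PR. A minimal sketch under assumptions not stated in the issue (the helper names are invented; XSD validation is delegated to `xmllint`, one freely available option):

```python
import json
import shutil
import subprocess

def check_json_wellformed(path):
    """Return (ok, detail): ok is True when the file parses as JSON."""
    try:
        with open(path, encoding="utf-8") as fh:
            json.load(fh)
        return True, ""
    except (OSError, ValueError) as exc:
        return False, str(exc)

def check_xml_against_xsd(xml_path, xsd_path):
    """Validate an XML file against an XSD using xmllint, when available."""
    if shutil.which("xmllint") is None:
        return False, "xmllint not found on PATH"
    proc = subprocess.run(
        ["xmllint", "--noout", "--schema", xsd_path, xml_path],
        capture_output=True, text=True)
    return proc.returncode == 0, proc.stderr.strip()

def run_checks(checks):
    """checks: (label, callable) pairs. Prints PASS/FAIL per check and
    returns a process exit code the CI/CD pipeline can act on."""
    failures = 0
    for label, check in checks:
        ok, detail = check()
        print(f"{'PASS' if ok else 'FAIL'}  {label}"
              + (f"  ({detail})" if detail else ""))
        failures += not ok
    return 1 if failures else 0
```

A real driver would enumerate the `content\` and `schema\` trees from configuration rather than hard-coding pairs.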

@howieavp76
Author

howieavp76 commented Apr 25, 2019

Current test suite status is below. I have the transforms working and tied into the existing validation routines. However, some of the specified transform files do not exist, and in other cases the transform file to use is not specified. I will work with Wendell on refining this.

**1. Metaschema validation.** Top-level metaschemas must be validated against their XSD and Schematron.

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 1.1 | XSD | schema\metaschema\oscal-catalog-metaschema.xml | build\metaschema\lib\metaschema.xsd | Implemented |
| 1.2 | Schematron | schema\metaschema\oscal-catalog-metaschema.xml | build\metaschema\lib\metaschema-check.sch | ERRORS? |
| 1.3 | XSD | schema\metaschema\oscal-profile-metaschema.xml | build\metaschema\lib\metaschema.xsd | Implemented |
| 1.4 | Schematron | schema\metaschema\oscal-catalog-metaschema.xml | build\metaschema\lib\metaschema-check.sch | ERRORS? |

**2. Metaschema validation after composition.** Likewise, a modular metaschema can be "composed" into a flat metaschema and validated. This is a better test than series 1.

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 2.1 | XSLT | schema\metaschema\oscal-catalog-metaschema.xml | build\metaschema\lib\metaschema-compose.xsl | COMPOSED_CATALOG_XML – XSL file does not exist |
|  | XSD | COMPOSED_CATALOG_XML | build\metaschema\lib\metaschema.xsd | ERRORS? |
|  | Schematron | COMPOSED_CATALOG_XML | build\metaschema\lib\metaschema-check.sch | ERRORS? |
| 2.2 | XSLT | schema\metaschema\oscal-profile-metaschema.xml | build\metaschema\lib\metaschema-compose.xsl | COMPOSED_PROFILE_XML – XSL file does not exist |
|  | XSD | COMPOSED_PROFILE_XML | build\metaschema\lib\metaschema.xsd | ERRORS? |
|  | Schematron | COMPOSED_PROFILE_XML | build\metaschema\lib\metaschema-check.sch | ERRORS? |

**3. Generation of XML schemas from metaschemas.**

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 3.1 | Date check | schema\metaschema\oscal-catalog-metaschema.xml | schema\xml\oscal-catalog-schema.xsd | Okay if primary predates secondary - Implemented |
| 3.2 | Date check | schema\metaschema\oscal-profile-metaschema.xml | schema\xml\oscal-profile-schema.xsd | Okay if primary predates secondary - Implemented |

**4. Generation of JSON schemas from metaschemas.**

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 4.1 | Date check | schema\metaschema\oscal-catalog-metaschema.xml | schema\json\oscal-catalog-schema.json | Okay if primary predates secondary - Implemented |
| 4.2 | Date check | schema\metaschema\oscal-profile-metaschema.xml | schema\json\oscal-profile-schema.json | Okay if primary predates secondary - Implemented |

**5. Content check of canonical examples (XML).** All canonical XML examples should be validated against their respective schemas.

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 5.1 | XSD | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.xml | schema\xml\oscal-catalog-schema.xsd | ERRORS? - Implemented |
| 5.2 | XSD | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.xml | schema\xml\oscal-profile-schema.xsd | ERRORS? - Implemented |
| 5.3 | XSD | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.xml | schema\xml\oscal-profile-schema.xsd | ERRORS? - Implemented |
| 5.4 | XSD | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.xml | schema\xml\oscal-profile-schema.xsd | ERRORS? - Implemented |
| 5.5 | XSD | content\fedramp.gov\FedRAMP_LOW-baseline_profile.xml | schema\xml\oscal-profile-schema.xsd | ERRORS? - Implemented |
| 5.6 | XSD | content\fedramp.gov\FedRAMP_MODERATE-baseline_profile.xml | schema\xml\oscal-profile-schema.xsd | ERRORS? - Implemented |
| 5.7 | XSD | content\fedramp.gov\FedRAMP_HIGH-baseline_profile.xml | schema\xml\oscal-profile-schema.xsd | ERRORS? - Implemented |

**6. Creation of JSON versions of canonical examples.**

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 6.1 | Date check | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.xml | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.json | Okay if primary predates secondary - Implemented |
| 6.2 | Date check | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.xml | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.json | Okay if primary predates secondary - Implemented |
| 6.3 | Date check | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_MODERATE-baseline_profile.xml | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_MODERATE-baseline_profile.json | Okay if primary predates secondary - Implemented |
| 6.4 | Date check | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_HIGH-baseline_profile.xml | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_HIGH-baseline_profile.json | Okay if primary predates secondary - Implemented |
| 6.5 | Date check | content\fedramp.gov\FedRAMP_LOW-baseline_profile.xml | content\fedramp.gov\FedRAMP_LOW-baseline_profile.json | Okay if primary predates secondary - Implemented, missing file |
| 6.6 | Date check | content\fedramp.gov\FedRAMP_MODERATE-baseline_profile.xml | content\fedramp.gov\FedRAMP_MODERATE-baseline_profile.json | Okay if primary predates secondary - Implemented, missing file |
| 6.7 | Date check | content\fedramp.gov\FedRAMP_HIGH-baseline_profile.xml | content\fedramp.gov\FedRAMP_HIGH-baseline_profile.json | Okay if primary predates secondary - Implemented, missing file |

**7. Content check of canonical examples (JSON).** All canonical JSON examples should be validated against their respective schemas.

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 7.1 | JSON Schema validation | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.json | schema\json\oscal-catalog-schema.json | Implemented |
| 7.2 | JSON Schema validation | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.json | schema\json\oscal-profile-schema.json | Implemented |
| 7.3 | JSON Schema validation | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_MODERATE-baseline_profile.json | schema\json\oscal-profile-schema.json | Implemented |
| 7.4 | JSON Schema validation | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_HIGH-baseline_profile.json | schema\json\oscal-profile-schema.json | Implemented |
| 7.5 | JSON Schema validation | content\fedramp.gov\FedRAMP_LOW-baseline_profile.json | schema\json\oscal-profile-schema.json | Implemented, missing file |
| 7.6 | JSON Schema validation | content\fedramp.gov\FedRAMP_MODERATE-baseline_profile.json | schema\json\oscal-profile-schema.json | Implemented, missing file |
| 7.7 | JSON Schema validation | content\fedramp.gov\FedRAMP_HIGH-baseline_profile.json | schema\json\oscal-profile-schema.json | Implemented, missing file |

**8. Viability test of XML -> JSON conversion.** XML examples should convert into JSON without loss and back again.

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 8.1 | XSLT | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.xml | TEMP_NIST_SP-800-53_rev4_catalog.json | What XSL file does the conversion? |
|  | XSLT | TEMP_NIST_SP-800-53_rev4_catalog.json | TEMP_NIST_SP-800-53_rev4_catalog.xml |  |
|  | Compare | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.xml | TEMP_NIST_SP-800-53_rev4_catalog.xml | Okay if the same |
| 8.2 | [Same three steps] | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.xml | TEMP_NIST_SP-800-53_rev4_LOW-baseline_profile.xml | What XSL file does the conversion? |
| 8.3 | [Same three steps] | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.xml | TEMP_NIST_SP-800-53_rev4_LOW-baseline_profile.xml | What XSL file does the conversion? |
| 8.4 | [Same three steps] | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.xml | TEMP_NIST_SP-800-53_rev4_LOW-baseline_profile.xml | What XSL file does the conversion? |
| 8.5 | [Same three steps] | content\fedramp.gov\FedRAMP_LOW-baseline_profile.xml | TEMP_FedRAMP_LOW-baseline_profile.xml | What XSL file does the conversion? |
| 8.6 | [Same three steps] | content\fedramp.gov\FedRAMP_MODERATE-baseline_profile.xml | TEMP_FedRAMP_MODERATE-baseline_profile.xml | What XSL file does the conversion? |
| 8.7 | [Same three steps] | content\fedramp.gov\FedRAMP_HIGH-baseline_profile.xml | TEMP_FedRAMP_HIGH-baseline_profile.xml | What XSL file does the conversion? |

**9. Viability test of JSON -> XML conversion.** JSON examples should convert into XML without loss and back again.

| ID | Process | Primary | Secondary | Result |
| --- | --- | --- | --- | --- |
| 9.1 | XSLT | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.json | TEMP_NIST_SP-800-53_rev4_catalog.xml | What XSL file does the conversion? |
|  | XSLT | TEMP_NIST_SP-800-53_rev4_catalog.xml | TEMP_NIST_SP-800-53_rev4_catalog.json |  |
|  | Compare | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_catalog.json | TEMP_NIST_SP-800-53_rev4_catalog.json | Okay if the same |
| 9.2 | [Same three steps] | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.json | TEMP_NIST_SP-800-53_rev4_LOW-baseline_profile.json | What XSL file does the conversion? |
| 9.3 | [Same three steps] | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.json | TEMP_NIST_SP-800-53_rev4_LOW-baseline_profile.json | What XSL file does the conversion? |
| 9.4 | [Same three steps] | content\nist.gov\SP800-53\rev4\NIST_SP-800-53_rev4_LOW-baseline_profile.json | TEMP_NIST_SP-800-53_rev4_LOW-baseline_profile.json | What XSL file does the conversion? |
| 9.5 | [Same three steps] | content\fedramp.gov\FedRAMP_LOW-baseline_profile.json | TEMP_FedRAMP_LOW-baseline_profile.json | What XSL file does the conversion? |
| 9.6 | [Same three steps] | content\fedramp.gov\FedRAMP_MODERATE-baseline_profile.json | TEMP_FedRAMP_MODERATE-baseline_profile.json | What XSL file does the conversion? |
| 9.7 | [Same three steps] | content\fedramp.gov\FedRAMP_HIGH-baseline_profile.json | TEMP_FedRAMP_HIGH-baseline_profile.json | What XSL file does the conversion? |
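Several of the series above (3, 4, and 6) are date checks: the result is "Okay if primary predates secondary". For illustration, that rule reduces to a one-line modification-time comparison (the function name is mine, not the project's):

```python
import os

def generated_is_current(primary, secondary):
    """Freshness check: the generated artifact (secondary) passes only if
    its source (primary) was last modified no later than it was."""
    return os.path.getmtime(primary) <= os.path.getmtime(secondary)
```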

@howieavp76
Author

The refactor is complete: the Python classes are tightened up and the bash test scripts are now driven via configuration. Work is in progress on an "auto-build" of the conversion scripts, run before testing, to ensure the latest versions are used instead of legacy artifacts that may be out of sync.

@iMichaela
Contributor

5/2/2019

No progress has been made on this issue yet.

@david-waltermire david-waltermire added this to the OSCAL 1.0 M1 milestone May 8, 2019
@david-waltermire david-waltermire added the Scope: CI/CD Enhancements to the project's Continuous Integration and Continuous Delivery pipeline. label May 9, 2019
@brian-ruf
Contributor

5/9/2019

@david-waltermire-nist to schedule meeting with @howieavp76.

@david-waltermire
Contributor

This is being worked as part of #133 as well. We can close both once PR #358 is accepted.

@howieavp76
Author

howieavp76 commented May 23, 2019

Some updates for this week:

  • Removed the legacy pip packages and Python scripts in favor of AJV-CLI; these are now removed from the CI/CD build process in my branch.
  • Tested the XMLDIFF package against the custom-coded Python script. Some observations:
    • XMLDIFF is superior for XML files with small differences. I ran tests with identical files and with files that had 1-3 lines of minor changes, and it worked well.
    • When comparing actual round-trip conversions, XMLDIFF could not handle large differences between the files. The job would not complete (I gave it over an hour and it never finished).
    • The custom Python script, which uses recursion, is much more efficient. I was able to add XPath info to the output for debugging, per NIST's request.
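For illustration only (this is not the project's actual script), a recursive comparison with XPath-style debug output can be sketched with the standard library's ElementTree; this sketch ignores namespaces, tail text, and child reordering:

```python
import xml.etree.ElementTree as ET

def first_difference(a, b, path="/"):
    """Recursively compare two elements, returning (xpath, message) for the
    first mismatch, or None when the trees are equivalent. The XPath-like
    location speeds up debugging of large round-trip diffs."""
    here = path + a.tag
    if a.tag != b.tag:
        return here, f"tag {a.tag!r} != {b.tag!r}"
    if (a.text or "").strip() != (b.text or "").strip():
        return here, "text differs"
    if a.attrib != b.attrib:
        return here, "attributes differ"
    if len(a) != len(b):
        return here, f"child count {len(a)} != {len(b)}"
    for child_a, child_b in zip(a, b):
        diff = first_difference(child_a, child_b, here + "/")
        if diff is not None:
            return diff
    return None

def xml_equal(xml_a, xml_b):
    """Compare two XML strings for structural equality."""
    return first_difference(ET.fromstring(xml_a), ET.fromstring(xml_b)) is None
```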

Issues:

- We need another working session on CI/CD. I cannot get it working locally; I run into a read-only permission error when trying to commit code changes. I can work around it by adding a token, but that causes two problems: 1) the token is checked into a public repo, which is bad for security; 2) the token would break CI/CD on the NIST repo if checked into master there. I need guidance on how this is set up in the NIST master repo and will mimic that in my fork. I need to be able to run locally to plug the enhancements into the existing scripts.
- We need the working session to talk through errors when running tests locally. I can add things to the existing scripts, but they throw errors. I believe this is because values are injected as CI/CD variables that I cannot read locally; resolving the issue above will fix this. I want to understand what is being passed so I can troubleshoot locally and work through testing before committing code for review.

With these changes, we can reach 100% coverage of what is in the testing requirements document. We just need some troubleshooting sessions to plug everything into CI/CD properly and to establish a better workflow.

@david-waltermire
Contributor

This is a quick summary of what is left to do:

We need to wrap up items 8 and 9 above, focused on round trips of:

  1. XML (A) -> JSON (B) -> XML (C)
  2. JSON (A) -> XML (B) -> JSON (C)

#1 addresses item 8 above and #2 addresses item 9; @howieavp76 is working on these. Comparing the A and C versions will verify that the round trip is lossless. Minimally, we should implement #1. #2 gives a small amount of extra assurance but, IMHO, is not strictly needed to move forward.
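The A/C comparison can be framed as a generic harness that takes the two converters as parameters; the sketch below uses placeholder converters, since in the real pipeline the forward and backward steps would shell out to the XSLT transforms and the comparison would use a structural XML/JSON diff:

```python
def round_trip_ok(original, forward, backward, normalize=lambda doc: doc):
    """Convert A -> B -> C and compare C with A after normalization.
    A match indicates the forward/backward converter pair is lossless."""
    intermediate = forward(original)   # A -> B (e.g. XML -> JSON)
    restored = backward(intermediate)  # B -> C (e.g. JSON -> XML)
    return normalize(restored) == normalize(original)
```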

@howieavp76
Author

@david-waltermire-nist - #1 is working end to end in my branch for the XML round trip. It works perfectly for the NIST 800-53 docs but throws errors on the FedRAMP docs, which I am still troubleshooting. It has been refactored to use the config logic in the other bash scripts. I still have a few refactoring steps to do today from our call last week:

  • Write the temporary files to a Temp directory
  • Setup naming convention so their source is obvious
  • Output the temp files as artifacts into CI/CD for debugging
  • Fix the FedRAMP processing bug

Once all of this is working, the same code will address #2; it performs the same steps and scripts, just in reverse.

@howieavp76
Author

@david-waltermire-nist - Update from yesterday:

  • Files now written to a temp directory with an intelligent naming convention
  • Added temp files to the artifacts in CI/CD
  • Refactored Python code to work in CI/CD
  • Added job to CI/CD to execute the round trip XML->JSON->XML conversions

The job is executing, but I am working through the various bugs that are popping up. Remaining work:

  • Make the JAR file dynamic; I am working through how you did that for Saxon in your script and will refactor mine once I understand it
  • Need to fix FedRAMP content bugs

After those fixes are in, I will reverse the logic of my code, and we will have JSON->XML->JSON conversions working with the same code.
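As one way to "make the JAR file dynamic" (a sketch, not the project's actual script; the `saxon*.jar` pattern and paths are assumptions), a bash helper can glob for the newest matching jar instead of hard-coding a version:

```shell
#!/usr/bin/env bash
# find_saxon_jar DIR: print the newest saxon*.jar under DIR, or fail.
find_saxon_jar() {
    local jar
    jar="$(ls -t "$1"/saxon*.jar 2>/dev/null | head -n 1)"
    if [ -z "$jar" ]; then
        echo "error: no Saxon jar found in $1" >&2
        return 1
    fi
    printf '%s\n' "$jar"
}

# Typical use (paths are placeholders, not the project's layout):
# SAXON_JAR="$(find_saxon_jar build/lib)" || exit 1
# java -jar "$SAXON_JAR" -s:input.xml -xsl:transform.xsl -o:output.json
```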

@howieavp76
Author

All feedback addressed and PR #405 submitted for review and approval.

@david-waltermire
Contributor

I am going to close this, since the small amount of remaining work is tracked in issue #343 and PR #410.
