
Create scorecard.yml (OpenSSF) #1708

Merged: 1 commit merged into open-quantum-safe:main on Jul 1, 2024

Conversation

@planetf1 (Contributor) commented Feb 27, 2024

Adds a GitHub Action to run the OpenSSF Scorecard tool and posts the results to Security -> Code scanning.

Fixes #1706

This is a new CI action. Its only effects are:

  • a small amount of resource used to run the workflow
  • new items created under Security -> Code scanning, visible to those with 'write' access to the repo

No branch protection rules are being changed to require successful completion of this action

No additional testing is required

Example output is shown in the referenced issue

  • [NO] Does this PR change the input/output behaviour of a cryptographic algorithm (i.e., does it change known answer test values)? (If so, a version bump will be required from x.y.z to x.(y+1).0.)
  • [NO] Does this PR change the list of algorithms available -- either adding, removing, or renaming? Does this PR otherwise change an API? (If so, PRs in fully supported downstream projects dependent on these, i.e., oqs-provider and OQS-OpenSSH will also need to be ready for review and merge by the time this is merged.)

As per https://github.com/ossf/scorecard it's possible some additional permissions may be needed (depending on the branch protection setup). These can be dealt with in a second iteration.

This PR also does not attempt to address any findings from the scorecard scan. It's purely activating the tool so that we can review and act on those findings subsequently.

Draft for

  • testing
  • minor changes/comments cleanup
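For context, a minimal workflow of the kind described above looks roughly like the sketch below. The pinned SHAs are placeholders, and the actual scorecard.yml in this PR may differ in its triggers and upload steps (the final version only attaches the SARIF file to the run rather than uploading it to code scanning):

    name: OpenSSF Scorecard
    on:
      pull_request:
        branches: [ main ]
      push:
        branches: [ main ]
    permissions: read-all
    jobs:
      analysis:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@<full-commit-sha>        # placeholder; pin to a real commit SHA
            with:
              persist-credentials: false
          - uses: ossf/scorecard-action@<full-commit-sha>   # placeholder
            with:
              results_file: results.sarif
              results_format: sarif
              publish_results: false
          # attach the SARIF to the run; pushing it to Security -> Code scanning
          # would instead use github/codeql-action/upload-sarif
          - uses: actions/upload-artifact@<full-commit-sha> # placeholder
            with:
              name: scorecard-results
              path: results.sarif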

@planetf1 (Contributor Author)

@baentsch many thanks for the comments!

@planetf1 (Contributor Author)

PR updated - it now only generates the .sarif file, and runs on both PRs and merges to main:

  • we can review the SARIF file and decide if we want to push to the code scanning results in this repo
  • I can squash the commits when we're ready for final review, if that's preferred

@planetf1 (Contributor Author)

An example SARIF file can be found as an attachment on https://github.com/open-quantum-safe/liboqs/actions/runs/8066088271?pr=1708

@planetf1 (Contributor Author)

In the scan above, we have two categories of issues reported:

  • Token Permissions - recommending some refinements to the permissions granted to the GitHub Actions workflows. Fixing these carries some risk of trial and error, but ultimately should not change results; it is just being more explicit about what we expect an action to be able to do.

  • Pinned Dependencies - this requires making the version of each GitHub Action we use explicit, so that a release can always be regenerated. The downside is the maintenance needed to keep up to date even within minor versions. I see the reasoning for it; it may work better alongside an automated tool like Dependabot (I know it works on major/minor versions, not sure about commit hashes).

I didn't notice any other issues in the SARIF file (viewed with the SARIF viewer for VS Code, from MS DevLabs).
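For illustration, the two kinds of fix discussed above (explicit permissions and pinned action versions) look roughly like this in a workflow file; the SHA and version shown are placeholders, not the values used in this PR:

    permissions:
      contents: read          # grant each workflow/job only what it actually needs
    jobs:
      build:
        runs-on: ubuntu-latest
        steps:
          # pin to a full commit SHA rather than a mutable tag; keep the human-readable
          # version in a trailing comment for reviewers and tools such as Dependabot
          - uses: actions/checkout@<full-commit-sha>  # vX.Y.Z (placeholder)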

@planetf1 (Contributor Author) commented Apr 9, 2024

Rebased, to review the current findings and correct them.

@planetf1 planetf1 force-pushed the issue1706 branch 2 times, most recently from e7c98d1 to c0aa784 on April 19, 2024 10:17
@planetf1 (Contributor Author)

Update on this PR - the good things:

  • added pinned SHAs for github actions - this seems to work, and pass the scorecard tests
  • added explicit permissions for github actions - this again seems to work & pass tests
  • updated the scorecard YAML itself, as I noticed 2 deprecations (one was in checkout... looking at that across the repo should be a distinct piece of work)

There are, however, still a few errors relating to the freezing of pip dependencies:

  • a build failure due to some dependencies in copy_from_upstream being pinned, but not all (a recursive definition may be needed?)
  • a warning from scorecard where the Unix build explicitly does a 'pip install'; this needs SHAs added (I'd only done requirements.txt)
  • a warning from scorecard despite requirements.txt having explicit SHAs; either a bug, or it may relate to the build failure, i.e. I only pinned what was specified, but not the full resolved list of dependencies

@planetf1 (Contributor Author)

The pip requirements.txt install was failing in the build because, once hashes are used, ALL dependencies need to be listed (recursively), and a few were missing.

Fixed by:

  • installing the existing dependencies into a clean virtual environment (running `python -m venv /tmp/env2; . /tmp/env2/bin/activate; pip install -r requirements.txt`)
  • freezing the dependencies to get a seed list (`pip freeze > requirements.txt2`)
  • running the following script to get the actual hashes:

    cat requirements.txt | while read line
    do
      hashin "$line"    # hashin adds a pinned, hashed entry for each package
    done

  • finally, discarding the original and using the new requirements.txt2

Also tested installing the new requirements.txt into a clean venv, which works fine.
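A quick way to repeat that verification (a sketch; the venv path is illustrative) is to install in hash-checking mode, which pip also enables automatically once any entry carries a hash:

    python -m venv /tmp/env-verify
    . /tmp/env-verify/bin/activate
    pip install --require-hashes -r requirements.txt   # fails if any resolved dependency lacks a pinned hash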

@planetf1 (Contributor Author)

@baentsch the checks are nearly clean, with the only OSSF negative points (score 8/10) caused by:

        run: env HOMEBREW_NO_AUTO_UPDATE=1 brew install ninja && pip3 install --break-system-packages pytest pytest-xdist pyyaml

We have the same in the Windows build. It's not detected there, but we should fix it in the same way.

Aside: the OSSF scorecard doesn't know about/check brew packages, which of course can also change the results!

To address the OSSF report, we would need to specify hashes. I did this for our import script (see above), which uses requirements.txt.

I think this is the cleanest way to do it here too - and in future it may be easier to maintain if we use Dependabot?

I could create the required 'requirements.txt' somewhere and refactor the above. The big question is where in the source tree: it could go in the .github/workflows directory, but is that clutter? Somewhere else? What do you think?
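For context, a hash-pinned entry in such a requirements file looks roughly like this; the package, version, and digests are placeholders, not part of the actual pinned set:

    # illustrative placeholder entry; each --hash is the sha256 digest of a published artifact
    pytest==8.2.0 \
        --hash=sha256:<digest-of-the-sdist> \
        --hash=sha256:<digest-of-the-wheel>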

@planetf1 (Contributor Author)

The Mac builds are using Python 3.12.x whilst the Windows build is using 3.9.x -- this is why the dependencies resolve differently. These versions are the standard Python versions supplied with the respective runners.

I initially used a common file, then a slight variation generated on a test machine running Windows but with Python 3.12.

Additional consideration: for consistent results, having a managed, specific version of Python may be advisable.

@planetf1 (Contributor Author)

Changed the approach used to build requirements.txt to:

pip install pip-tools
pip-compile requirements.txt --generate-hashes -o requirements_new.txt

This is done in a clean venv, and has been tested on macOS (3.12) and Windows (3.9).
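Putting the steps together, the regeneration sequence is roughly the following (a sketch assuming a POSIX shell; the venv path is illustrative):

    python -m venv /tmp/reqs-venv                 # throwaway clean environment
    . /tmp/reqs-venv/bin/activate
    pip install pip-tools
    pip-compile requirements.txt --generate-hashes -o requirements_new.txt
    # review requirements_new.txt, then replace the original once it installs cleanly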

@planetf1 (Contributor Author)

This is now ready for review:

  • Scorecard results are clean, with all previously reported warnings resolved
  • Results are only reported in the SARIF file attached to the action run

Follow-on activity (after merge)

  • Open a new PR to reset the schedule (i.e., the 'on:' triggers, per OpenSSF recommendations) - but get the main PR in first.
  • Add information about scorecard and the tools that aid in pinning dependencies to the documentation (location to be discussed)
  • Report back to the dev call on the change
  • Roll out similar changes to other repositories

@planetf1 planetf1 marked this pull request as ready for review April 19, 2024 15:01
@planetf1 planetf1 requested review from bhess and dstebila as code owners April 19, 2024 15:01
@planetf1 planetf1 requested a review from baentsch April 19, 2024 15:17
@ryjones (Contributor) commented Apr 19, 2024

This is really cool! Thanks for doing the heavy lifting.

@baentsch (Member)

This is now ready for review:

* Scorecard results are clean, with all previously reported warnings resolved

Very good, thanks. Also good to know that no software changes are/were required to get to this "green state".

* Results are only reported in the SARIF file attached to the action run

Follow-on activity (after merge)

* Open a new PR to reset the schedule (i.e., the 'on:' triggers, per OpenSSF recommendations) - but get the main PR in first.

Before doing that, I'd like to

  • get confirmation about which (downstream) project or user needs this; this would not be required if you can guarantee that this tooling will never generate alarms that may cause work to us (i.e., be 100% automated). Looking at the current changes I have doubts about this (i.e., that this has the potential to require manual interventions) but would like to hear your thoughts on this, @planetf1
  • how to ascertain that any warnings this tool may generate will not be considered blockers for use or progress, i.e.,
  • how people can themselves fix any errors this tooling may highlight (aka documentation; see below)
* Add information about scorecard and the tools that aid in pinning dependencies to the documentation (location to be discussed)

Such documentation seems prudent to have in this PR already (also see my single question) to avoid a situation in which things fail for users and they don't know where to look for documentation to "dig themselves" out of the proverbial hole this may have plunged them into. My suggestion would be to place this in a file "PROCEDURES.md" in the "docs" directory. It should already contain three sub-items:

  • DCO (why needed/benefits, e.g., with reference to project charter), e.g., to reference in PRs failing because of this
  • CBOM (why needed/benefits with documentation how to handle CI errors if so triggered, avoiding future instances of for example this).
  • Scorecard (why needed/benefits with documentation how to handle CI or local execution errors if so triggered)
* Report back to the dev call on the change

Good proposal, but this feature must be understandable (by reading documentation) by anyone not attending dev calls too.

* Roll out similar changes to other repositories

I suggest waiting on this until
a) this feature is proven not to cause any procedural complications to technical progress (quick-and-easy PRs, etc.)
b) we have finally decided which projects to carry forward at which support/maintenance level

@baentsch (Member) left a review comment

Please add "self-help" documentation as per discussion item(s).

@dstebila (Member)

Thanks Nigel for getting this started! And thanks Michael for a thorough analysis. We will definitely need some hand-holding as this lands and the first few times it pops up during a PR.

@planetf1 (Contributor Author)

To answer the remaining comments:

  • It's not possible to guarantee 100% that there will be no work arising from this change, for the following reasons:
    - If we want to add an additional Python module, or change the versions we are using, we have to do more than list the dependency or the version; we now also need to add the hash and list the entire set of dependencies after transitive resolution. Mitigation: document the process.
    - If GitHub were to upgrade the Python version supplied on the runners, we could find that a transitive dependency changes, leading to a build break until the above is done. Mitigation: use the setup-python action to pin a relatively fixed version of Python (a much, much smaller change with minor fixes), as in my earlier comment.

Both of the above seem reasonable to me, given that the intent of the scorecard-recommended approach is to avoid unknown changes to what is being produced - for example, not just a new version of a library, but a modified/hacked/illicit one.

  • We have two things:
    - a) The scorecard check - this is passive; it only gives us a score, so it will not block progress, though we should monitor and plan action if we regress.
    - b) The change to the pip install - this is within the PR, but see the previous bullet point.
  • How can people fix issues themselves? Documentation (see below).

PROCEDURES.md - I can add changes there, but would suggest that within this PR it focuses on scorecard, and we raise additional issues to handle DCO and SBOM (which I can also help with). I agree with the point about making sure the change is understandable from the contents of the repo/docs (I could also add a pointer/info in the requirements file itself for extra clarity).

Rolling out to other repos - it makes sense to allow a little time (a few weeks?) after the change is merged into liboqs (suggest a checkpoint at the TSC?) before working on the other repos -- and of course each should go through its own PR approval. As to which repos: ultimately, any that have supported code in them.

@planetf1 (Contributor Author) commented Apr 22, 2024

So in summary my plan (if reviewers agree) is to:

  • Add setup-python into each build script (using 3.12) - see the sketch at the end of this comment
  • Remove/rename the additional requirements file
  • Add a docs/PROCEDURES.md describing how the requirements file can be built with hashes etc., and link back from the requirements file or script as makes sense
  • Raise issue to document DCO process
  • Raise issue to document CBOM process

Will return to TSC/rollout once this PR is merged.
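For illustration, the setup-python step referred to in the first bullet would look roughly like this (a sketch; the SHA is a placeholder):

    - uses: actions/setup-python@<full-commit-sha>   # pin by commit SHA (placeholder)
      with:
        python-version: '3.12'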

@planetf1 planetf1 requested a review from baentsch June 17, 2024 18:54
@planetf1 (Contributor Author)

Hopefully I've addressed the review issues. No findings are reported. I propose:

  • This PR is merged (once final review is completed). This will continue running the scorecard check within PRs and not publish results.
  • A new PR will be opened to move the execution to a scheduled check as per OpenSSF recommendations (see the sketch at the end of this comment).
  • A new PR will be opened to migrate from Scorecard v4 to Scorecard v5.
  • Further PRs will be opened if/when any fixes are needed.

Trying to maintain a PR with build changes (albeit minor) is a little tricky to keep merged, as there is other activity in this area. I hope the changes are sufficient now to close this one out. In addition, proactive scans are already being done by the OpenSSF team, and getting our mitigations in place will improve our already-public score.
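For reference, the scheduled check mentioned above would mean a trigger along these lines (a sketch; the cron value is illustrative, not a decided schedule):

    on:
      schedule:
        - cron: '30 2 * * 1'   # e.g. weekly; the exact schedule would follow OpenSSF guidance
      push:
        branches: [ main ]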

@ryjones (Contributor) commented Jun 22, 2024

@planetf1 could you do a squash commit or similar?

docs/PROCEDURES.md - review comments (outdated, resolved)
Currently this is used within `.github/workflows` but the same principle applies elsewhere.

To make this easier, a version of the `requirements.txt` without hashes has been saved as `requirements.in`. This is
to make maintenance easier, but it is not used at script execution time.
Member:

So when updating a dependency version, one first has to update "requirements.in", then run the tool, right?

Contributor Author:

yes.

Member:

OK -- then why not make this explicit in the documentation? Along the lines "When updating dependencies, a contributor should first update 'requirements.in', then run the pip-tools as per the below, then check for correct operation and only then commit."

Contributor Author:

Reworded in a more prescriptive way. Hopefully this is clearer (in the next push).

docs/PROCEDURES.md - further review comments (outdated, resolved)
@baentsch (Member) left a review comment

Thanks for the documentation regarding version pinning.

Good to know there are different tools that update these hashes. That in my mind begs the question of what manual step is required (by committers as well as reviewers) to ascertain that the tool did the right thing/obtained a "proper" hash? Without this manual step, one could obviously simply run the tool "blindly" on the (original) "requirements.in" or (version comments in the) CI job .yml files, much like an automated preprocessing step that could be integrated into CI... Or is the recommendation to just check whether the hashes change in a PR?

@planetf1 (Contributor Author)

Thanks for the documentation regarding version pinning.

Good to know there are different tools that update these hashes. That in my mind begs the question of what manual step is required (by committers as well as reviewers) to ascertain that the tool did the right thing/obtained a "proper" hash? Without this manual step, one could obviously simply run the tool "blindly" on the (original) "requirements.in" or (version comments in the) CI job .yml files, much like an automated preprocessing step that could be integrated into CI... Or is the recommendation to just check whether the hashes change in a PR?

The reason for suggesting it's manual is that the author is then likely to be closely checking the action logs and output, and will be aware if any failures occur.

The exact hash could be searched for on GitHub to confirm it matches the version used.

I've added a note about this in PROCEDURES.md
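One way to spot-check a hash locally (a sketch; the package, version, and download directory are placeholders, and this is not necessarily the exact process documented in PROCEDURES.md):

    pip download pytest==8.2.0 --no-deps -d /tmp/dl   # fetch the exact artifact from PyPI
    pip hash /tmp/dl/*.whl                            # prints a sha256 to compare against requirements.txt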

@planetf1 (Contributor Author)

@baentsch Thanks for the comments. I've responded and updated the docs.
I've left conversations open, and will close once you've reviewed (or feel free to close yourself)
@ryjones I can squash once reviews are complete? Leaving the commits unsquashed may make it easier to see the deltas whilst we are reviewing.

@planetf1 (Contributor Author)

Updated docs, rebased - @baentsch please check the updates are ok for you.

@baentsch (Member)

@planetf1 Our etiquette is that the person who created a PR also merges it, squashing all commits into one with a reasonably small set of commit messages to let posterity know what this is really about (e.g., removing all commit messages related only to PR discussions).

@planetf1 (Contributor Author)

Thanks, that's helpful and what I expected. I'll sort the squash out soon (juggling tasks!) and merge.

… action permissions open-quantum-safe#1706

Signed-off-by: Nigel Jones <jonesn@uk.ibm.com>
@planetf1 (Contributor Author) commented Jul 1, 2024

Squashed to one commit following successful review

@baentsch @dstebila I do not have permissions to merge. Once tests complete would you be able to do this?
I know @baentsch you mentioned I should merge, but I'd need to be added to the appropriate access group - if you both think it makes sense we can sort that out ...

@ryjones (Contributor) commented Jul 1, 2024

@planetf1 do you want me to merge it?

@baentsch baentsch merged commit d2089c5 into open-quantum-safe:main Jul 1, 2024
62 checks passed
@baentsch (Member) commented Jul 1, 2024

Merged (shortened the title a bit, moving text to the contents). Thanks for the contribution @planetf1!
