[CI] generate separate report files as artifacts #7995
Conversation
This looks good but I only see …
You must be looking at the wrong job? As I said, I only did it for one job at the moment - this one:
Ah, my bad, I misclicked!
@sshleifer or @sgugger - I configured github artifacts in … and I hope I did it right. I added:
I'm not sure whether this should be … I currently added it only to … For reference, the information on this setup is at these two pages: …
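For context, an artifact-upload step of this kind in a GitHub Actions workflow generally looks like the following - a generic sketch using the `actions/upload-artifact` action; the step name, artifact name, and path here are illustrative, not necessarily what this PR uses:

```yaml
# sketch only: names/paths are assumptions, not this PR's exact config
- name: Upload test reports
  # run even if an earlier test step failed, so failure reports still get uploaded
  if: always()
  uses: actions/upload-artifact@v2
  with:
    name: reports_tests
    path: reports
```

The `if: always()` condition matters here: without it, the upload step is skipped whenever the test step fails, which is exactly when the reports are most useful.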
I can't figure out how to run a github actions workflow against a branch. It looks good enough that I'm happy to just acknowledge that this could break on merge, in which case we'd send a follow-up PR.
Thank you for trying, @sshleifer. Ah, it's not finished yet, merge-wise - it's very rough around the edges.
Can you suggest a different way of testing this? This was your recommendation in the first place - to test it on a PR branch - except I can't test it since I don't have permissions to access the runners. Surely there must be a way of testing this? Alternatively, we could go as simple as creating a new github workflow job that simply runs a job of … Earlier you were talking about some possible problems with this - something about the job always being successful; I can't find that comment - but I am pretty sure there is no such issue with the approach I implemented, where …
Don't wait, just make a sensible choice that's easy to change. Lean towards fewer reports.
I don't know a good way of testing github actions. act looks promising, but I've never used it. The issue is not permissions; it is that github workflows, afaict, cannot be run against arbitrary branches. There is a "rerun all jobs" button, but it will just rerun on master. It would be incredibly valuable if you figured out how to test github actions locally. Here is everything I can see for self-push at https://github.com/huggingface/transformers/actions/runs/326336555/workflow
I agree with Sam that we can merge to test, and iterate if the reports look wrong (as soon as we're sure that the CircleCI part is good to go, which we can test on this PR). From what I understand, the PR adds a new job, so it does not break the existing ones/reports.
I will work on completing this, and then we can put it in for one CircleCI job and one github workflow and see how it goes - thank you for your feedback, @sshleifer and @sgugger
This is good to merge. |
This is incredibly cool! Thanks a lot @stas00!
* better reports
* a whole bunch of reports in their own files
* clean up
* improvements
* github artifacts experiment
* style
* complete the report generator with multiple improvements/fixes
* fix
* save all reports under one dir to easy upload
* can remove temp failing tests
* doc fix
* some cleanup
This PR solves #7887 to produce easier-to-use reports on CIs.

- adds an optional `--make_reports=id` to `pytest` - e.g. `--make_reports=examples`. It then uses that id to generate `report_{id}_{reports}.txt` - this was needed since some jobs, like scheduled jobs, have multiple pytest runs, so a unique string is required. Without this new flag everything remains as is - i.e. no reports get generated.
- the generated reports are all saved under `reports` to simplify the upload and are at the moment (assuming `id` was `tests`): …
- We no longer need any `pytest` flags to generate these - e.g. no need for `-rA` or `--durations=` - they are all done internally.
- The code itself is a bit of a hack that borrows a lot of `pytest` internals - but that's a start - I will see if I can find a public API to accomplish the same later, if this new functionality catches on. Actually, it's pretty safe since it calls the same report functions `pytest` uses, so it's unlikely to break.
- added the reporting to:
  - the `run_examples_torch` and `run_tests_torch` jobs
  - the `run_all_tests_torch_and_tf_gpu` job (this one generates 3 (!) groups of reports)
- Once these are tested on `master` and the results are satisfactory, I will add this new functionality to the rest of the jobs.

This is what you want to review: …
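The mechanics described above - a new `pytest` flag plus per-section report files written under `reports/` - can be sketched roughly as follows. This is a minimal illustration only: `write_reports` and the section names are invented for the example, and this is not the PR's actual `conftest.py` code.

```python
import os

# conftest.py-style option registration (sketch; names are assumptions)
def pytest_addoption(parser):
    # With no value given, behavior is unchanged and no reports are generated.
    parser.addoption(
        "--make_reports",
        action="store",
        default=None,
        help="generate report files, using this string as a unique id",
    )

def write_reports(report_id, sections, output_dir="reports"):
    """Write each section to {output_dir}/report_{report_id}_{section}.txt."""
    os.makedirs(output_dir, exist_ok=True)
    paths = []
    for section, lines in sections.items():
        path = os.path.join(output_dir, f"report_{report_id}_{section}.txt")
        with open(path, "w") as f:
            f.write("\n".join(lines) + "\n")
        paths.append(path)
    return paths

# e.g. with id "tests", this yields reports/report_tests_failures.txt etc.
paths = write_reports(
    "tests",
    {
        "failures": ["FAILED tests/test_example.py::test_demo"],
        "durations": ["0.12s call tests/test_example.py::test_demo"],
    },
)
```

Saving everything under a single `reports/` directory is what makes the artifact-upload step simple: the workflow only needs to point at that one path.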
Fixes: #7887
@sshleifer, @sgugger, @LysandreJik