Fix weights to be identical to original commit #3
Conversation
A new Pull Request was created by @ssrothman for branch main. @smuzaffar, @aandvalenzuela, @iarspider, @clacaputo, @cmsbuild, @mandrenguyen can you please review it and eventually sign? Thanks.

test parameters

please test
+1

Summary: https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-eec557/31014/summary.html

The following merge commits were also included on top of IB + this PR after doing git cms-merge-topic:

You can see more details here: Comparison Summary
please test

+1

Summary: https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-eec557/31022/summary.html

The following merge commits were also included on top of IB + this PR after doing git cms-merge-topic:

You can see more details here: Comparison Summary
@ssrothman @kpedro88 any news on this? Neither of you was present at today's ORP meeting, where this issue was addressed...
@perrotta apologies, I'm away at a workshop this week. We also saw this behavior in our private tests last week and realized that there is some randomness inherent to the network itself. The random behavior has been there all along, but http://github.com/cms-sw/cmssw/pull/40814 may actually have been the first time that comparison tests were run for workflow 10805.31 (since it is not part of the short matrix), so it was not noticed before. This PR does correctly restore the original weights, but we need to make some further changes to make the network deterministic (this is a work in progress right now).
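As a hedged aside on where such "randomness inherent to the network" can come from even with identical weights: floating-point addition is not associative, so evaluating the same reduction in a different order can change the low-order bits of the result. The sketch below is a generic illustration in NumPy, not the actual CMSSW network code; all names in it are hypothetical.

```python
import numpy as np

# Generate a fixed set of float32 values (seeded, so the inputs themselves
# are reproducible).
rng = np.random.default_rng(0)
vals = rng.standard_normal(100_000).astype(np.float32)

# Sum the same values in two different orders.
s_forward = np.float32(0.0)
for v in vals:
    s_forward += v

s_reverse = np.float32(0.0)
for v in vals[::-1]:
    s_reverse += v

# The two orders typically differ in the last few bits, which is enough
# to flip comparison tests that expect bitwise-identical outputs.
print("orders agree:", s_forward == s_reverse)

# Fixing a canonical evaluation order makes the reduction deterministic:
# repeating the same ordered sum always yields the same bits.
s1 = np.float32(0.0)
for v in np.sort(vals):
    s1 += v
s2 = np.float32(0.0)
for v in np.sort(vals):
    s2 += v
print("canonical order reproducible:", s1 == s2)
```

This is only one possible source of nondeterminism (thread scheduling in parallel reductions is another); making the network fully deterministic, as mentioned above, would mean pinning down all such orderings.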
@ssrothman @kpedro88 Therefore:

+1

merge
When updating the model files in the previous PR, I accidentally used weights from a different training. The physics performance of the two trainings is identical, but for consistency this PR reverts the weights to be identical to those of the original training.