Supervised Contrastive Loss #518
base: staging
Conversation
Codecov Report
@@            Coverage Diff            @@
##           staging      #518   +/-   ##
=========================================
  Coverage    90.09%    90.09%
=========================================
  Files            7         7
  Lines          404       404
=========================================
  Hits           364       364
  Misses          40        40
@rflperry Does this PR help your query about contrastive loss?
Yeah, it seems to match my results here: the transfer ability goes down, which I find interesting, though I'm still intrigued by the reason why. Is it really worth adding if it's always worse? I forget why I had multiple different results with different labels.
My takeaways/summary:
@waleeattia Put figure in correct folder & save in PDF format.
@PSSF23 fixed!
Remove commented code & unnecessary prints. After these LGTM.
@PSSF23 Perfect, just made those changes. Thank you!
Still some commented code remaining in benchmarks/cifar_exp/plot_compare_two_algos.py
@waleeattia
@PSSF23 Sorry I missed that, it should be good now.
Reference issue
#426
Type of change
Implementing supervised contrastive loss
Adding plotting script to compare accuracies and transfer efficiencies
What does this implement/fix?
Supervised contrastive loss explicitly trains the progressive learning network's transformer by penalizing samples of different classes that lie close to one another in the embedding space. The new script compares two DNN algorithms by plotting the difference between their accuracies and transfer efficiencies. With supervised contrastive loss, accuracy improves by 6 percent over the PL network trained with categorical cross-entropy.
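The loss described above follows the standard supervised contrastive formulation (pull same-class embeddings together, push different-class embeddings apart). A minimal NumPy sketch of that loss, not the PR's actual implementation; the function and argument names are illustrative:

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss over a batch of embeddings.

    Each anchor's same-class samples (excluding itself) are treated as
    positives; all other samples in the batch serve as negatives.
    """
    # L2-normalize so similarities are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                    # pairwise scaled similarities
    n = len(labels)
    logits_mask = ~np.eye(n, dtype=bool)           # exclude self-similarity
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    exp_sim = np.exp(sim) * logits_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # positives: same label as the anchor, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & logits_mask
    pos_counts = pos_mask.sum(axis=1)
    mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1) / np.maximum(pos_counts, 1)
    # average the negated log-probability over anchors that have positives
    return -(mean_log_prob_pos[pos_counts > 0]).mean()
```

A batch whose classes form tight clusters should score a lower loss than one whose same-class samples are scattered among other classes.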
Additional information
NDD 2021