
Cleanup of cross entropy calculation #38

Merged
merged 2 commits into main from ce_fix on Sep 9, 2021
Conversation

@cDenius (Contributor) commented on Sep 1, 2021

The cross entropy calculation now properly weights results by the number of samples. In addition, the PR adds documentation, unit tests, and normalisation of predictions.

Closes #47.
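
For illustration, a minimal sketch of the described behaviour: predictions are normalised to per-sample probability distributions, and the summed loss is divided by the number of samples. The function name, signature, and NumPy-based implementation are assumptions for this sketch; the actual code added to maskit/utils.py may differ.

```python
import numpy as np

def cross_entropy(predictions, targets, epsilon=1e-12):
    """Cross entropy of one-hot ``targets`` against ``predictions``,
    averaged over the batch. Both arrays have shape (n_samples, n_classes).
    """
    predictions = np.asarray(predictions, dtype=float)
    targets = np.asarray(targets, dtype=float)
    # normalise predictions so every sample forms a probability distribution
    predictions = predictions / predictions.sum(axis=1, keepdims=True)
    # clip to keep log() finite when a predicted probability is zero
    predictions = np.clip(predictions, epsilon, 1.0)
    n_samples = predictions.shape[0]
    # divide by the number of samples so the loss is independent of batch size
    return -np.sum(targets * np.log(predictions)) / n_samples
```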

codecov bot commented on Sep 1, 2021

Codecov Report

Merging #38 (21eb8e9) into main (ccaff9b) will increase coverage by 5.73%.
The diff coverage is 100.00%.


@@            Coverage Diff             @@
##             main      #38      +/-   ##
==========================================
+ Coverage   74.54%   80.28%   +5.73%     
==========================================
  Files          15       16       +1     
  Lines        1096     1486     +390     
  Branches      105      162      +57     
==========================================
+ Hits          817     1193     +376     
- Misses        275      288      +13     
- Partials        4        5       +1     
| Impacted Files | Coverage Δ |
| --- | --- |
| maskit/utils.py | 100.00% <100.00%> (+100.00%) ⬆️ |
| tests/test_utils.py | 100.00% <100.00%> (ø) |
| main.py | 0.00% <0.00%> (ø) |
| maskit/masks.py | 100.00% <0.00%> (ø) |
| tests/test_masks.py | 100.00% <0.00%> (ø) |
| maskit/circuits.py | 76.19% <0.00%> (+76.19%) ⬆️ |

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0313002...21eb8e9.

* add documentation for cross entropy

* test cross entropy calculation with sklearn log_loss

* add normalization for predictions
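
The second commit above mentions checking the calculation against sklearn's log_loss. A test along those lines might look as follows; it reuses the hypothetical cross_entropy sketch from the description above, and the test name and random data are illustrative rather than the actual contents of tests/test_utils.py.

```python
import numpy as np
from sklearn.metrics import log_loss

def test_cross_entropy_matches_sklearn_log_loss():
    rng = np.random.default_rng(seed=1)
    predictions = rng.random((8, 3))
    predictions /= predictions.sum(axis=1, keepdims=True)  # rows sum to one
    labels = rng.integers(0, 3, size=8)
    targets = np.eye(3)[labels]  # one-hot encode the integer labels
    # sklearn's log_loss averages over samples by default (normalize=True),
    # which matches dividing the summed loss by the number of samples
    expected = log_loss(labels, predictions, labels=[0, 1, 2])
    assert np.isclose(cross_entropy(predictions, targets), expected)
```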
@eileen-kuehn changed the title from "cross entropy loss is now divided by batch size" to "Cleanup of cross entropy calculation" on Sep 9, 2021
@eileen-kuehn (Member) left a comment:


Great stuff! 🥇

@eileen-kuehn merged commit e135be3 into main on Sep 9, 2021
@eileen-kuehn deleted the ce_fix branch on September 9, 2021 at 10:28