Documentation, test and normalization for cross entropy calculation #45
Conversation
Codecov Report
@@            Coverage Diff            @@
##           ce_fix      #45      +/-  ##
==========================================
- Coverage   76.43%   75.58%    -0.85%
==========================================
  Files          15       16        +1
  Lines        1379     1110      -269
  Branches      147      107       -40
==========================================
- Hits         1054      839      -215
+ Misses        321      267       -54
  Partials        4        4
Continue to review full report at Codecov.
@cDenius, you can happily ignore that one failing test regarding project coverage :)
I would change the test case so that our result is identical to the sklearn implementation.
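A minimal sketch of such a test, assuming the project exposes a cross entropy function that returns the per-sample mean (the `cross_entropy` name and signature here are hypothetical); it checks the result against `sklearn.metrics.log_loss`:

```python
import numpy as np
from sklearn.metrics import log_loss


def cross_entropy(y_true, y_pred, eps=1e-15):
    """Mean binary cross entropy, normalised by the number of samples."""
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))


def test_cross_entropy_matches_sklearn():
    y_true = np.array([0, 1, 1, 0])
    y_pred = np.array([0.1, 0.8, 0.9, 0.3])
    # sklearn's log_loss is already normalised by the sample size,
    # so both values should agree.
    assert np.isclose(cross_entropy(y_true, y_pred), log_loss(y_true, y_pred))
```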
LGTM
Looks good to me.
* Divide by sample size
* Document, test and normalize cross entropy calculation (#45)

Co-authored-by: Christof Wendenius <christof.wendenius@kit.edu>
Co-authored-by: Eileen Kuehn <eileen.kuehn@kit.edu>
This PR extends the documentation for the cross entropy calculation and adds normalisation as well as a simple unit test, based on the discussion on Stack Overflow.
Closes #47.
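As an illustration of the normalisation this PR describes, here is a hedged sketch (the function name and signature are illustrative, not the project's actual API): the summed negative log-likelihood is divided by the sample size, so the metric reports a per-sample mean instead of a total that grows with the dataset.

```python
import numpy as np


def normalized_cross_entropy(y_onehot, probs, eps=1e-15):
    """Mean cross entropy for one-hot labels and predicted class probabilities."""
    probs = np.clip(probs, eps, 1.0)
    total = -np.sum(y_onehot * np.log(probs))  # summed over samples and classes
    return total / y_onehot.shape[0]           # divide by the sample size
```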