
Update precision/recall after subsetting of confusion matrix #239

Open
thinkh opened this issue Oct 12, 2018 · 1 comment
thinkh commented Oct 12, 2018

Currently, the metric curves (precision, recall, F1 score) are always computed for the whole confusion matrix. When filtering classes (i.e., subsetting the confusion matrix), these curves must be recomputed based on the selected set of classes.

Performance metrics for 10 classes:

[screenshot: metric curves for 10 classes]

Same performance metrics for 4 classes:

[screenshot: metric curves for 4 classes]

@gfrogat Check how the subsetting/computation is handled.
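For reference, recomputing the per-class metrics on a class subset amounts to slicing the confusion matrix to the selected rows/columns and rederiving precision, recall, and F1 from the slice. A minimal sketch, assuming the matrix is a NumPy array with rows as true classes and columns as predicted classes (the function name `subset_metrics` is hypothetical, not part of this project's code):

```python
import numpy as np

def subset_metrics(cm, selected):
    """Recompute per-class precision, recall, and F1 for a subset
    of classes by first slicing the confusion matrix.

    cm:       (n, n) confusion matrix; rows = true, cols = predicted.
    selected: list of class indices to keep.
    """
    # Keep only rows and columns of the selected classes.
    sub = cm[np.ix_(selected, selected)].astype(float)
    tp = np.diag(sub)                # true positives per kept class
    pred = sub.sum(axis=0)           # predicted counts (column sums)
    true = sub.sum(axis=1)           # actual counts (row sums)
    # Guard against division by zero for empty classes.
    precision = np.divide(tp, pred, out=np.zeros_like(tp), where=pred > 0)
    recall = np.divide(tp, true, out=np.zeros_like(tp), where=true > 0)
    denom = precision + recall
    f1 = np.divide(2 * precision * recall, denom,
                   out=np.zeros_like(tp), where=denom > 0)
    return precision, recall, f1
```

Note that slicing before computing means misclassifications into the filtered-out classes no longer count against precision/recall, which is presumably the intended semantics when the user restricts the view to a class subset.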

thinkh commented Oct 12, 2018

It seems this is already handled correctly; please verify the computed performance values.
