Releases: microsoft/vision-evaluation
Adding ConfusionMatrixEvaluator
New evaluator that computes a confusion matrix (https://en.wikipedia.org/wiki/Confusion_matrix)
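As a sketch of the metric itself (plain numpy illustration, not the package's actual evaluator API), a confusion matrix counts how often each true class is predicted as each other class:

```python
import numpy as np

def confusion_matrix(targets, predictions, num_classes):
    """Count how often class i (row) is predicted as class j (column)."""
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(targets, predictions):
        cm[t, p] += 1
    return cm

# Diagonal entries are correct predictions; off-diagonal entries are confusions.
cm = confusion_matrix([0, 1, 2, 1], [0, 2, 2, 1], num_classes=3)
```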
What's Changed
- Make pycococap package optional dependency by @shonohs in #41
- adding confusionmatrix evaluator by @gegeo0 in #40
Full Changelog: 0.2.13...0.2.14
Fix MAP@K bug
Issue fix for MAP@K: top-K predictions were not ordered by score, which resulted in incorrect computation of mAP@k.
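The ordering matters because average precision at k accumulates precision at each rank where a relevant item appears, so the top-k list must be sorted by descending score first. A minimal sketch of one common AP@k definition (hypothetical helper, not the package's API; definitions vary in the normalizing denominator):

```python
def average_precision_at_k(retrieved, relevant, k):
    """AP@k: average of precision@i at each rank i holding a relevant item.

    `retrieved` must already be sorted by descending score -- skipping
    that sort is the kind of bug this release fixes.
    """
    if not relevant:
        return 0.0
    hits = 0
    precision_sum = 0.0
    for i, item in enumerate(retrieved[:k], start=1):
        if item in relevant:
            hits += 1
            precision_sum += hits / i
    return precision_sum / min(k, len(relevant))

# Relevant item at rank 1 and rank 3: (1/1 + 2/3) / 2
ap = average_precision_at_k(["a", "x", "b"], {"a", "b"}, k=3)
```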
Recall@k, Precision@k, Mean Average Precision@k, Precision-Recall Curve for image retrieval
Added the following metrics relevant to image retrieval:
Recall@k, Precision@k, Mean Average Precision@k, Precision-Recall Curve
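The first two metrics can be sketched as follows (plain Python illustration, not the package's API): precision@k asks how much of the top-k is relevant, recall@k asks how much of the relevant set the top-k covers.

```python
def precision_at_k(retrieved, relevant, k):
    """Fraction of the top-k retrieved items that are relevant."""
    return sum(1 for item in retrieved[:k] if item in relevant) / k

def recall_at_k(retrieved, relevant, k):
    """Fraction of all relevant items that appear in the top-k."""
    return sum(1 for item in retrieved[:k] if item in relevant) / len(relevant)

# 2 of the top 4 results are relevant; 2 of the 3 relevant items are found.
p = precision_at_k(["a", "x", "b", "y"], {"a", "b", "c"}, k=4)
r = recall_at_k(["a", "x", "b", "y"], {"a", "b", "c"}, k=4)
```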
Fix mAP evaluator bug
Fix bug: when category ids are not continuous and the prediction/label is a tuple instead of a list, vision-evaluation throws an exception when attempting to make the category ids continuous.
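The remapping step in question turns arbitrary category ids into contiguous 0..N-1 indices and must accept tuples as well as lists. A simplified sketch (hypothetical helper, not the package's actual code):

```python
def make_category_ids_continuous(predictions, category_ids):
    """Map arbitrary category ids (e.g. [3, 7, 42]) to contiguous 0..N-1.

    Iterating over `predictions` works for both tuples and lists -- the
    bug fixed here was an exception when the input was a tuple.
    """
    id_to_index = {cid: i for i, cid in enumerate(sorted(set(category_ids)))}
    return [id_to_index[cid] for cid in predictions]

# Tuple input with non-continuous ids: 3 -> 0, 7 -> 1, 42 -> 2
remapped = make_category_ids_continuous((42, 3, 7), [3, 7, 42])
```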
Support indices predictions for prediction filter
Update:
- Added support for index predictions in the prediction filter
- Added MeanLpError evaluator
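The mean Lp error averages the Lp distance between each prediction and its target; a minimal numpy sketch of that computation (not the evaluator's actual signature):

```python
import numpy as np

def mean_lp_error(targets, predictions, p=2):
    """Mean per-sample Lp distance between targets and predictions."""
    diffs = np.abs(np.asarray(targets, dtype=float) - np.asarray(predictions, dtype=float))
    # Lp norm along the last axis, then average over samples.
    return float(np.mean(np.sum(diffs ** p, axis=-1) ** (1.0 / p)))

# First sample: ||(3, 4)||_2 = 5; second sample: 0. Mean = 2.5.
err = mean_lp_error([[0.0, 0.0], [1.0, 1.0]], [[3.0, 4.0], [1.0, 1.0]])
```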
fix dependencies and home page link
0.2.9 fix dependencies and home page link (#31)
Fix AP for multilabel+macro
0.2.8 fix ap calculation under macro for multilabel classification (#30)
Implement group-wise evaluator
0.2.7 Added GroupWiseEvaluator (#29)
Improve efficiency of top1/5 accuracy and AP macro
Improve efficiency/memory of AP macro and top-k accuracy (#25)
Support image matting
0.2.5 Support image matting (#23)