
Optional performance metrics #226

Closed
djdameln opened this issue Apr 12, 2022 · 0 comments · Fixed by #230
Assignees: djdameln
Labels: Enhancement (New feature or request), Metrics (Metric Component)
Milestone: v0.2.7

Comments

@djdameln (Contributor) commented:

Anomalib currently supports the F1 score and AUROC as performance metrics. Several users have requested that other metrics, such as ROC (#186) and the Brier score (#199), be added.

Some of these metrics can be expensive to compute, especially threshold-independent metrics such as AUROC. If we keep computing all available metrics, as we do now, the metric computation stage of the model pipeline could become a bottleneck. So, before adding new performance metrics to Anomalib, we need a mechanism that lets the user select which performance metrics to include in the evaluation.
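
To illustrate the kind of opt-in mechanism this issue asks for, here is a minimal sketch built on torchmetrics, which anomalib uses. The registry and the `create_metric_collection` helper are hypothetical names for this sketch, not anomalib's actual API:

```python
# Minimal sketch of opt-in metric selection (hypothetical helper, not
# anomalib's actual API), built on torchmetrics.
from torchmetrics import MetricCollection
from torchmetrics.classification import BinaryAUROC, BinaryF1Score, BinaryROC

# Registry of metrics the user can opt into; expensive threshold-independent
# metrics such as AUROC are only instantiated when explicitly requested.
AVAILABLE_METRICS = {
    "F1Score": BinaryF1Score,
    "AUROC": BinaryAUROC,
    "ROC": BinaryROC,
}


def create_metric_collection(metric_names: list[str]) -> MetricCollection:
    """Build a MetricCollection containing only the requested metrics."""
    unknown = set(metric_names) - AVAILABLE_METRICS.keys()
    if unknown:
        raise ValueError(f"Unknown metrics: {sorted(unknown)}")
    return MetricCollection(
        {name: AVAILABLE_METRICS[name]() for name in metric_names}
    )


# Example: a user who only needs a threshold-dependent score can skip AUROC
# entirely, so its cost is never paid during evaluation.
metrics = create_metric_collection(["F1Score"])
```

The metric names could come from the evaluation config, so that anything left out of the list is never instantiated or updated during the pipeline's metric stage.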

@djdameln added the Enhancement and Metrics labels on Apr 12, 2022
@djdameln added this to the v0.2.7 milestone on Apr 12, 2022
@djdameln self-assigned this on Apr 12, 2022
@djdameln mentioned this issue on Apr 13, 2022