1881 Add tide metrics to documentation #1918 (Merged)
15 changes: 8 additions & 7 deletions docs/metrics.md
@@ -65,7 +65,7 @@ For targeted attacks, each metric will be reported twice for adversarial data: o

The `armory.metrics` module contains functionality to measure a variety of metrics:
- `armory.metrics.perturbation` metrics measure adversarial perturbations such as `lp` distance
- `armory.metrics.task` metrics measure task performance such as categorical accuracy
- `armory.metrics.statistical` metrics measure statistical quantities such as KL divergence
- `armory.metrics.poisoning` module contains helper functions to measure fairness statistics relevant to poisoning and filtering scenarios
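For example, a perturbation metric can be called directly on batches. This is a hedged sketch; it assumes the `module.batch.metric` naming pattern that the built-in metric table below uses:

```python
# Hedged sketch: measuring L2 perturbation distance on a batch. Assumes the
# module.batch.metric naming pattern shown in the built-in metric table below.
import numpy as np

from armory import metrics

x = np.zeros((2, 4))      # benign batch of 2 elements
x_adv = x + 0.1           # perturbed batch
dists = metrics.perturbation.batch.l2(x, x_adv)  # one distance per element
```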

@@ -152,8 +152,9 @@ def load_metrics(self):
| `object_detection_disappearance_rate` | `task.batch.object_detection_disappearance_rate` | Object Detection Disappearance Rate |
| `object_detection_hallucinations_per_image` | `task.batch.object_detection_hallucinations_per_image` | Object Detection Hallucinations Per Image |
| `object_detection_mAP` | `task.population.object_detection_mAP` | Object Detection mean average precision |
| `object_detection_mAP_tide` | `task.population.object_detection_mAP_tide` | [TIDE version](https://dbolya.github.io/tide/) of mean average precision for object detection |
| `object_detection_misclassification_rate` | `task.batch.object_detection_misclassification_rate` | Object Detection Misclassification Rate |
| `object_detection_true_positive_rate` | `task.batch.object_detection_true_positive_rate` | Object Detection True Positive Rate |
| `apricot_patch_targeted_AP_per_class` | `task.population.apricot_patch_targeted_AP_per_class` | OD metric applied to apricot scenario |
| `carla_od_AP_per_class` | `task.population.carla_od_AP_per_class` | OD metric applied to carla scenario |
| `carla_od_disappearance_rate` | `task.batch.carla_od_disappearance_rate` | OD metric applied to carla scenario |
@@ -235,7 +236,7 @@ from armory import metrics
@metrics.task.elementwise
def my_accuracy_metric(y_i, y_pred_i):
    return y_i == np.argmax(y_pred_i)
```

Armory performs all built-in metric operations on batches, not on individual elements, so the `elementwise` decorator also produces a batchwise version of the metric that loops through the individual elements and returns a batchwise result.
NOTE: when armory uses `get`, it retrieves the batchwise version of a metric.
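As a rough illustration (not armory's actual implementation), the batchwise wrapper produced by such a decorator could look like this:

```python
# Hypothetical sketch of an elementwise-to-batchwise wrapper; armory's real
# decorator may differ in details such as naming and metric registration.
def elementwise(metric):
    def batchwise(y_batch, y_pred_batch):
        # apply the elementwise metric to each (y_i, y_pred_i) pair
        return [metric(y_i, y_pred_i) for y_i, y_pred_i in zip(y_batch, y_pred_batch)]
    return batchwise
```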
@@ -431,7 +432,7 @@ More generally,
```python
probe.update(func1, func2, func3, my_var=y)
```
will publish the value `func3(func2(func1(y)))`.
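For example, where `x` is some numpy array already in scope, the following hedged example would publish its L2 norm:

```python
import numpy as np

# Publishes np.sqrt(np.sum(np.square(x))), i.e. the L2 norm of x, under
# "my_name.x_norm" (assuming this probe was created as get_probe("my_name")).
probe.update(np.square, np.sum, np.sqrt, x_norm=x)
```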

#### Interactive Testing

@@ -442,7 +443,7 @@ from armory.instrument import get_probe, MockSink
probe = get_probe("my_name")
probe.set_sink(MockSink())
probe.update(variable_name=17)
# update probe variable my_name.variable_name to 17
```
This will print all probe updates to the screen.

@@ -471,7 +472,7 @@ You will need to construct a meter, connect it to a hub, and (optionally) add a
To instantiate a Meter:
```python
from armory.instrument import Meter
meter = Meter(
    name,
    metric,
    *metric_arg_names,
@@ -534,7 +535,7 @@ meter = Meter(
    final_kwargs=None,
    record_final_only=True,
)
```
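After construction, the meter must be connected to a hub before it receives any probe updates. A minimal sketch, assuming the `get_hub` helper and the hub's `connect_meter` method from `armory.instrument`:

```python
# Minimal sketch: connect the meter to the hub so that it receives probe
# updates. Assumes get_hub() and Hub.connect_meter() from armory.instrument.
from armory.instrument import get_hub

get_hub().connect_meter(meter)
```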

A more succinct way of doing this, which also handles the case when only batches of `w` and `z` are supplied to the meter, is to use `GlobalMeter`.
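A hedged sketch of what that could look like, assuming `GlobalMeter` mirrors `Meter`'s `(name, metric, *metric_arg_names)` constructor (the real signature may differ):

```python
# Hypothetical sketch: a GlobalMeter that accumulates every batch published to
# "my_name.w" and "my_name.z" and applies the metric once over the full arrays.
# Assumes GlobalMeter mirrors Meter's (name, metric, *metric_arg_names) call.
from armory.instrument import GlobalMeter

meter = GlobalMeter(
    "my_global_metric",      # hypothetical record name
    my_metric,               # hypothetical metric over full arrays of w and z
    "my_name.w",
    "my_name.z",
)
```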