changes in submission documentation (#1946)
anandhu-eng authored Nov 21, 2024
1 parent 8e4d59e commit 2ccd3a9
Showing 1 changed file with 83 additions and 73 deletions.
156 changes: 83 additions & 73 deletions docs/submission/index.md

Click [here](https://youtu.be/eI1Hoecc3ho) to view the recording of the workshop: Streamlining your MLPerf Inference results using CM.

Click [here](https://docs.google.com/presentation/d/1cmbpZUpVr78EIrhzyMBnnWnjJrD-mZ2vmSb-yETkTA8/edit?usp=sharing) to view the proposal slides for Common Automation for MLPerf Inference Submission Generation through CM.

=== "CM based benchmark"
If you have followed the `cm run` commands under the individual model pages in the [benchmarks](../index.md) directory, all the valid results will get aggregated to the `cm cache` folder. The following command can be used to browse the structure of the inference results folder generated by CM; a generic way to inspect the same folders on disk is sketched after the tree below.
### Get results folder structure
```
| ├── mlperf_log_detail.txt
| ├── mlperf_log_accuracy.json
| └── accuracy.txt
├── Compliance_Test_ID
|   ├── Performance
|   |   └── run_x/               # 1 run for all scenarios
|   |       ├── mlperf_log_summary.txt
|   |       └── mlperf_log_detail.txt
|   ├── Accuracy
|   |   ├── baseline_accuracy.txt
|   |   ├── compliance_accuracy.txt
|   |   ├── mlperf_log_accuracy.json
|   |   └── accuracy.txt
|   ├── verify_performance.txt
|   └── verify_accuracy.txt      # for TEST01 only
├── user.conf
└── measurements.json
```
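
If you want to sanity-check what has been aggregated before generating a submission, you can also inspect these folders directly on disk. This is a minimal, non-CM-specific sketch; the cache path below is the usual CM default and is an assumption that may differ on your machine:

```bash
# Typical default location of the CM cache (assumption; adjust if your CM repos live elsewhere).
CM_CACHE_DIR="$HOME/CM/repos/local/cache"

# List every aggregated performance summary found in the cache.
find "$CM_CACHE_DIR" -name mlperf_log_summary.txt | sort
```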


## Generate actual submission tree

=== "Closed Edge"
### Closed Edge Submission
```bash
cm run script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=closed \
--category=edge \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```

=== "Closed Datacenter"
### Closed Datacenter Submission
```bash
cm run script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=closed \
--category=datacenter \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```
=== "Open Edge"
### Open Edge Submission
```bash
cm run script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=open \
--category=edge \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```
=== "Open Datacenter"
### Closed Datacenter Submission
```bash
cm run script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=open \
--category=datacenter \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```
=== "Docker run"
### Docker run
=== "Closed"
### Closed Submission
```bash
cm docker script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=closed \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```

=== "Open"
### Open Submission
```bash
cm docker script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=open \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```

=== "Native run"
### Native run
=== "Closed"
### Closed Submission
```bash
cm run script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=closed \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```

=== "Open"
### Open Submission
```bash
cm run script --tags=generate,inference,submission \
--clean \
--preprocess_submission=yes \
--run-checker \
--submitter=MLCommons \
--tar=yes \
--env.CM_TAR_OUTFILE=submission.tar.gz \
--division=open \
--env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
--quiet
```

* Use `--hw_name="My system name"` to give a meaningful system name. Examples can be seen [here](https://github.com/mlcommons/inference_results_v3.0/tree/main/open/cTuning/systems).


* Use the `--results_dir` option to specify the results folder for non-CM-based benchmarks.

* Use the `--category` option to specify the category for which the submission is generated (`datacenter`/`edge`). By default, the category is taken from the `system_meta.json` file located in the SUT root directory.

* Use `--submission_base_dir` to specify the directory where the outputs of the submission preprocessing step and the final submission are dumped. There is no need to provide `--submission_dir` along with this. For `docker run`, use `--submission_base_dir` instead of `--submission_dir`. A combined example using these options is sketched below.
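
Putting these options together, here is a sketch of a native-run command for results that were not generated through CM. The division, category, system name, and paths below are placeholders, not defaults; adjust them to your submission:

```bash
cm run script --tags=generate,inference,submission \
   --clean \
   --preprocess_submission=yes \
   --run-checker \
   --submitter=MLCommons \
   --tar=yes \
   --env.CM_TAR_OUTFILE=submission.tar.gz \
   --division=open \
   --category=edge \
   --hw_name="My system name" \
   --results_dir=$HOME/mlperf_results \
   --submission_base_dir=$HOME/mlperf_submission \
   --env.CM_DETERMINE_MEMORY_CONFIGURATION=yes \
   --quiet
```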

The above command should generate a `submission.tar.gz` file if there are no submission-checker issues, and you can upload it to the [MLCommons Submission UI](https://submissions-ui.mlcommons.org/submission).
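
Before uploading, you can optionally list the contents of the archive to confirm the expected layout; this is plain `tar` usage and not CM-specific:

```bash
# Show the first entries of the generated submission archive.
tar -tzf submission.tar.gz | head -n 20
```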

## Aggregate Results in GitHub
