From 0eefbacfac5d5c3c5114345e7d19e5a6282b95a0 Mon Sep 17 00:00:00 2001
From: Keith Achorn
Date: Tue, 30 Jul 2024 12:11:37 -0700
Subject: [PATCH] Update README.md

Signed-off-by: Keith Achorn
---
 pytorch/README.md | 13 +++++++++++++
 1 file changed, 13 insertions(+)

diff --git a/pytorch/README.md b/pytorch/README.md
index d94070cfc..a8c4d775a 100644
--- a/pytorch/README.md
+++ b/pytorch/README.md
@@ -332,6 +332,19 @@ You can find the list of services below for each container in the group:
 | `xpu-jupyter` | Adds Jupyter notebook server to GPU image |
 | `serving` | [TorchServe*] |
 
+## MLPerf Optimized Workloads
+
+The following images are available for MLPerf-optimized workloads. Instructions are available [here](https://www.intel.com/content/www/us/en/developer/articles/guide/get-started-mlperf-intel-optimized-docker-images.html).
+
+| Tag(s) | Base OS | MLPerf Round | Target Platform |
+| --------------------------------- | ---------------- | ---------------- | ------------------------------- |
+| `mlperf-inference-4.1-resnet50` | [rockylinux:8.7] | [Inference v4.1] | Intel(R) Xeon(R) Platinum 8592+ |
+| `mlperf-inference-4.1-retinanet` | [ubuntu:22.04] | [Inference v4.1] | Intel(R) Xeon(R) Platinum 8592+ |
+| `mlperf-inference-4.1-gptj` | [ubuntu:22.04] | [Inference v4.1] | Intel(R) Xeon(R) Platinum 8592+ |
+| `mlperf-inference-4.1-bert` | [ubuntu:22.04] | [Inference v4.1] | Intel(R) Xeon(R) Platinum 8592+ |
+| `mlperf-inference-4.1-dlrmv2` | [rockylinux:8.7] | [Inference v4.1] | Intel(R) Xeon(R) Platinum 8592+ |
+| `mlperf-inference-4.1-3dunet` | [ubuntu:22.04] | [Inference v4.1] | Intel(R) Xeon(R) Platinum 8592+ |
+
 ## License
 
 View the [License](https://github.com/intel/intel-extension-for-pytorch/blob/main/LICENSE) for the [Intel® Extension for PyTorch*].
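
A minimal usage sketch for the tags added by this patch follows. The registry path below is an assumption for illustration only and is not stated in the patch; the actual repository and benchmark run commands are given in the linked Intel MLPerf instructions, so substitute those values.

```bash
# Assumed registry path for illustration only -- replace it with the repository
# named in the Intel MLPerf instructions linked in the README section above.
IMAGE=intel/intel-optimized-pytorch:mlperf-inference-4.1-resnet50

# Pull the tagged image and open an interactive shell inside it; the benchmark
# run scripts themselves are described in the linked guide.
docker pull "${IMAGE}"
docker run --rm -it "${IMAGE}" /bin/bash
```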