Commit 67118b5: Remove GC from README (#943)
michaelbenayoun authored Mar 31, 2023
1 parent 0e60153

Showing 1 changed file with 0 additions and 52 deletions.

README.md (52 changes: 0 additions & 52 deletions)
@@ -19,7 +19,6 @@ If you'd like to use the accelerator-specific features of 🤗 Optimum, you can
| [ONNX Runtime](https://onnxruntime.ai/docs/) | `python -m pip install optimum[onnxruntime]` |
| [Intel Neural Compressor](https://www.intel.com/content/www/us/en/developer/tools/oneapi/neural-compressor.html) | `python -m pip install optimum[neural-compressor]`|
| [OpenVINO](https://docs.openvino.ai/latest/index.html) | `python -m pip install optimum[openvino,nncf]` |
| [Graphcore IPU](https://www.graphcore.ai/products/ipu) | `python -m pip install optimum[graphcore]` |
| [Habana Gaudi Processor (HPU)](https://habana.ai/training/) | `python -m pip install optimum[habana]` |

To install from source:
@@ -148,15 +147,10 @@ You can find more examples in the [documentation](https://huggingface.co/docs/op
We support many providers:

- Habana's Gaudi processors
- Graphcore's IPUs
- ONNX Runtime (optimized for GPUs)

### Habana

<!--
To train transformers on Habana's Gaudi processors, 🤗 Optimum provides a `GaudiTrainer` that is very similar to the 🤗 Transformers [Trainer](https://huggingface.co/docs/transformers/main_classes/trainer). Here is a simple example:
-->
```diff
- from transformers import Trainer, TrainingArguments
+ from optimum.habana import GaudiTrainer, GaudiTrainingArguments
```

@@ -189,54 +183,8 @@ To train transformers on Habana's Gaudi processors, 🤗 Optimum provides a `GaudiTrainer`

You can find more examples in the [documentation](https://huggingface.co/docs/optimum/habana/quickstart) and in the [examples](https://github.com/huggingface/optimum-habana/tree/main/examples).


### Graphcore

<!--
To train transformers on Graphcore's IPUs, 🤗 Optimum provides an `IPUTrainer` that is very similar to the 🤗 Transformers [Trainer](https://huggingface.co/docs/transformers/main_classes/trainer). Here is a simple example:
-->
```diff
- from transformers import Trainer, TrainingArguments
+ from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

# Download a pretrained model from the Hub
model = AutoModelForXxx.from_pretrained("bert-base-uncased")

# Define the training arguments
- training_args = TrainingArguments(
+ training_args = IPUTrainingArguments(
output_dir="path/to/save/folder/",
+ ipu_config_name="Graphcore/bert-base-ipu", # Any IPUConfig on the Hub or stored locally
...
)

# Define the configuration to compile and put the model on the IPU
+ ipu_config = IPUConfig.from_pretrained(training_args.ipu_config_name)

# Initialize the trainer
- trainer = Trainer(
+ trainer = IPUTrainer(
model=model,
+   ipu_config=ipu_config,
args=training_args,
train_dataset=train_dataset
...
)

# Use Graphcore IPU for training!
trainer.train()
```

You can find more examples in the [documentation](https://huggingface.co/docs/optimum/graphcore/quickstart) and in the [examples](https://github.com/huggingface/optimum-graphcore/tree/main/examples).


### ONNX Runtime

<!--
To train transformers with ONNX Runtime's acceleration features, 🤗 Optimum provides an `ORTTrainer` that is very similar to the 🤗 Transformers [Trainer](https://huggingface.co/docs/transformers/main_classes/trainer). Here is a simple example:
-->

```diff
- from transformers import Trainer, TrainingArguments
+ from optimum.onnxruntime import ORTTrainer, ORTTrainingArguments
```
