Commit bcd059a

Remove obsolete research_projects directory (#4243)
Co-authored-by: behroozazarkhalili <ermiaazarkhalili>
Co-authored-by: Quentin Gallouédec <gallouedec.quentin@gmail.com>

1 parent 0e57b4a · commit bcd059a

22 files changed: +0 −2316 lines

docs/source/_toctree.yml

Lines changed: 0 additions & 2 deletions
@@ -61,8 +61,6 @@
     title: Sentiment Tuning
   - local: using_llama_models
     title: Training StackLlama
-  - local: detoxifying_a_lm
-    title: Detoxifying a Language Model
   - local: multi_adapter_rl
     title: Multi Adapter RLHF
   title: Examples

docs/source/detoxifying_a_lm.md

Lines changed: 0 additions & 201 deletions
This file was deleted.

docs/source/example_overview.md

Lines changed: 0 additions & 2 deletions
@@ -70,8 +70,6 @@ Here are also some easier-to-run colab notebooks that you can use to get started
 | [`examples/notebooks/gpt2-sentiment.ipynb`](https://github.com/huggingface/trl/tree/main/examples/notebooks/gpt2-sentiment.ipynb) | This notebook demonstrates how to reproduce the GPT2 imdb sentiment tuning example on a jupyter notebook. |
 | [`examples/notebooks/gpt2-control.ipynb`](https://github.com/huggingface/trl/tree/main/examples/notebooks/gpt2-control.ipynb) | This notebook demonstrates how to reproduce the GPT2 sentiment control example on a jupyter notebook. |
 
-We also have some other examples that are less maintained but can be used as a reference in [research_projects](https://github.com/huggingface/trl/tree/main/examples/research_projects). Check out this folder to find the scripts used for some research projects that used TRL (LM de-toxification, Stack-Llama, etc.)
-
 ## Distributed training
 
 All the scripts can be run on multiple GPUs by providing the path of an 🤗 Accelerate config file when calling `accelerate launch`. To launch one of them on one or multiple GPUs, run the following command (swapping `{NUM_GPUS}` with the number of GPUs in your machine and `--all_arguments_of_the_script` with your arguments).
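The docs context line above describes TRL's multi-GPU launch pattern. A minimal sketch of what such an invocation looks like, built as a string for illustration — the GPU count, script path, and argument placeholder below are assumptions, not values taken from this commit:

```shell
# Sketch of the `accelerate launch` pattern described above.
# NUM_GPUS, the script path, and its arguments are placeholders.
NUM_GPUS=2
LAUNCH_CMD="accelerate launch --num_processes ${NUM_GPUS} path_to_script.py --all_arguments_of_the_script"
echo "${LAUNCH_CMD}"
```

In practice the Accelerate config file mentioned in the docs would be supplied with `--config_file` in front of `--num_processes`.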

docs/source/peft_integration.md

Lines changed: 0 additions & 8 deletions
@@ -3,14 +3,6 @@
 The notebooks and scripts in these examples show how to use Low Rank Adaptation (LoRA) to fine-tune models in a memory efficient manner. Most of PEFT methods supported in peft library but note that some PEFT methods such as Prompt tuning are not supported.
 For more information on LoRA, see the [original paper](https://huggingface.co/papers/2106.09685).
 
-Here's an overview of the `peft`-enabled notebooks and scripts in the [trl repository](https://github.com/huggingface/trl/tree/main/examples):
-
-| File | Task | Description | Colab link |
-| ---| ---| --- |
-| [`stack_llama/rl_training.py`](https://github.com/huggingface/trl/blob/main/examples/research_projects/stack_llama/scripts/rl_training.py) | RLHF | Distributed fine-tuning of the 7b parameter LLaMA models with a learned reward model and `peft`. | |
-| [`stack_llama/reward_modeling.py`](https://github.com/huggingface/trl/blob/main/examples/research_projects/stack_llama/scripts/reward_modeling.py) | Reward Modeling | Distributed training of the 7b parameter LLaMA reward model with `peft`. | |
-| [`stack_llama/supervised_finetuning.py`](https://github.com/huggingface/trl/blob/main/examples/research_projects/stack_llama/scripts/supervised_finetuning.py) | SFT | Distributed instruction/supervised fine-tuning of the 7b parameter LLaMA model with `peft`. | |
-
 ## Installation
 
 Note: peft is in active development, so we install directly from their Github page.

examples/research_projects/README.md

Lines changed: 0 additions & 7 deletions
This file was deleted.

examples/research_projects/layer_skip/README.md

Lines changed: 0 additions & 15 deletions
This file was deleted.

examples/research_projects/layer_skip/scripts/benchmark_layer_skip.py

Lines changed: 0 additions & 77 deletions
This file was deleted.

examples/research_projects/layer_skip/scripts/config.py

Lines changed: 0 additions & 28 deletions
This file was deleted.
