Fix broken DeepSpeed documentation link (#18783)
* Fix broken link

* Trigger CI

Co-authored-by: Stas Bekman <stas@stason.org>
philschmid and stas00 authored Aug 29, 2022
1 parent 21f6f58 commit f2fbe44
Showing 1 changed file with 1 addition and 2 deletions.
3 changes: 1 addition & 2 deletions docs/source/en/main_classes/deepspeed.mdx
@@ -37,7 +37,7 @@ won't be possible on a single GPU.
  2. If you don't use [`Trainer`] and want to use your own Trainer where you integrated DeepSpeed
  yourself, core functions like `from_pretrained` and `from_config` include integration of essential
  parts of DeepSpeed like `zero.Init` for ZeRO stage 3 and higher. To tap into this feature read the docs on
- [deepspeed-non-trainer-integration](#deepspeed-non-trainer-integration).
+ [non-Trainer DeepSpeed Integration](#nontrainer-deepspeed-integration).

What is integrated:

@@ -1849,7 +1849,6 @@ In this case you usually need to raise the value of `initial_scale_power`. Setti



- <a id='deepspeed-non-trainer-integration'></a>

## Non-Trainer Deepspeed Integration

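For context on the section this diff touches: the non-Trainer integration is driven by a live DeepSpeed configuration, and ordering matters, since the config must exist before `from_pretrained` is called so that ZeRO stage 3 models are instantiated under `zero.Init`. A minimal sketch of that flow follows; the config values are illustrative placeholders, the checkpoint name and exact import path are assumptions based on the transformers docs of this era, and the library calls are shown only in comments since they require `transformers` and `deepspeed` to be installed.

```python
# Illustrative ZeRO stage 3 config fragment (placeholder values, not tuned
# recommendations). With a stage-3 config active, from_pretrained can shard
# model weights at load time instead of materializing them on one device.
ds_config = {
    "zero_optimization": {"stage": 3},
    "fp16": {"enabled": True},
    "train_micro_batch_size_per_gpu": 1,
}

# With transformers and deepspeed installed, the flow would be roughly
# (hedged sketch; see the linked "Non-Trainer Deepspeed Integration" docs):
#
#   from transformers.deepspeed import HfDeepSpeedConfig
#   from transformers import AutoModel
#   import deepspeed
#
#   dschf = HfDeepSpeedConfig(ds_config)  # must be created BEFORE the model,
#                                         # and kept alive (it enables zero.Init)
#   model = AutoModel.from_pretrained("bigscience/T0_3B")  # example checkpoint
#   engine, *_ = deepspeed.initialize(model=model, config_params=ds_config)
```

The key design point the docs describe is the ordering: `HfDeepSpeedConfig` flips the integration on globally, so constructing it after the model would lose the `zero.Init` sharding benefit for stage 3.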
