Commit e4223fa

🌐 [i18n-KO] Translated main_classes/optimizer_schedules.md to Korean (#39713)
* docs: ko: main_classes/optimizer_schedules
* feat: nmt draft
* fix: improve TOC anchors and expressions in optimizer_schedules
  - Add TOC anchors to all section headers
  - Fix terminology and improve Korean expressions
* fix: Correct translation of 'weight decay fixed' to '가중치 감쇠가 적용된' ("with weight decay applied")
  Changed '가중치 감쇠가 수정된' ("with weight decay modified") to '가중치 감쇠가 적용된' for a more accurate translation of 'weight decay fixed' in the context of optimization.
* fix: Use a more natural Korean inheritance expression
  Changed '에서 상속받는' to '을 상속받는' to follow natural Korean grammar for "inherits from".
* fix: Use the consistent '미세 조정' ("fine-tuning") translation for 'finetuned models'
  Changed '파인튜닝된' (transliterated "fine-tuned") to '미세 조정된 모델' to follow the established translation glossary for 'finetuned models'.
1 parent 9e21e50 · commit e4223fa

File tree: 2 files changed, +78 −2 lines changed

docs/source/ko/_toctree.yml

Lines changed: 2 additions & 2 deletions

```diff
@@ -412,8 +412,8 @@
       title: 텍스트 생성
     - local: main_classes/onnx
       title: ONNX
-    - local: in_translation
-      title: (번역중) Optimization
+    - local: main_classes/optimizer_schedules
+      title: 최적화
     - local: main_classes/output
       title: 모델 출력
     - local: main_classes/peft
```
docs/source/ko/main_classes/optimizer_schedules.md (new file)

Lines changed: 76 additions & 0 deletions

```diff
+<!--Copyright 2020 The HuggingFace Team. All rights reserved.
+
+Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
+the License. You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on
+an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the
+specific language governing permissions and limitations under the License.
+
+⚠️ Note that this file is in Markdown but contain specific syntax for our doc-builder (similar to MDX) that may not be
+rendered properly in your Markdown viewer.
+
+-->
+
+# 최적화[[optimization]]
+
+`.optimization` 모듈은 다음을 제공합니다:
+
+- 미세 조정된 모델에 사용할 수 있는 가중치 감쇠가 적용된 옵티마이저
+- `_LRSchedule`을 상속받는 스케줄 객체 형태의 여러 스케줄
+- 여러 배치의 그래디언트를 누적하는 그래디언트 누적 클래스
+
+
+## AdaFactor (PyTorch)[[transformers.Adafactor]]
+
+[[autodoc]] Adafactor
+
+## AdamWeightDecay (TensorFlow)[[transformers.AdamWeightDecay]]
+
+[[autodoc]] AdamWeightDecay
+
+[[autodoc]] create_optimizer
+
```
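For context on the "weight decay fixed" (가중치 감쇠가 적용된) wording the translated intro discusses: it refers to decoupled weight decay as in AdamW-style optimizers, where the decay shrinks the weight directly instead of being folded into the gradient. A minimal pure-Python sketch of one step, only to show where the decay enters; all names, defaults, and the scalar simplification are illustrative, not the transformers implementation:

```python
import math

def adamw_step(w, grad, m, v, t,
               lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, wd=0.01):
    """One Adam step with *decoupled* weight decay (illustrative sketch).

    Classic L2 regularization would add `wd * w` into `grad` before the
    adaptive scaling; decoupled ("fixed") weight decay instead shrinks
    the weight directly, after the gradient-based update.
    """
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)  # adaptive gradient step
    w = w - lr * wd * w                            # decoupled weight decay
    return w, m, v
```

With the decay decoupled, its effective strength no longer depends on the adaptively scaled gradient magnitude, which is the point of the "fixed" variant.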
```diff
+## 스케줄[[schedules]]
+
+### 학습률 스케줄 (PyTorch)[[transformers.SchedulerType]]
+
+[[autodoc]] SchedulerType
+
+[[autodoc]] get_scheduler
+
+[[autodoc]] get_constant_schedule
+
+[[autodoc]] get_constant_schedule_with_warmup
+
+<img alt="" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/warmup_constant_schedule.png"/>
+
+[[autodoc]] get_cosine_schedule_with_warmup
+
+<img alt="" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/warmup_cosine_schedule.png"/>
```
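The warmup-then-cosine curve referenced by the plot above can be sketched as a plain-Python learning-rate multiplier. This is a shape illustration only, not the library code; `num_cycles=0.5` (half a cosine wave, decaying to zero) mirrors a common convention but is an assumption here:

```python
import math

def cosine_with_warmup(step, num_warmup_steps, num_training_steps,
                       num_cycles=0.5):
    """Multiplier applied to the base LR: linear 0 -> 1 during warmup,
    then a cosine decay from 1 down to 0 (illustrative sketch)."""
    if step < num_warmup_steps:
        return step / max(1, num_warmup_steps)
    progress = ((step - num_warmup_steps)
                / max(1, num_training_steps - num_warmup_steps))
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * 2.0 * num_cycles * progress)))
```

Halfway through warmup the multiplier is 0.5; at the end of warmup it peaks at 1.0; at the final step it has decayed to 0.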
```diff
+
+[[autodoc]] get_cosine_with_hard_restarts_schedule_with_warmup
+
+<img alt="" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/warmup_cosine_hard_restarts_schedule.png"/>
+
+[[autodoc]] get_linear_schedule_with_warmup
+
+<img alt="" src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/warmup_linear_schedule.png"/>
```
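The linear-warmup-then-linear-decay shape referenced by the plot above is even simpler to sketch in plain Python (again an illustration of the curve, not the library implementation; parameter names are chosen here for readability):

```python
def linear_with_warmup(step, num_warmup_steps, num_training_steps):
    """Multiplier applied to the base LR: linear 0 -> 1 during warmup,
    then linear 1 -> 0 until the end of training (illustrative sketch)."""
    if step < num_warmup_steps:
        return step / max(1, num_warmup_steps)
    return max(0.0, (num_training_steps - step)
               / max(1, num_training_steps - num_warmup_steps))
```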
```diff
+
+[[autodoc]] get_polynomial_decay_schedule_with_warmup
+
+[[autodoc]] get_inverse_sqrt_schedule
+
+[[autodoc]] get_wsd_schedule
+
+### 웜업 (TensorFlow)[[transformers.WarmUp]]
+
+[[autodoc]] WarmUp
+
+## 그래디언트 전략[[gradient-strategies]]
+
+### GradientAccumulator (TensorFlow)[[transformers.GradientAccumulator]]
+
+[[autodoc]] GradientAccumulator
```
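The gradient-accumulation idea documented above — sum gradients over several micro-batches, then apply a single optimizer step as if one large batch had been used — can be sketched framework-free. This is a toy scalar illustration, not the TensorFlow `GradientAccumulator` API; all names are made up:

```python
class ToyGradientAccumulator:
    """Sums scalar 'gradients' across micro-batches (illustrative sketch)."""

    def __init__(self):
        self.gradient = 0.0  # running sum of accumulated gradients
        self.step = 0        # number of micro-batches accumulated

    def __call__(self, grad):
        self.gradient += grad
        self.step += 1

    def reset(self):
        self.gradient, self.step = 0.0, 0


def apply_accumulated(w, acc, lr):
    """One update using the mean of the accumulated gradients, then reset."""
    w = w - lr * acc.gradient / max(1, acc.step)
    acc.reset()
    return w


acc = ToyGradientAccumulator()
for g in (0.2, 0.4, 0.6):   # three micro-batches
    acc(g)
w = apply_accumulated(1.0, acc, lr=0.1)  # mean grad 0.4, so w ≈ 0.96
```

Accumulating lets a small-memory setup emulate a larger effective batch size: only the summed gradient is kept between micro-batches, and the optimizer runs once per accumulation cycle.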
