[Doc] Add doc for detect_anomalous_params (open-mmlab#6697)
* add doc for detect_anomalous_params

* add min version requirement

* fix version and key

* fix version
jshilong authored and ZwwWayne committed Jul 18, 2022
1 parent ea4a7b0 commit d07387b
Showing 2 changed files with 4 additions and 2 deletions.
docs/faq.md (2 additions, 1 deletion)

@@ -81,7 +81,8 @@ We list some common troubles faced by many users and their corresponding solutions
 - "RuntimeError: Expected to have finished reduction in the prior iteration before starting a new one"
   1. This error indicates that your module has parameters that were not used in producing loss. This phenomenon may be caused by running different branches in your code in DDP mode.
-  2. You can set `find_unused_parameters = True` in the config to solve the above problems or find those unused parameters manually.
+  2. You can set `find_unused_parameters = True` in the config to solve the above problems (but this will slow down the training speed).
+  3. If the version of your MMCV is >= 1.4.1, you can get the names of those unused parameters with `detect_anomalous_params=True` in the `optimizer_config` of the config.
 ## Evaluation
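The two options described in the FAQ entry above can be sketched as config fields in an mmdetection-style Python config. This is a minimal illustrative fragment, not the full config; the `grad_clip=None` value is a placeholder assumption, and `detect_anomalous_params` requires MMCV >= 1.4.1:

```python
# Illustrative config fragment; merge these fields into your own config file.

# Lets DDP tolerate parameters that receive no gradient (slows down training).
find_unused_parameters = True

optimizer_config = dict(
    grad_clip=None,  # placeholder; keep whatever your config already uses
    detect_anomalous_params=True,  # logs the names of parameters unused in producing loss
)
```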
docs_zh-CN/faq.md (2 additions, 1 deletion)

@@ -82,7 +82,8 @@
   3. Try mixed-precision training with the examples in `config/fp16`. `loss_scale` may need to be tuned for different models.
 - "RuntimeError: Expected to have finished reduction in the prior iteration before starting a new one"
   1. This error occurs when some parameters are not used in the forward pass; it is likely to happen when running different branches in DDP mode.
-  2. You can set `find_unused_parameters = True` in the config, or manually find which parameters are unused.
+  2. You can set `find_unused_parameters = True` in the config for training (this will slow down the training speed).
+  3. You can also set `detect_anomalous_params=True` in the `optimizer_config` of the config to find which parameters are unused, but this requires MMCV version >= 1.4.1.
 ## Evaluation
