Commit aaaaec6

modify config name (#544)
1 parent 867a43d commit aaaaec6

10 files changed (+28 −43 lines)

configs/coat/README.md (+5 −5)

@@ -14,8 +14,8 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 
 | Model | Context | Top-1 (%) | Top-5 (%) | Params (M) | Recipe | Weight |
 |-----------------|-----------|-------|------------|------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------|----------------------------------------------------------------------------------|
-| coat_lite_tiny | D910x8-G | 77.35 | 93.43 | 5.72 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/coat/coat_lite_tiny.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/coat/coat_lite_tiny_fa7bf894.ckpt) |
-| coat_lite_mini | D910x8-G | 78.51 | 93.84 | 11.01 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/coat/coat_lite_mini.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/coat/coat_lite_mini_55a52f05.ckpt) |
+| coat_lite_tiny | D910x8-G | 77.35 | 93.43 | 5.72 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/coat/coat_lite_tiny_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/coat/coat_lite_tiny_fa7bf894.ckpt) |
+| coat_lite_mini | D910x8-G | 78.51 | 93.84 | 11.01 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/coat/coat_lite_mini_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/coat/coat_lite_mini_55a52f05.ckpt) |
 
 </div>

@@ -42,7 +42,7 @@ It is easy to reproduce the reported results with the pre-defined training recip
 
 ```shell
 # distributed training on multiple GPU/Ascend devices
-mpirun -n 8 python train.py --config configs/coat/coat_lite_tiny.yaml --data_dir /path/to/imagenet
+mpirun -n 8 python train.py --config configs/coat/coat_lite_tiny_ascend.yaml --data_dir /path/to/imagenet
 ```
 
 > If the script is executed by the root user, the `--allow-run-as-root` parameter must be added to `mpirun`

@@ -59,15 +59,15 @@ If you want to train or finetune the model on a smaller dataset without distribu
 
 ```shell
 # standalone training on a CPU/GPU/Ascend device
-python train.py --config configs/coat/coat_lite_tiny.yaml --data_dir /path/to/dataset --distribute False
+python train.py --config configs/coat/coat_lite_tiny_ascend.yaml --data_dir /path/to/dataset --distribute False
 ```
 
 ### Validation
 
 To validate the accuracy of the trained model, you can use `validate.py` and parse the checkpoint path with `--ckpt_path`.
 
 ```shell
-python validate.py -c configs/coat/coat_lite_tiny.yaml --data_dir /path/to/imagenet --ckpt_path /path/to/ckpt
+python validate.py -c configs/coat/coat_lite_tiny_ascend.yaml --data_dir /path/to/imagenet --ckpt_path /path/to/ckpt
 ```
 
 ### Deployment
File renamed without changes.
File renamed without changes.
File renamed without changes.
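The rename follows one simple convention throughout the commit: each device-specific recipe gains an `_ascend` suffix before the `.yaml` extension. A minimal sketch of that mapping (the helper name `add_device_suffix` is ours for illustration, not part of mindcv):

```python
from pathlib import Path


def add_device_suffix(config_path: str, device: str = "ascend") -> str:
    """Insert a device suffix before the extension, e.g.
    configs/coat/coat_lite_tiny.yaml -> configs/coat/coat_lite_tiny_ascend.yaml."""
    p = Path(config_path)
    return str(p.with_name(f"{p.stem}_{device}{p.suffix}"))


print(add_device_suffix("configs/coat/coat_lite_tiny.yaml"))
# configs/coat/coat_lite_tiny_ascend.yaml
```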

configs/rexnet/README.md (+8 −8)

@@ -14,11 +14,11 @@ Our reproduced model performance on ImageNet-1K is reported as follows.
 
 | Model | Context | Top-1 (%) | Top-5 (%) | Params (M) | Recipe | Download |
 |-----------------|-----------|-------|-------|------------|------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------|
-| rexnet_x09 | D910x8-G | 77.07 | 93.41 | 4.13 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x09.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet0.9_acc77.07_bs64_8p.ckpt) |
-| rexnet_x10 | D910x8-G | 77.38 | 93.60 | 4.84 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x10.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet1.0_acc77.4_bs64_8p.ckpt) |
-| rexnet_x13 | D910x8-G | 79.06 | 94.28 | 7.61 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x13.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet1.3_acc79.06_bs64_8p.ckpt) |
-| rexnet_x15 | D910x8-G | 79.94 | 94.74 | 9.79 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x15.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet1.5_acc79.94_bs64_8p.ckpt) |
-| rexnet_x20 | D910x8-G | 80.6 | 94.99 | 16.45 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x20.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet2.0_acc80.6_bs64_8p.ckpt) |
+| rexnet_x09 | D910x8-G | 77.07 | 93.41 | 4.13 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x09_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet0.9_acc77.07_bs64_8p.ckpt) |
+| rexnet_x10 | D910x8-G | 77.38 | 93.60 | 4.84 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x10_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet1.0_acc77.4_bs64_8p.ckpt) |
+| rexnet_x13 | D910x8-G | 79.06 | 94.28 | 7.61 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x13_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet1.3_acc79.06_bs64_8p.ckpt) |
+| rexnet_x15 | D910x8-G | 79.94 | 94.74 | 9.79 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x15_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet1.5_acc79.94_bs64_8p.ckpt) |
+| rexnet_x20 | D910x8-G | 80.64 | 94.99 | 16.45 | [yaml](https://github.com/mindspore-lab/mindcv/blob/main/configs/rexnet/rexnet_x20_ascend.yaml) | [weights](https://download.mindspore.cn/toolkits/mindcv/rexnet/rexnet2.0_acc80.6_bs64_8p.ckpt) |
 
 </div>

@@ -45,7 +45,7 @@ It is easy to reproduce the reported results with the pre-defined training recip
 
 ```shell
 # distributed training on multiple GPU/Ascend devices
-mpirun -n 8 python train.py --config configs/rexnet/rexnet_x09.yaml --data_dir /path/to/imagenet
+mpirun -n 8 python train.py --config configs/rexnet/rexnet_x09_ascend.yaml --data_dir /path/to/imagenet
 ```
 
 > If the script is executed by the root user, the `--allow-run-as-root` parameter must be added to `mpirun`.

@@ -62,15 +62,15 @@ If you want to train or finetune the model on a smaller dataset without distribu
 
 ```shell
 # standalone training on a CPU/GPU/Ascend device
-python train.py --config configs/rexnet/rexnet_x09.yaml --data_dir /path/to/dataset --distribute False
+python train.py --config configs/rexnet/rexnet_x09_ascend.yaml --data_dir /path/to/dataset --distribute False
 ```
 
 ### Validation
 
 To validate the accuracy of the trained model, you can use `validate.py` and parse the checkpoint path with `--ckpt_path`.
 
 ```shell
-python validate.py -c configs/rexnet/rexnet_x09.yaml --data_dir /path/to/imagenet --ckpt_path /path/to/ckpt
+python validate.py -c configs/rexnet/rexnet_x09_ascend.yaml --data_dir /path/to/imagenet --ckpt_path /path/to/ckpt
 ```
 
 ### Deployment

configs/rexnet/rexnet_x09.yaml → configs/rexnet/rexnet_x09_ascend.yaml (+3 −6)

@@ -2,11 +2,11 @@
 mode: 0
 distribute: True
 num_parallel_workers: 16
-device_target: "Ascend"
+val_while_train: True
+val_interval: 1
 
 # dataset
 dataset: "imagenet"
-data_url: "/cache/data"
 data_dir: "/path/to/imagenet"
 shuffle: True
 dataset_download: False
@@ -27,9 +27,6 @@ ckpt_path: ""
 keep_checkpoint_max: 10
 ckpt_save_dir: "./ckpt"
 ckpt_save_policy: "top_k"
-val_split: "val"
-val_while_train: True
-val_interval: 1
 epoch_size: 400
 dataset_sink_mode: True
 amp_level: "O2"
@@ -40,7 +37,7 @@ loss: "CE"
 label_smoothing: 0.1
 
 # lr scheduler
-scheduler: "warmup_cosine_decay"
+scheduler: "cosine_decay"
 min_lr: 1.0e-6
 lr: 0.5
 warmup_epochs: 5
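Beyond the rename, each rexnet recipe changes in the same way: `device_target`, `data_url`, and `val_split` are dropped, `val_while_train`/`val_interval` move to the top of the file, and the scheduler key changes from `warmup_cosine_decay` to `cosine_decay`. A quick sanity check of the updated fragment, parsing the flat `key: value` lines by hand to avoid a PyYAML dependency (the fragment and parser are illustrative, not part of mindcv):

```python
UPDATED_FRAGMENT = """\
mode: 0
distribute: True
num_parallel_workers: 16
val_while_train: True
val_interval: 1

# dataset
dataset: "imagenet"
data_dir: "/path/to/imagenet"

# lr scheduler
scheduler: "cosine_decay"
min_lr: 1.0e-6
lr: 0.5
warmup_epochs: 5
"""


def parse_flat_yaml(text: str) -> dict:
    """Parse flat `key: value` lines; enough for this config fragment."""
    cfg = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition(":")
        cfg[key.strip()] = value.strip().strip('"')
    return cfg


cfg = parse_flat_yaml(UPDATED_FRAGMENT)
assert cfg["scheduler"] == "cosine_decay"
assert "device_target" not in cfg and "data_url" not in cfg
```

Note that values stay as strings here; a real loader would use `yaml.safe_load` and get typed values.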

configs/rexnet/rexnet_x10.yaml → configs/rexnet/rexnet_x10_ascend.yaml (+3 −6)

@@ -2,11 +2,11 @@
 mode: 0
 distribute: True
 num_parallel_workers: 16
-device_target: "Ascend"
+val_while_train: True
+val_interval: 1
 
 # dataset
 dataset: "imagenet"
-data_url: "/cache/data"
 data_dir: "/path/to/imagenet"
 shuffle: True
 dataset_download: False
@@ -27,9 +27,6 @@ ckpt_path: ""
 keep_checkpoint_max: 10
 ckpt_save_dir: "./ckpt"
 ckpt_save_policy: "top_k"
-val_split: "val"
-val_while_train: True
-val_interval: 1
 epoch_size: 400
 dataset_sink_mode: True
 amp_level: "O2"
@@ -40,7 +37,7 @@ loss: "CE"
 label_smoothing: 0.1
 
 # lr scheduler
-scheduler: "warmup_cosine_decay"
+scheduler: "cosine_decay"
 min_lr: 1.0e-6
 lr: 0.5
 warmup_epochs: 5

configs/rexnet/rexnet_x13.yaml → configs/rexnet/rexnet_x13_ascend.yaml (+3 −6)

@@ -2,11 +2,11 @@
 mode: 0
 distribute: True
 num_parallel_workers: 16
-device_target: "Ascend"
+val_while_train: True
+val_interval: 1
 
 # dataset
 dataset: "imagenet"
-data_url: "/cache/data"
 data_dir: "/path/to/imagenet"
 shuffle: True
 dataset_download: False
@@ -27,9 +27,6 @@ ckpt_path: ""
 keep_checkpoint_max: 10
 ckpt_save_dir: "./ckpt"
 ckpt_save_policy: "top_k"
-val_split: "val"
-val_while_train: True
-val_interval: 1
 epoch_size: 400
 dataset_sink_mode: True
 amp_level: "O2"
@@ -40,7 +37,7 @@ loss: "CE"
 label_smoothing: 0.1
 
 # lr scheduler
-scheduler: "warmup_cosine_decay"
+scheduler: "cosine_decay"
 min_lr: 1.0e-6
 lr: 0.5
 warmup_epochs: 5

configs/rexnet/rexnet_x15.yaml → configs/rexnet/rexnet_x15_ascend.yaml (+3 −6)

@@ -2,11 +2,11 @@
 mode: 0
 distribute: True
 num_parallel_workers: 16
-device_target: "Ascend"
+val_while_train: True
+val_interval: 1
 
 # dataset
 dataset: "imagenet"
-data_url: "/cache/data"
 data_dir: "/path/to/imagenet"
 shuffle: True
 dataset_download: False
@@ -27,9 +27,6 @@ ckpt_path: ""
 keep_checkpoint_max: 10
 ckpt_save_dir: "./ckpt"
 ckpt_save_policy: "top_k"
-val_split: "val"
-val_while_train: True
-val_interval: 1
 epoch_size: 400
 dataset_sink_mode: True
 amp_level: "O2"
@@ -41,7 +38,7 @@ loss: "CE"
 label_smoothing: 0.1
 
 # lr scheduler
-scheduler: "warmup_cosine_decay"
+scheduler: "cosine_decay"
 min_lr: 1.0e-6
 lr: 0.5
 warmup_epochs: 30

configs/rexnet/rexnet_x20.yaml → configs/rexnet/rexnet_x20_ascend.yaml (+3 −6)

@@ -2,11 +2,11 @@
 mode: 0
 distribute: True
 num_parallel_workers: 16
-device_target: "Ascend"
+val_while_train: True
+val_interval: 1
 
 # dataset
 dataset: "imagenet"
-data_url: "/cache/data"
 data_dir: "/path/to/imagenet"
 shuffle: True
 dataset_download: False
@@ -27,9 +27,6 @@ ckpt_path: ""
 keep_checkpoint_max: 10
 ckpt_save_dir: "./ckpt"
 ckpt_save_policy: "top_k"
-val_split: "val"
-val_while_train: True
-val_interval: 1
 epoch_size: 400
 dataset_sink_mode: True
 amp_level: "O2"
@@ -41,7 +38,7 @@ loss: "CE"
 label_smoothing: 0.1
 
 # lr scheduler
-scheduler: "warmup_cosine_decay"
+scheduler: "cosine_decay"
 min_lr: 1.0e-6
 lr: 0.5
 warmup_epochs: 30
