style: reduce bit misspell code comments README #2

Open · wants to merge 7 commits into base: master
2 changes: 1 addition & 1 deletion README.md
@@ -108,7 +108,7 @@ For unstructured pruning, we strictly follow the ones adopted by the state of th
k = 0.8000 0.6400 0.5120 0.4100 0.3280 0.2620 0.2097 0.1678 0.1342 0.1074 0.0859 0.0687 0.0550 0.0440 0.0350 0.0280 0.0225 0.0180 0.0144 0.0115 0.0092
```

Please do not be mislead by this sparsity series. For each sparse ratio alone, the BiP prunes from dense model directly to the target sparsity ratio, rather than pruning based on the last checkpoint iteratively.
Please do not be misled by this sparsity series. For each sparse ratio alone, the BiP prunes from dense model directly to the target sparsity ratio, rather than pruning based on the last checkpoint iteratively.

For structured pruning, we follow the linear sparsity ratio series (Figure 4, A6, and A7):

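Reviewer note: the unstructured series above appears to be successive powers of 0.8, lightly rounded in the README (e.g. 0.8**4 = 0.4096 is listed as 0.4100); a minimal sketch to regenerate it, as an illustration rather than repository code:

```python
# Sketch: the 21 unstructured sparsity values above look like 0.8**n,
# lightly rounded in the README.
ks = [0.8 ** n for n in range(1, 22)]
print(" ".join(f"{k:.4f}" for k in ks))
```

Per the corrected sentence, each value is a standalone target reached by pruning the dense model directly, so the series fixes an evaluation grid rather than an iterative schedule.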
2 changes: 1 addition & 1 deletion args.py
@@ -51,7 +51,7 @@ def parse_args():
"--scaled-score-init",
action="store_true",
default=False,
help="Init importance scores proportaional to weights (default kaiming init)",
help="Init importance scores proportional to weights (default kaiming init)",
)

parser.add_argument(
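The corrected help text describes two score-init modes. A hypothetical sketch of that choice (the `init_scores` helper and its exact behavior are assumptions for illustration, not the repo's implementation):

```python
import math
import torch

def init_scores(weight: torch.Tensor, scaled: bool = False) -> torch.Tensor:
    # Hypothetical helper: with --scaled-score-init, importance scores are
    # proportional to the weights; otherwise fall back to the default
    # Kaiming init, mirroring the help text above.
    scores = torch.empty_like(weight)
    if scaled:
        scores.copy_(weight.abs())  # proportional to weight magnitude
    else:
        torch.nn.init.kaiming_uniform_(scores, a=math.sqrt(5))  # default init
    return scores
```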
2 changes: 1 addition & 1 deletion models/basic.py
@@ -8,7 +8,7 @@ def forward(self, input):
return input.view(input.size(0), -1)


# lin_i: i layer linear feedforard network.
# lin_i: i layer linear feedforward network.
def lin_1(input_dim=3072, num_classes=10):
model = nn.Sequential(nn.Flatten(), nn.Linear(input_dim, num_classes))
return model
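A quick usage sketch for the `lin_1` factory above; the dummy batch shape assumes flattened 3x32x32 CIFAR-10 inputs (3072 features, 10 classes):

```python
import torch

model = lin_1(input_dim=3072, num_classes=10)  # 1-layer linear network
x = torch.randn(8, 3, 32, 32)   # dummy batch; nn.Flatten -> (8, 3072)
logits = model(x)
print(logits.shape)             # torch.Size([8, 10])
```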
2 changes: 1 addition & 1 deletion models/resnet.py
@@ -214,7 +214,7 @@ def forward(self, x):
return self._forward_impl(x)


# NOTE: Only supporting default (kaiming_init) initializaition.
# NOTE: Only supporting default (kaiming_init) initialization.
def ResNet18(conv_layer, linear_layer, **kwargs):
return ResNet(conv_layer, linear_layer, BasicBlock, [2, 2, 2, 2], **kwargs)

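The factory signature suggests the layer classes are injected so the same architecture can be built dense or with masked/prunable variants; the `wrn_28_10` factory further down follows the same pattern. A hedged usage sketch, assuming `conv_layer`/`linear_layer` follow the `nn.Conv2d`/`nn.Linear` interfaces and that kwargs such as `num_classes` are forwarded to `ResNet`:

```python
import torch.nn as nn

# Sketch: build a plain dense ResNet-18 by injecting the standard layers.
# A pruning setup would pass masked subclasses instead (assumed pattern).
model = ResNet18(nn.Conv2d, nn.Linear, num_classes=10)
```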
2 changes: 1 addition & 1 deletion models/resnets.py
@@ -4,7 +4,7 @@
which is implemented for ImageNet and doesn't have option A for identity.
Moreover, most of the implementations on the web is copy-paste from
torchvision's resnet and has wrong number of params.
Proper ResNet-s for CIFAR10 (for fair comparision and etc.) has following
Proper ResNet-s for CIFAR10 (for fair comparison and etc.) has following
number of layers and parameters:
name | layers | params
ResNet20 | 20 | 0.27M
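For context, the depths in this docstring's table (ResNet20 shown; deeper variants are cut off by the diff) follow the 6n+2 rule from the original CIFAR ResNet paper: three stages of n two-conv basic blocks, plus the stem and classifier. A tiny sketch:

```python
# CIFAR ResNet depth = 6n + 2: ResNet20/32/44/56 for n = 3, 5, 7, 9.
for n in (3, 5, 7, 9):
    print(f"n={n} -> ResNet{6 * n + 2}")
```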
2 changes: 1 addition & 1 deletion models/wrn_cifar.py
@@ -162,7 +162,7 @@ def forward(self, x):
return self.fc(out)


# NOTE: Only supporting default (kaiming_init) initializaition.
# NOTE: Only supporting default (kaiming_init) initialization.
def wrn_28_10(conv_layer, linear_layer, **kwargs):
return WideResNet(conv_layer, linear_layer, depth=28, widen_factor=10, **kwargs)

2 changes: 1 addition & 1 deletion train.py
@@ -39,7 +39,7 @@ def main():
if args.exp_mode in ["prune", "finetune"] and not args.resume:
assert args.source_net, "Provide checkpoint to prune/finetune"

# create resutls dir (for logs, checkpoints, etc.)
# create results dir (for logs, checkpoints, etc.)
result_main_dir = os.path.join(Path(args.result_dir), args.exp_name, args.exp_mode)

if os.path.exists(result_main_dir):
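The corrected comment documents the layout built on the next line: `<result_dir>/<exp_name>/<exp_mode>`. A minimal sketch of that pattern, with illustrative directory names (train.py takes these from `args`):

```python
import os
from pathlib import Path

# Illustrative values for result_dir / exp_name / exp_mode.
result_main_dir = os.path.join(Path("results"), "cifar10_resnet18", "prune")
os.makedirs(result_main_dir, exist_ok=True)  # results/cifar10_resnet18/prune
```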