This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

[Model Compression] Add Unit Test #4125

Merged

merged 19 commits into microsoft:master from compression_v2_ut on Oct 12, 2021

Conversation

@J-shang (Contributor) commented Aug 30, 2021

No description provided.

@J-shang J-shang marked this pull request as ready for review September 26, 2021 06:16
@QuanluZhang QuanluZhang requested review from zheng-ningxin and removed request for QuanluZhang September 27, 2021 00:44
# or the result with the highest score (given by evaluator) will be the best result.

# scheduler = PruningScheduler(pruner, task_generator, finetuner=finetuner, speed_up=True, dummy_input=dummy_input, evaluator=evaluator)
scheduler = PruningScheduler(pruner, task_generator, finetuner=finetuner, speed_up=True, dummy_input=dummy_input, evaluator=None)
@zheng-ningxin (Contributor) commented Oct 6, 2021

The interface is much more complicated than the original AGP Pruner's. It's good to decouple the modules (Scheduler, generator), but I suggest providing a simple interface at the outermost layer.

@J-shang (Author) replied

Yes, it is a good suggestion. I demo a high-level interface in #4236; we can discuss it in this PR.
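For reference, a rough sketch of what such an outermost wrapper could look like, assembled from the pieces already used in this PR. The class name and constructor signature below are assumptions for illustration only, not the interface proposed in #4236; AGPTaskGenerator and PruningScheduler are assumed to be imported as in the other snippets here.

# Hypothetical sketch: a single outermost class that hides the scheduler/generator wiring.
# Assumes AGPTaskGenerator and PruningScheduler are imported as elsewhere in this PR.
class SimpleIterativePruner:
    def __init__(self, model, config_list, pruner, total_iteration=10, finetuner=None,
                 evaluator=None, speed_up=False, dummy_input=None, log_dir='.'):
        # Create the task generator first so it receives the unwrapped model.
        task_generator = AGPTaskGenerator(total_iteration, model, config_list, log_dir=log_dir)
        self._scheduler = PruningScheduler(pruner, task_generator, finetuner=finetuner,
                                           speed_up=speed_up, dummy_input=dummy_input,
                                           evaluator=evaluator)

    def compress(self):
        # One call at the outermost layer; the best result is picked by the evaluator rule
        # described in the snippet above.
        return self._scheduler.compress()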

Contributor

@J-shang please schedule a meeting; let's discuss the user interface today.


import torch
from torch.nn import Module

from nni.common.graph_utils import TorchModuleGraph
from nni.compression.pytorch.utils import get_module_by_name
from nni.algorithms.compression.v2.pytorch.utils.pruning import get_module_by_name
@zheng-ningxin (Contributor) commented Oct 6, 2021

The import path is too long~ Please remove the `pruning` segment.

@J-shang (Author) replied

Thanks, modified it~

@QuanluZhang QuanluZhang requested a review from liuzhe-lz October 8, 2021 09:48

# you can specify the log_dir, all intermidiate results and best result will save under this folder.
# if you don't want to keep intermidiate results, you can set `keep_intermidiate_result=False`.
task_generator = AGPTaskGenerator(10, model, config_list, log_dir='.', keep_intermidiate_result=True)
Contributor

'intermediate' or 'intermidiate'?

@J-shang (Author) replied

Oh, thank you, I made this mistake everywhere TT. All occurrences have been fixed.

config_list = [{'op_types': ['Conv2d'], 'sparsity': 0.8}]

# Make sure to initialize the task generator first, because the model passed to the generator should be an unwrapped model.
# If you want to initialize the pruner first, you can use the following code.
Contributor

If the user calls unwrap_model() and initializes the pruner first, should they call wrap_model() again before scheduler.compress()?

@J-shang (Author) replied

No, in fact the model and config_list passed to the pruner won't be used at all. In the next update, I plan to support initializing the pruner this way for the scheduler: Pruner(model=None, config_list=None, ...)
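A minimal sketch of that planned pattern, reusing the snippets above (the Pruner name stands for any concrete pruner class, and finetuner/dummy_input are assumed to be defined as in the earlier snippets; Pruner(model=None, config_list=None, ...) is not supported yet as of this PR):

# Sketch of the planned usage; not yet supported at the time of this PR.
# Assumes AGPTaskGenerator, PruningScheduler, a concrete pruner class,
# finetuner, and dummy_input are defined/imported as in the snippets above.
task_generator = AGPTaskGenerator(10, model, config_list, log_dir='.', keep_intermediate_result=True)

# With the planned change, the pruner no longer needs the real model and config_list,
# because the scheduler supplies the model produced by the task generator.
pruner = Pruner(model=None, config_list=None)  # 'Pruner' stands for any concrete pruner class

scheduler = PruningScheduler(pruner, task_generator, finetuner=finetuner,
                             speed_up=True, dummy_input=dummy_input, evaluator=None)
scheduler.compress()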

@J-shang J-shang merged commit e3e17f4 into microsoft:master Oct 12, 2021
@J-shang J-shang deleted the compression_v2_ut branch October 25, 2021 03:25