Conversation
# or the result with the highest score (given by the evaluator) will be the best result.
# scheduler = PruningScheduler(pruner, task_generator, finetuner=finetuner, speed_up=True, dummy_input=dummy_input, evaluator=evaluator)
scheduler = PruningScheduler(pruner, task_generator, finetuner=finetuner, speed_up=True, dummy_input=dummy_input, evaluator=None)
The interface is much more complicated than the original AGP Pruner's. It's good to decouple the modules (Scheduler, generator), but I suggest providing a simple interface on the outermost layer.
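To make the suggestion concrete, here is a minimal sketch of what a one-call facade over the decoupled components could look like. All class names (`SimplePruner`, `TaskGenerator`, `Scheduler`) and their stub behavior are hypothetical stand-ins for illustration, not NNI's actual API:

```python
class TaskGenerator:
    """Stub: yields a fixed number of pruning tasks."""
    def __init__(self, total_iteration):
        self.total_iteration = total_iteration

    def tasks(self):
        return [f"task-{i}" for i in range(self.total_iteration)]


class Scheduler:
    """Stub: runs every task produced by the generator."""
    def __init__(self, generator):
        self.generator = generator

    def compress(self):
        return [f"pruned({t})" for t in self.generator.tasks()]


class SimplePruner:
    """One-call facade: hides the generator/scheduler wiring from the user."""
    def __init__(self, total_iteration=10):
        self._scheduler = Scheduler(TaskGenerator(total_iteration))

    def compress(self):
        return self._scheduler.compress()


results = SimplePruner(total_iteration=3).compress()
print(results)  # ['pruned(task-0)', 'pruned(task-1)', 'pruned(task-2)']
```

The point of the facade is that a user who only wants AGP-style iterative pruning never has to construct the generator and scheduler separately, while power users can still wire the pieces by hand.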
Yes, it is a good suggestion. I demo a high-level interface in #4236; we can discuss it in that PR.
@J-shang, please schedule a meeting; let's discuss the user interface today.
import torch
from torch.nn import Module

from nni.common.graph_utils import TorchModuleGraph
from nni.compression.pytorch.utils import get_module_by_name
from nni.algorithms.compression.v2.pytorch.utils.pruning import get_module_by_name
The import path is too long~ please remove `pruning`.
thx, modified it~
# You can specify the log_dir; all intermidiate results and the best result will be saved under this folder.
# If you don't want to keep intermidiate results, you can set `keep_intermidiate_result=False`.
task_generator = AGPTaskGenerator(10, model, config_list, log_dir='.', keep_intermidiate_result=True)
'intermediate' or 'intermidiate'?
oh, thank you, I made this mistake everywhere TT. All places have been fixed.
config_list = [{'op_types': ['Conv2d'], 'sparsity': 0.8}]

# Make sure to initialize the task generator first, because the model passed to the generator should be an unwrapped model.
# If you want to initialize the pruner first, you can use the following code.
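A toy illustration of why the generator should see the unwrapped model: a pruner-style wrapper mutates the model's modules in place, so a generator constructed after wrapping would snapshot the wrapped state. The `Model`/`Pruner` classes below are simplified stand-ins, not NNI code:

```python
class Model:
    """Toy model: just a list of layer names."""
    def __init__(self):
        self.layers = ["conv1", "conv2"]


class Pruner:
    """Toy pruner that wraps/unwraps the model's layers in place."""
    def __init__(self, model):
        self.model = model
        self.wrapped = False

    def wrap_model(self):
        if not self.wrapped:
            self.model.layers = [f"wrapped({l})" for l in self.model.layers]
            self.wrapped = True

    def unwrap_model(self):
        if self.wrapped:
            self.model.layers = [l[len("wrapped("):-1] for l in self.model.layers]
            self.wrapped = False


model = Model()
pruner = Pruner(model)       # pruner initialized (and possibly wrapped) first ...
pruner.wrap_model()
pruner.unwrap_model()        # ... so unwrap before handing the model to the generator
generator_snapshot = list(model.layers)
print(generator_snapshot)    # ['conv1', 'conv2'] -- the clean, unwrapped view
```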
If the user calls `unwrap_model()` and initializes the pruner first, should the user call `wrap_model()` again before `scheduler.compress()`?
No, in fact, the `model` and `config_list` passed to the pruner won't be used at all. In the next update, I plan to support initializing the pruner in this way for the scheduler: `Pruner(model=None, config_list=None, ...)`