[Model Compression] update config list key #4074
Conversation
6f26574 to 2d3ab7e Compare
sub_config['op_names'] = [op_name]
sub_config['total_sparsity'] = sparsity_per_layer
new_config_list.append(sub_config)
elif 'max_sparsity_per_layer' in config and isinstance(config['max_sparsity_per_layer'], float):
Why do we need isinstance(config['max_sparsity_per_layer'], float) here?
If it is a float, convert it to a dict; skip it otherwise.
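A minimal sketch of that conversion (the op names and config layout here are placeholders for illustration, not the actual NNI implementation):

```python
# Hypothetical config; 'conv1' and 'conv2' are placeholder op names.
config = {'op_names': ['conv1', 'conv2'], 'max_sparsity_per_layer': 0.8}

# If the value is a single float, expand it into a per-op dict;
# if it is already a dict, this branch is skipped and the value kept as-is.
if isinstance(config.get('max_sparsity_per_layer'), float):
    config['max_sparsity_per_layer'] = {
        op_name: config['max_sparsity_per_layer'] for op_name in config['op_names']
    }

print(config['max_sparsity_per_layer'])  # {'conv1': 0.8, 'conv2': 0.8}
```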
got it
new_schema = And(old_schema, lambda n: validate_op_names(model, n, logger))
sub_schema[k] = new_schema

sub_schema = And(data_schema[0], lambda d: validate_op_types_op_names(d))
I think this line might not modify the sub_schema object in the list data_schema. Is that the expected behavior? Not sure because I haven't worked a lot with Schema.
yes, it's a bug, fixed it.
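A small sketch of the issue and the fix, with a placeholder schema and a hypothetical validator (not the actual NNI code). And(...) builds a new schema object, so the result has to be written back into the list:

```python
from schema import And, Schema

def validate_op_types_op_names(d):
    # Hypothetical check: require at least one of the two keys.
    return 'op_types' in d or 'op_names' in d

data_schema = [Schema({'total_sparsity': float}, ignore_extra_keys=True)]

# This alone does NOT change data_schema[0]; it only creates a new object.
sub_schema = And(data_schema[0], lambda d: validate_op_types_op_names(d))

# The fix: assign the wrapped schema back into the list.
data_schema[0] = sub_schema
```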
self.config_list = config_list_canonical(model, config_list)

def _validate_config_before_canonical(self, model: Module, config_list: List[Dict]):
    pass
what is this function used for?
This is for validating the config logic of the sub-pruner; it is called in validate_config().
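Roughly, the hook pattern looks like the following sketch (class names and the example check are made up for illustration; this is not the actual NNI class hierarchy):

```python
from typing import Dict, List
from torch.nn import Module

class BasePrunerSketch:
    def _validate_config_before_canonical(self, model: Module, config_list: List[Dict]):
        # Base class does nothing; sub-pruners override this with their own checks.
        pass

    def validate_config(self, model: Module, config_list: List[Dict]):
        # The hook runs before the config list is canonicalized.
        self._validate_config_before_canonical(model, config_list)
        # ... schema validation / canonicalization would follow here

class SubPrunerSketch(BasePrunerSketch):
    def _validate_config_before_canonical(self, model: Module, config_list: List[Dict]):
        for config in config_list:
            assert 'total_sparsity' in config or 'sparsity_per_layer' in config, \
                'each config must specify a sparsity'
```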
@@ -120,7 +120,7 @@ def calculate_metrics(self, data: Dict[str, Tensor]) -> Dict[str, Tensor]:
     metric = torch.ones(*reorder_tensor.size()[:len(keeped_dim)], device=reorder_tensor.device)
     across_dim = list(range(len(keeped_dim), len(reorder_dim)))
-    idxs = metric.nonzero()
+    idxs = metric.nonzero(as_tuple=False)
The default value of as_tuple is False, so why add it?
Because if we don't set as_tuple, a warning appears asking us to set the as_tuple value explicitly.
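For context, a tiny example of the call in question; on some PyTorch versions (around 1.5-1.7, as an assumption about the environment here) calling nonzero() without the argument prints a UserWarning asking the caller to pass as_tuple explicitly:

```python
import torch

metric = torch.ones(2, 3)

# Passing as_tuple=False keeps the default return type (a 2-D index tensor)
# while avoiding the warning that some PyTorch versions print when the
# argument is omitted.
idxs = metric.nonzero(as_tuple=False)
print(idxs.shape)  # torch.Size([6, 2]) -> one row of coordinates per nonzero element
```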
Looks good to go.
In iterative mode, sparsity_per_layer can hardly express the sparsity required at each iteration if speedup is applied, because each layer may have a different sparsity after speedup. In the next iteration, sparsity then has to be specified for each layer, which sparsity_per_layer cannot express. This PR keeps the user interface but converts sparsity and sparsity_per_layer to total_sparsity inside the pruner.

Convert max_sparsity_per_layer to min_retention_numel (Dict[op_name, int])? Is sparsity at the element level?
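A rough sketch of the conversion described above, based on the snippet earlier in this thread (the op names and surrounding details are illustrative, not NNI's actual implementation):

```python
# A user-facing config that applies the same sparsity to every listed op.
config = {'op_names': ['conv1', 'conv2'], 'sparsity_per_layer': 0.5}

new_config_list = []
for op_name in config['op_names']:
    # Copy everything except the keys being rewritten.
    sub_config = {k: v for k, v in config.items() if k not in ('op_names', 'sparsity_per_layer')}
    sub_config['op_names'] = [op_name]
    sub_config['total_sparsity'] = config['sparsity_per_layer']
    new_config_list.append(sub_config)

print(new_config_list)
# [{'op_names': ['conv1'], 'total_sparsity': 0.5},
#  {'op_names': ['conv2'], 'total_sparsity': 0.5}]
```

Expanding the config this way lets a later iteration overwrite total_sparsity independently per op, which a single sparsity_per_layer value cannot do.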