
[Model Compression] update config list key #4074

Merged: 13 commits from compression_v2_config into microsoft:master on Aug 26, 2021

Conversation

@J-shang (Contributor) commented Aug 16, 2021

In iterative mode, `sparsity_per_layer` cannot express the sparsity required at each iteration when speedup is applied, because after speedup each layer may be left with a different sparsity. In the next iteration, sparsity then needs to be specified per layer, which `sparsity_per_layer` cannot express.

This PR keeps the user interface but converts `sparsity` and `sparsity_per_layer` to `total_sparsity` inside the pruner (see the sketch below).

Open questions:
- Convert `max_sparsity_per_layer` to `min_retention_numel` (`Dict[op_name, int]`)?
- Is sparsity measured at the element level?
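
For illustration, a minimal sketch of the canonicalization this PR performs, assuming a simplified config format (the helper name `canonicalize` and the `model_op_names` fallback argument are hypothetical; the real NNI code differs in details): a user-facing `sparsity` / `sparsity_per_layer` entry is expanded into one sub-config per op, each carrying `total_sparsity`.

from typing import Dict, List

def canonicalize(config_list: List[Dict], model_op_names: List[str]) -> List[Dict]:
    # Expand user-facing sparsity keys into one sub-config per op,
    # so a later iteration can assign a distinct sparsity to every layer.
    new_config_list = []
    for config in config_list:
        config = dict(config)  # do not mutate the user's config
        if 'sparsity' in config or 'sparsity_per_layer' in config:
            sparsity = config.pop('sparsity', None)
            sparsity = config.pop('sparsity_per_layer', sparsity)
            for op_name in config.get('op_names', model_op_names):
                sub_config = dict(config)
                sub_config['op_names'] = [op_name]
                sub_config['total_sparsity'] = sparsity
                new_config_list.append(sub_config)
        else:
            new_config_list.append(config)
    return new_config_list

# e.g. one entry {'op_names': ['conv1', 'conv2'], 'sparsity_per_layer': 0.5}
# becomes two entries, each with a single op name and 'total_sparsity': 0.5.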

@J-shang marked this pull request as ready for review August 18, 2021 03:23
sub_config['op_names'] = [op_name]
sub_config['total_sparsity'] = sparsity_per_layer
new_config_list.append(sub_config)
elif 'max_sparsity_per_layer' in config and isinstance(config['max_sparsity_per_layer'], float):
@xiaowu0162 (Contributor) commented Aug 20, 2021

Why do we need `isinstance(config['max_sparsity_per_layer'], float)` here?

@J-shang (Contributor, Author) replied:

If it is a float, convert it to a dict; otherwise skip it.

@xiaowu0162 (Contributor) replied:

got it
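
A hypothetical illustration of the conversion being discussed (toy op names; the exact NNI code may differ): a float-valued `max_sparsity_per_layer` is broadcast into a per-op dict, while a value that is already a dict is skipped.

config = {'op_names': ['conv1', 'conv2'], 'max_sparsity_per_layer': 0.9}

if isinstance(config['max_sparsity_per_layer'], float):
    # Broadcast the single float cap to every op named in this config.
    cap = config['max_sparsity_per_layer']
    config['max_sparsity_per_layer'] = {op: cap for op in config['op_names']}

assert config['max_sparsity_per_layer'] == {'conv1': 0.9, 'conv2': 0.9}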

new_schema = And(old_schema, lambda n: validate_op_names(model, n, logger))
sub_schema[k] = new_schema

sub_schema = And(data_schema[0], lambda d: validate_op_types_op_names(d))
@xiaowu0162 (Contributor) commented Aug 20, 2021

I think this line might not modify the `sub_schema` object inside the list `data_schema`. Is that the expected behavior? I'm not sure, because I haven't worked much with Schema.

@J-shang (Contributor, Author) replied:

Yes, it's a bug; fixed it.
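
A minimal sketch of the Python semantics behind the reviewer's concern, using toy values rather than real Schema objects: rebinding the local name `sub_schema` does not change the element stored in `data_schema`, so the wrapped schema has to be written back into the list.

data_schema = [{'op_names': str}]

sub_schema = data_schema[0]
sub_schema = ('wrapped', sub_schema)        # rebinds the local name only
assert data_schema[0] == {'op_names': str}  # the list element is unchanged

data_schema[0] = ('wrapped', data_schema[0])  # the fix: assign back by index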

self.config_list = config_list_canonical(model, config_list)

def _validate_config_before_canonical(self, model: Module, config_list: List[Dict]):
    pass
A contributor commented:

What is this function used for?

@J-shang (Contributor, Author) replied:

This validates pruner-specific config logic for the sub-pruner; it is called from `validate_config()`.
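
A hedged sketch of the hook pattern described in this reply (the class names and the sparsity check are illustrative; only `_validate_config_before_canonical` and `validate_config` come from the PR): the base pruner calls the hook from `validate_config()`, and each sub-pruner overrides it with its own checks.

from typing import Dict, List

class BasePruner:
    def _validate_config_before_canonical(self, model, config_list: List[Dict]):
        # Default hook: no pruner-specific validation.
        pass

    def validate_config(self, model, config_list: List[Dict]):
        self._validate_config_before_canonical(model, config_list)
        # ... schema validation and canonicalization would follow here.

class ExampleSubPruner(BasePruner):
    def _validate_config_before_canonical(self, model, config_list: List[Dict]):
        # Pruner-specific logic, e.g. require a sparsity key in every config.
        for config in config_list:
            assert 'total_sparsity' in config, 'each config needs total_sparsity'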

@@ -120,7 +120,7 @@ def calculate_metrics(self, data: Dict[str, Tensor]) -> Dict[str, Tensor]:

metric = torch.ones(*reorder_tensor.size()[:len(keeped_dim)], device=reorder_tensor.device)
across_dim = list(range(len(keeped_dim), len(reorder_dim)))
- idxs = metric.nonzero()
+ idxs = metric.nonzero(as_tuple=False)
@QuanluZhang (Contributor) commented Aug 21, 2021

The default value of `as_tuple` is False, so why add it?

@J-shang (Contributor, Author) replied:

Because if `as_tuple` is not set explicitly, PyTorch emits a warning asking for an `as_tuple` value.
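
For context, the PyTorch behavior of that era (roughly 1.5 through 1.7; the exact version range is an assumption): calling `nonzero()` with no arguments emitted a UserWarning recommending the `as_tuple` overloads, so passing `as_tuple=False` keeps the old return type while silencing the warning.

import torch

metric = torch.ones(2, 3)
# Explicit as_tuple=False returns an (N, ndim) LongTensor of indices,
# same as the old default, but without triggering the UserWarning.
idxs = metric.nonzero(as_tuple=False)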

@J-shang closed this Aug 24, 2021
@J-shang reopened this Aug 24, 2021
@zheng-ningxin (Contributor) left a comment:

Looks good to go.

@J-shang merged commit 2b9f5f8 into microsoft:master Aug 26, 2021
@J-shang deleted the compression_v2_config branch September 13, 2021 08:27