
Prettify the export format of NAS trainer #2389

Merged: 8 commits merged into microsoft:master on May 11, 2020

Conversation

ultmaster (Contributor)

Pending on #2386, as this PR uses APIs introduced there. I developed on another branch that merged #2386 and cherry-picked the changes here.

Note that this PR is backward-compatible: old checkpoints will still be valid.

This PR prettifies the export format of the NAS trainer (point 2 of #2316). Here are three examples of the new export format (a small usage sketch follows them):

P-DARTS (first):

{
  "normal_n2_p0": "maxpool",
  "normal_n2_p1": "maxpool",
  "normal_n2_switch": [
    "normal_n2_p0",
    "normal_n2_p1"
  ],
  "normal_n3_p0": [],
  "normal_n3_p1": "maxpool",
  "normal_n3_p2": "maxpool",
  "normal_n3_switch": [
    "normal_n3_p1",
    "normal_n3_p2"
  ],
  "reduce_n2_p0": "maxpool",
  "reduce_n2_p1": "dilconv5x5",
  "reduce_n2_switch": [
    "reduce_n2_p0",
    "reduce_n2_p1"
  ],
  "reduce_n3_p0": "sepconv5x5",
  "reduce_n3_p1": "maxpool",
  "reduce_n3_p2": [],
  "reduce_n3_switch": [
    "reduce_n3_p0",
    "reduce_n3_p1"
  ]
}

ENAS macro:

{
  "InputChoice11": [
    "layer_2",
    "layer_3"
  ],
  "InputChoice13": "layer_4",
  "InputChoice15": [
    "layer_2",
    "layer_3"
  ],
  "InputChoice17": [
    "layer_2",
    "layer_4"
  ],
  "InputChoice19": [
    "layer_0",
    "layer_2",
    "layer_8"
  ],
  "InputChoice21": [
    "layer_0",
    "layer_1",
    "layer_7"
  ],
  "InputChoice23": [
    "layer_4",
    "layer_5",
    "layer_9"
  ],
  "InputChoice3": "layer_0",
  "InputChoice5": "layer_0",
  "InputChoice7": [],
  "InputChoice9": "layer_0",
  "LayerChoice1": 2,
  "LayerChoice10": 3,
  "LayerChoice12": 2,
  "LayerChoice14": 2,
  "LayerChoice16": 0,
  "LayerChoice18": 1,
  "LayerChoice2": 4,
  "LayerChoice20": 4,
  "LayerChoice22": 4,
  "LayerChoice4": 4,
  "LayerChoice6": 3,
  "LayerChoice8": 1
}

ENAS micro:

{
  "normal_node_0_x_input": 0,
  "normal_node_0_x_op": 3,
  "normal_node_0_y_input": 1,
  "normal_node_0_y_op": 4,
  "normal_node_1_x_input": 2,
  "normal_node_1_x_op": 0,
  "normal_node_1_y_input": 2,
  "normal_node_1_y_op": 0,
  "normal_node_2_x_input": 3,
  "normal_node_2_x_op": 2,
  "normal_node_2_y_input": 2,
  "normal_node_2_y_op": 0,
  "normal_node_3_x_input": 4,
  "normal_node_3_x_op": 2,
  "normal_node_3_y_input": 0,
  "normal_node_3_y_op": 1,
  "normal_node_4_x_input": 5,
  "normal_node_4_x_op": 2,
  "normal_node_4_y_input": 1,
  "normal_node_4_y_op": 1,
  "reduce_node_0_x_input": 0,
  "reduce_node_0_x_op": 1,
  "reduce_node_0_y_input": 1,
  "reduce_node_0_y_op": 1,
  "reduce_node_1_x_input": 0,
  "reduce_node_1_x_op": 1,
  "reduce_node_1_y_input": 0,
  "reduce_node_1_y_op": 0,
  "reduce_node_2_x_input": 0,
  "reduce_node_2_x_op": 4,
  "reduce_node_2_y_input": 1,
  "reduce_node_2_y_op": 0,
  "reduce_node_3_x_input": 4,
  "reduce_node_3_x_op": 0,
  "reduce_node_3_y_input": 1,
  "reduce_node_3_y_op": 4,
  "reduce_node_4_x_input": 4,
  "reduce_node_4_x_op": 2,
  "reduce_node_4_y_input": 0,
  "reduce_node_4_y_op": 1
}
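These exported files are plain JSON; after search they are dumped by the trainer and later fed back to fix the search space for retraining. Below is a minimal sketch of the consuming side, assuming the nni.nas.pytorch.fixed.apply_fixed_architecture helper from NNI 1.x (exact signature may differ); retrain_from_export is a hypothetical wrapper for illustration.

import json

from nni.nas.pytorch.fixed import apply_fixed_architecture


def retrain_from_export(model, arch_path="final_architecture.json"):
    # Print the human-readable decisions for a quick sanity check.
    with open(arch_path) as f:
        print(json.dumps(json.load(f), indent=2, sort_keys=True))
    # Fix every LayerChoice/InputChoice in ``model`` to the exported decision.
    apply_fixed_architecture(model, arch_path)
    return model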

SparkSnail mentioned this pull request on May 6, 2020.
@QuanluZhang (Contributor)

please fix pylint error

@QuanluZhang (Contributor)

please update doc accordingly

"LayerChoice1": [false, true, false, false],
"InputChoice2": [true, true, false]
"LayerChoice1": "conv5x5",
"InputChoice2": [1, 2],

what is the meaning of 1, 2? the index?


so it could be either index or name? when it is index and when it is name?
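For context, the examples in the PR description seem to follow one rule: the value is the candidate's name whenever the search space provides names (operator names such as "maxpool" for LayerChoice, choose_from labels such as "layer_2" for InputChoice), and an integer index otherwise (as in the ENAS examples). A tiny sketch of that rule, offered as an inference rather than the PR's actual logic; to_human_readable is a hypothetical helper.

def to_human_readable(chosen_index, candidate_names=None):
    # Export the candidate's name when one is available, e.g. "maxpool" or
    # "layer_2"; otherwise fall back to the bare index, as in the ENAS
    # examples above. Hypothetical helper, not the PR's implementation.
    if candidate_names:
        return candidate_names[chosen_index]
    return chosen_index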

@@ -33,6 +32,33 @@ def __init__(self, model, fixed_arc, strict=True):
raise RuntimeError("Unexpected keys found in fixed architecture: {}.".format(fixed_arc_keys - mutable_keys))
if mutable_keys - fixed_arc_keys:
raise RuntimeError("Missing keys in fixed architecture: {}.".format(mutable_keys - fixed_arc_keys))
self._fixed_arc = self._convert_human_readable_architecture(self._fixed_arc)

does this function convert to or from the human-readable architecture?

for mutable in self.mutables:
if mutable.key not in result_arc:
continue # skip silently
choice_arr = result_arc[mutable.key]

what is the meaning of "arr"?

@@ -33,6 +32,33 @@ def __init__(self, model, fixed_arc, strict=True):
raise RuntimeError("Unexpected keys found in fixed architecture: {}.".format(fixed_arc_keys - mutable_keys))
if mutable_keys - fixed_arc_keys:
raise RuntimeError("Missing keys in fixed architecture: {}.".format(mutable_keys - fixed_arc_keys))
self._fixed_arc = self._convert_human_readable_architecture(self._fixed_arc)

def _convert_human_readable_architecture(self, human_arc):

please provide a docstring for this function, even though it is private.
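For illustration, a standalone sketch of what such a conversion plus docstring might look like, reconstructed from the diff and the examples above; the function name, the candidates parameter, and the boolean-mask output format are assumptions, not the PR's actual code.

from typing import Dict, List, Union

import torch

Choice = Union[str, int, List[Union[str, int]]]


def convert_human_readable_architecture(human_arc: Dict[str, Choice],
                                         candidates: Dict[str, List[str]]) -> Dict[str, torch.Tensor]:
    """Convert a human-readable architecture (operator names, input labels, or
    integer indices, as in the exported JSON above) back into boolean
    selection masks keyed by mutable key.

    ``candidates`` maps each mutable key to its ordered candidate labels.
    """
    result = {}
    for key, labels in candidates.items():
        if key not in human_arc:
            continue  # skip silently, mirroring the loop shown in the diff
        chosen = human_arc[key]
        if not isinstance(chosen, list):
            chosen = [chosen]  # normalize a single choice to a list
        # A candidate is selected if either its label or its index was exported.
        mask = [label in chosen or i in chosen for i, label in enumerate(labels)]
        result[key] = torch.tensor(mask, dtype=torch.bool)
    return result

# e.g. "normal_n2_p0": "maxpool" with hypothetical candidates
# ["maxpool", "avgpool", "skipconnect"] becomes tensor([True, False, False]).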

@@ -185,17 +196,23 @@ def on_forward_input_choice(self, mutable, tensor_list):
mask = self._get_decision(mutable)
assert len(mask) == mutable.n_candidates, \
"Invalid mask, expected {} to be of length {}.".format(mask, mutable.n_candidates)
- out = self._select_with_mask(lambda x: x, [(t,) for t in tensor_list], mask)
+ out, mask = self._select_with_mask(lambda x: x, [(t,) for t in tensor_list], mask)
return self._tensor_reduction(mutable.reduction, out), mask

def _select_with_mask(self, map_fn, candidates, mask):

the new implementation of this function has complex logic; please add a docstring.
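For reference, a rough standalone sketch with a docstring of the behaviour the two branches appear to implement (boolean entries pick candidates, numeric entries weight them); this is an assumption from the visible diff, not the PR's actual method.

from typing import Callable, List, Sequence, Tuple, Union

import torch


def select_with_mask(map_fn: Callable[..., torch.Tensor],
                     candidates: Sequence[tuple],
                     mask: Union[Sequence[bool], Sequence[float]]) -> Tuple[List[torch.Tensor], torch.Tensor]:
    """Apply ``map_fn`` to the candidates picked by ``mask``.

    * boolean entries: keep only the candidates whose entry is True (hard choice);
    * numeric entries: weight every candidate's output by its entry (soft choice).
    Returns the selected outputs together with the mask as a tensor, so the
    caller can reduce the outputs and report the mask, as in the diff above.
    """
    if all(isinstance(m, bool) for m in mask):
        out = [map_fn(*cand) for cand, m in zip(candidates, mask) if m]
        return out, torch.tensor(mask, dtype=torch.bool)
    # Otherwise treat the entries as weights over all candidates.
    weights = torch.tensor(mask, dtype=torch.float)
    out = [w * map_fn(*cand) for w, cand in zip(weights, candidates)]
    return out, weights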

out = [map_fn(*cand) for cand, m in zip(candidates, mask) if m]
elif "FloatTensor" in mask.type():
elif (isinstance(mask, list) and len(mask) >= 1 and isinstance(mask[0], (float, int))) or \

can it be an int?

ultmaster merged commit bf7daa8 into microsoft:master on May 11, 2020.
SparkSnail mentioned this pull request on May 19, 2020.