
fix unused parameters #23

Open · wants to merge 1 commit into master
Conversation

siyuanliii

A minor fix for some unused parameters that cause problems in DDP training.
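For anyone who wants to see the symptom without a full multi-GPU setup: a parameter whose `.grad` is still `None` after a backward pass is exactly what `DistributedDataParallel` trips over (the quick workaround is `find_unused_parameters=True`, at the cost of extra overhead). A minimal sketch, assuming `dla.py` from this repo is importable and that the `dla34` factory can be called without a pretrained checkpoint:

    import torch
    from dla import dla34  # assumes dla.py from this repo is on the path

    model = dla34(pretrained=None)  # assumed signature; skips weight loading
    out = model(torch.randn(2, 3, 224, 224))
    out.sum().backward()

    # Parameters that never received a gradient are the ones DDP
    # complains about during its gradient-reduction step.
    unused = [name for name, p in model.named_parameters() if p.grad is None]
    print(unused)  # expect the `project.*` weights of the non-leaf Trees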

@janthmueller

Wow, I would have loved to see your commit a week earlier. Thanks! When I compared my rebuild with the original implementation via Netron and the named parameters, I also noticed the unnecessary projections on stages with a depth of one. Now the only things missing are removing the unnecessary max-pooling operations and adjusting the loading of the pretrained network.
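The named-parameter comparison described above is easy to script; a small helper along these lines (the names here are illustrative, not from the repo) prints whatever one build has that the other lacks:

    import torch.nn as nn

    def diff_param_names(a: nn.Module, b: nn.Module) -> None:
        """Print parameter names that exist in only one of the two modules."""
        names_a = {name for name, _ in a.named_parameters()}
        names_b = {name for name, _ in b.named_parameters()}
        print("only in a:", sorted(names_a - names_b))
        print("only in b:", sorted(names_b - names_a))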

@Dawn-bin

Hi guys, I have solved this problem. The bug is in this [part](https://github.com/ucbdrive/dla/blob/master/dla.py#L206):

    def forward(self, x, residual=None, children=None):
        children = [] if children is None else children
        bottom = self.downsample(x) if self.downsample else x
        # The `residual` passed in by a parent Tree is overwritten here,
        # so a parent's projection output is silently discarded.
        residual = self.project(bottom) if self.project else bottom
        if self.level_root:
            children.append(bottom)
        # When levels == 1, tree1 is a plain block whose forward consumes
        # `residual`; when levels > 1, tree1 is a nested Tree that re-enters
        # this method and discards it.
        x1 = self.tree1(x, residual)
        if self.levels == 1:
            x2 = self.tree2(x1)
            x = self.root(x2, x1, *children)
        else:
            children.append(x1)
            x = self.tree2(x1, children=children)
        return x

The `residual` argument is passed down from the outer layer via `self.tree1(x, residual)`, but a nested `Tree` never uses it: its own `forward` immediately overwrites `residual`, as shown above. The argument is only consumed when `self.tree1` is a leaf block, i.e. when `levels == 1`, so that is the only case that actually needs a projection layer. We can fix this bug [here](https://github.com/ucbdrive/dla/blob/master/dla.py#L199):

    if in_channels != out_channels and (levels in [1, ]):

@@ -196,7 +196,7 @@ def __init__(self, levels, block, in_channels, out_channels, stride=1,
         self.levels = levels
         if stride > 1:
             self.downsample = nn.MaxPool2d(stride, stride=stride)
-        if in_channels != out_channels:
+        if in_channels != out_channels and levels != 1:


Suggested change:

    if in_channels != out_channels and (levels in [1, ]):
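For context, this is roughly how the guarded construction would read with the suggested condition. The projection block is paraphrased from master's dla.py (where `BatchNorm` is a module-level alias for `nn.BatchNorm2d`), so treat it as a sketch rather than the exact patch:

    self.levels = levels
    if stride > 1:
        self.downsample = nn.MaxPool2d(stride, stride=stride)
    # Only leaf Trees (levels == 1) hand `residual` to a block that uses
    # it, so only they need the 1x1 projection; the non-leaf copies are
    # the unused parameters DDP complains about.
    if in_channels != out_channels and levels == 1:
        self.project = nn.Sequential(
            nn.Conv2d(in_channels, out_channels,
                      kernel_size=1, stride=1, bias=False),
            BatchNorm(out_channels))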
