
SSD #150

Closed
wants to merge 15 commits into from

Conversation

@yhcao6 (Collaborator) commented Dec 7, 2018

No description provided.

ceil_mode=True,
out_indices=(3, 4),
out_feature_indices=(22, 34),
l2_dim=512,
Member: This parameter can be inferred from the output channels of the previous layer, so it does not need to be configured separately.
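To illustrate the reviewer's point: l2_dim (512 here) duplicates information already implied by the backbone config, since it equals the channel count of the feature map that L2 normalization is applied to (conv4_3 in VGG16). A minimal sketch; `VGG16_STAGE_CHANNELS` and `infer_l2_dim` are illustrative names, not mmdetection API.

```python
# Output channels of each VGG16 stage, 0-indexed (illustrative constant).
VGG16_STAGE_CHANNELS = (64, 128, 256, 512, 512)

def infer_l2_dim(out_indices, stage_channels=VGG16_STAGE_CHANNELS):
    """L2Norm is applied to the first returned feature map, so its
    dimension is the channel count of that stage."""
    return stage_channels[out_indices[0]]

# With out_indices=(3, 4) as in the config above, no separate l2_dim
# key is needed:
l2_dim = infer_l2_dim(out_indices=(3, 4))  # 512
```
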

@@ -13,7 +13,8 @@ def anchor_target(anchor_list,
cfg,
gt_labels_list=None,
cls_out_channels=1,
-    sampling=True):
+    sampling=True,
+    need_unmap=True):
Member: Rename need_unmap to unmap.

# if use extra augmentation
if extra_aug is not None:
    self.extra_aug = ExtraAugmentation(
        img_norm_cfg.mean, img_norm_cfg.to_rgb, **extra_aug)
Member: Set mean, std and to_rgb in the config files so that we can initialize it with ExtraAugmentation(**extra_aug).
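A sketch of the reviewer's suggestion: put mean/std/to_rgb into the extra_aug dict in the config file, so the dataset can construct the augmentation with a single `ExtraAugmentation(**extra_aug)` call instead of forwarding fields from img_norm_cfg. The class body and config keys below are illustrative stand-ins, not the PR's actual code.

```python
class ExtraAugmentation:
    """Plain-Python stand-in for the PR's augmentation container."""

    def __init__(self, mean=(0, 0, 0), std=(1, 1, 1), to_rgb=True,
                 expand=None, random_crop=None):
        self.mean = tuple(mean)
        self.std = tuple(std)
        self.to_rgb = to_rgb
        self.expand = expand
        self.random_crop = random_crop

# Hypothetical config fragment carrying the normalization values itself:
extra_aug = dict(
    mean=(123.675, 116.28, 103.53),
    std=(1, 1, 1),
    to_rgb=True,
    expand=dict(min_ratio=1, max_ratio=4),
)

# Single-call construction, no extra positional arguments needed.
aug = ExtraAugmentation(**extra_aug)
```
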

# extra augmentation
if self.extra_aug is not None:
    img, gt_bboxes, gt_labels = self.extra_aug(
        img.astype(np.float32), gt_bboxes, gt_labels)
Member: Call img.astype(np.float32) inside the aug transforms.
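The suggestion above can be sketched as follows: each transform converts the image to float itself, so the dataset can drop the explicit astype call. `BrightnessShift` is a hypothetical plain-Python stand-in for a photometric transform; the real mmdetection transforms operate on numpy float32 arrays.

```python
class BrightnessShift:
    """Hypothetical transform: adds a fixed delta to every pixel."""

    def __init__(self, delta=16):
        self.delta = delta

    def __call__(self, img):
        # Cast inside the transform (stands in for img.astype(np.float32)),
        # so callers may pass integer pixel values directly.
        img = [float(v) for v in img]
        return [v + self.delta for v in img]

# Integer pixels in, float pixels out; no cast needed at the call site.
out = BrightnessShift(delta=8)([0, 10, 250])
```
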

-    test_mode=False):
+    test_mode=False,
+    extra_aug=None,
+    keep_ratio_rescale=True):
Member: Maybe resize_keep_ratio sounds better.

self.n_dims = n_dims
self.weight = nn.Parameter(torch.Tensor(self.n_dims))
self.eps = eps
constant_init(self, scale)
Member: It would be better to initialize the weights outside the init method.
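The pattern the reviewer is asking for can be sketched like this: `__init__` only allocates the parameter, and an explicit `init_weights()` hook (the convention the detector can call once after construction) performs the constant initialization. This `L2Norm` is a plain-Python stand-in for the PR's nn.Module, with `weight` as a list instead of an nn.Parameter.

```python
class L2Norm:
    """Stand-in for the PR's L2Norm layer; no torch dependency."""

    def __init__(self, n_dims, scale=20.0, eps=1e-10):
        self.n_dims = n_dims
        self.eps = eps
        self.scale = scale
        # Allocate only; no initialization in __init__.
        self.weight = [0.0] * n_dims

    def init_weights(self):
        # Mirrors constant_init(self, scale) in the real module.
        self.weight = [self.scale] * self.n_dims

layer = L2Norm(4, scale=10.0)
layer.init_weights()  # called explicitly, after construction
```
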

self.anchor_strides = anchor_strides
for k in range(len(anchor_strides)):
    base_size = min_sizes[k]
    s_k = base_size / 300
Member: Hard-coded number found: 300 is the SSD300 input resolution and should come from a config value instead.
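A minimal sketch of the fix: derive the relative anchor scale from a configurable input size rather than the literal 300, so the same code serves SSD300 and SSD512. `anchor_scales` and `input_size` are illustrative names, not the PR's actual API.

```python
def anchor_scales(min_sizes, input_size=300):
    """Relative scale s_k for each feature level: base size divided by
    the (configurable) network input resolution."""
    return [base_size / input_size for base_size in min_sizes]

# SSD300-style levels; with input_size configurable, switching to
# SSD512 only changes the argument, not the code.
scales_300 = anchor_scales([30, 60, 111], input_size=300)
scales_512 = anchor_scales([36, 77, 154], input_size=512)
```
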

if isinstance(m, nn.Conv2d):
    nn.init.xavier_uniform_(m.weight)
    if m.bias is not None:
        nn.init.constant_(m.bias, 0)
Member: Use xavier_init instead.


class Expand(object):

def __init__(self, mean=(104, 117, 123), min_ratio=1, max_ratio=4):
Member: Set the default mean to (0, 0, 0).

class ExtraAugmentation(object):

def __init__(self,
mean=(104, 117, 123),
Member: Same default-value issue here: use (0, 0, 0) for the mean.

@hellock (Member) commented Dec 9, 2018

#4

@yhcao6 yhcao6 closed this Dec 10, 2018
@yhcao6 yhcao6 deleted the ssd branch December 10, 2018 02:11
FANGAreNotGnu pushed a commit to FANGAreNotGnu/mmdetection that referenced this pull request Oct 23, 2023
* unzip return foldername or a tuple of foldernames