a small suggestion to SegResNet #3331
function2-llx started this conversation in General
Replies: 2 comments 1 reply
-
Okay, I would say that it's enough for SegResNet to call the
-
BTW, it seems that the implementation of `forward`:

```python
def forward(self, x):
    net_input = x
    x = self.convInit(x)
    if self.dropout_prob is not None:
        x = self.dropout(x)
    down_x = []
    for down in self.down_layers:
        x = down(x)
        down_x.append(x)
    down_x.reverse()
    vae_input = x
    for i, (up, upl) in enumerate(zip(self.up_samples, self.up_layers)):
        x = up(x) + down_x[i + 1]
        x = upl(x)
    if self.use_conv_final:
        x = self.conv_final(x)
    if self.training:
        vae_loss = self._get_vae_loss(net_input, vae_input)
        return x, vae_loss
    return x, None
```

could be rewritten with something like:

```python
def forward(self, x):
    net_input = x
    x, down_x = self.encode(x)
    down_x.reverse()
    vae_input = x
    x = self.decode(x, down_x)
    if self.training:
        vae_loss = self._get_vae_loss(net_input, vae_input)
        return x, vae_loss
    return x, None
```
-
Hi there,
If I'm understanding correctly, SegResNet is a ResNet encoder plus a U-Net decoder, so how about making SegResNet return the result of the encoder in `forward`? This would be useful when running classification and segmentation at the same time. The same goes for SegResNetVAE. Hope to hear anyone's opinions on this!
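One way this could look, sketched with a hypothetical `return_encoded` flag and toy stand-in layers rather than the real MONAI modules (the flag name and the lambda "layers" are illustrative assumptions, not MONAI's API):

```python
# Hypothetical sketch: an opt-in flag so forward can also return the
# encoder output, e.g. to feed a classification head alongside the
# segmentation head. Integer "layers" keep the example dependency-free.
class ToyEncoderDecoder:
    def __init__(self):
        self.down_layers = [lambda x: x * 2, lambda x: x * 2]  # encoder stand-ins
        self.up_layers = [lambda x: x + 1]                     # decoder stand-in

    def forward(self, x, return_encoded=False):
        for down in self.down_layers:
            x = down(x)
        encoded = x  # bottleneck features, reusable for classification
        for up in self.up_layers:
            x = up(x)
        if return_encoded:
            return x, encoded
        return x
```

Defaulting the flag to `False` keeps the existing single-output call sites working unchanged.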