In the paper you explicitly mention that on the upsampling path we shouldn't concatenate a dense block's input and output. However, this is exactly what happens in the last dense block because of the way the code is written:
#######################
# Upsampling path #
#######################
for i in range(n_pool):
    # Transition Up (Upsampling + concatenation with the skip connection)
    n_filters_keep = growth_rate * n_layers_per_block[n_pool + i]
    print('n_filters_keep', n_filters_keep)
    stack = TransitionUp(skip_connection_list[i], block_to_upsample, n_filters_keep)

    # Dense Block
    block_to_upsample = []
    for j in range(n_layers_per_block[n_pool + i + 1]):
        l = BN_ReLU_Conv(stack, growth_rate, dropout_p=dropout_p)
        block_to_upsample.append(l)
        stack = ConcatLayer([stack, l])
Those loops work like this: while there are skip connections remaining, the last line stack = ConcatLayer([stack, l]) has no lasting effect, because stack is discarded after the final iteration of the inner loop and reassigned on the next outer iteration to the concatenation of block_to_upsample and the skip connection. On the last outer iteration, however, stack is preserved, so it contains not only the output of the dense block but also its input, and it is used as the input to the Softmax. The number of feature maps then matches the paper: 256 (4*16 = 64 from the dense block plus 192 from the last TU block).
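For concreteness, here is a small channel-count walk-through that reproduces those numbers. It is just a sketch assuming the FC-DenseNet103 configuration from the paper (48 feature maps from the first convolution, growth rate 16, layers per block 4, 5, 7, 10, 12, 15, 12, 10, 7, 5, 4), mirroring the loops above:

growth_rate = 16
n_layers_per_block = [4, 5, 7, 10, 12, 15, 12, 10, 7, 5, 4]
n_pool = 5

# Downsampling path: each dense block concatenates its input, so the skip
# connections grow by growth_rate * n_layers at every resolution.
n_filters = 48
skip_connection_channels = []
for i in range(n_pool):
    n_filters += growth_rate * n_layers_per_block[i]
    skip_connection_channels.append(n_filters)             # 112, 192, 304, 464, 656
skip_connection_channels = skip_connection_channels[::-1]  # the code reverses this list

# Upsampling path: after each TU, stack = upsampled maps (n_filters_keep) + skip
# connection; the dense block then adds growth_rate * n_layers new maps on top.
for i in range(n_pool):
    n_filters_keep = growth_rate * n_layers_per_block[n_pool + i]
    stack_channels = skip_connection_channels[i] + n_filters_keep
    new_channels = growth_rate * n_layers_per_block[n_pool + i + 1]
    print(stack_channels, new_channels)   # last line: 192 64, so the Softmax sees 192 + 64 = 256 maps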
There is nothing wrong with it; I'm just not sure whether this is intended behavior (and a clever programming trick) or an overlooked detail that happens to give good results. Either way, it would be worth updating the paper (or the code) with a remark about how it is supposed to work. Thank you!
Hi, yes, you're totally right: we concatenate the input for the last block because we want to take all the information for the output. However, not concatenating leads to very similar results, and it's easy to convince yourself why (e.g. imagine the last convolution just copies the previous slice of feature maps, which does take all the previous blocks into account). Thank you for your remark!
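To make the two options concrete, here is a small self-contained shape check, a sketch that uses Lasagne InputLayers as stand-ins for the real layers (the 224x224 spatial size is arbitrary), comparing the Softmax input used in the released code with the variant described in the paper:

from lasagne.layers import InputLayer, ConcatLayer, get_output_shape

# Stand-in for the input of the last dense block: the last TU output
# concatenated with the skip connection (192 maps).
stack = InputLayer((None, 192, 224, 224))
block_to_upsample = []
for j in range(4):                          # 4 layers in the last dense block
    l = InputLayer((None, 16, 224, 224))    # stands in for BN_ReLU_Conv(stack, growth_rate)
    block_to_upsample.append(l)
    stack = ConcatLayer([stack, l])

print(get_output_shape(stack))                           # (None, 256, 224, 224): released code
print(get_output_shape(ConcatLayer(block_to_upsample)))  # (None, 64, 224, 224): paper's description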