I have implemented this non-local attention block as shown in the code below, but when I use it inside a network the batch size is always None, so using it for matrix multiplication and reshaping raises an error.
```python
import tensorflow as tf

def Nonlocalblock(x):
    # Static shape: batch_size comes back as None when the graph is
    # built with an unspecified batch dimension.
    batch_size, height, width, in_channels = x.get_shape().as_list()
    print("height", height)
    print("width", width)
    print("in_channels", in_channels)
    print("shape", x.get_shape())
```
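The usual way around this is to read only the spatial and channel dimensions from the static shape and let TensorFlow infer the batch dimension, either with `-1` in `tf.reshape` or with the dynamic `tf.shape(x)[0]`. Below is a minimal sketch of an embedded-Gaussian non-local block written that way, assuming a TF 2.x / Keras setting; the names `nonlocal_block`, `reduction`, and `inter_channels` are illustrative, not from the original code.

```python
import tensorflow as tf

def nonlocal_block(x, reduction=2):
    """Sketch of a non-local (self-attention) block that tolerates a
    None batch dimension by never using batch_size as a Python int."""
    # Static dims: height, width, channels are known at build time.
    _, height, width, in_channels = x.get_shape().as_list()
    inter_channels = max(in_channels // reduction, 1)  # assumed reduction

    # theta, phi, g embeddings via 1x1 convolutions.
    theta = tf.keras.layers.Conv2D(inter_channels, 1)(x)
    phi = tf.keras.layers.Conv2D(inter_channels, 1)(x)
    g = tf.keras.layers.Conv2D(inter_channels, 1)(x)

    # Flatten spatial dims; -1 lets TensorFlow infer the dynamic batch size.
    theta = tf.reshape(theta, [-1, height * width, inter_channels])
    phi = tf.reshape(phi, [-1, height * width, inter_channels])
    g = tf.reshape(g, [-1, height * width, inter_channels])

    # Pairwise affinities and attention weights: shape (B, HW, HW).
    attention = tf.nn.softmax(tf.matmul(theta, phi, transpose_b=True))

    # Attention-weighted sum of values, restored to the spatial layout.
    y = tf.matmul(attention, g)
    y = tf.reshape(y, [-1, height, width, inter_channels])

    # Project back to in_channels and add the residual connection.
    y = tf.keras.layers.Conv2D(in_channels, 1)(y)
    return x + y
```

`tf.matmul` batches over the leading dimension whether it is static or dynamic, so only the reshapes need care; if you do need the batch size explicitly (for example in a custom broadcast), `tf.shape(x)[0]` gives it as a tensor at run time.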