
Added global_pooling to set the kernel size equal to the bottom size #1214

Merged
merged 1 commit into from
Oct 3, 2014

Conversation

sguada
Copy link
Contributor

@sguada sguada commented Oct 3, 2014

This allows global pooling over the full height × width of the bottom blob: kernel_size is omitted and is set equal to the bottom's spatial dimensions.

Given that #594 allows on-the-fly reshaping of blobs, this resizes the kernel_size according to the size of the bottom.
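To make the semantics concrete, here is a minimal standalone sketch of what global pooling computes (illustrative NumPy only, not the actual Caffe C++ layer; the function name `global_pool` is hypothetical):

```python
import numpy as np

def global_pool(bottom, op="max"):
    # global_pooling: the kernel covers the full height x width of the bottom,
    # so each (num, channel) feature map reduces to a single 1x1 output.
    n, c, h, w = bottom.shape
    flat = bottom.reshape(n, c, h * w)
    reduced = flat.max(axis=2) if op == "max" else flat.mean(axis=2)
    return reduced.reshape(n, c, 1, 1)

x = np.arange(2 * 3 * 4 * 5, dtype=np.float32).reshape(2, 3, 4, 5)
y = global_pool(x)
print(y.shape)  # (2, 3, 1, 1)
```

Because the kernel is derived from the bottom at reshape time, the same layer definition works for any input resolution.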

@Yangqing
Copy link
Member

Yangqing commented Oct 3, 2014

If it is not too much trouble - maybe we should extend it a little bit so that it not only does global pooling, but "spatial pooling" of various input sizes? For example, we can do 2x2 spatial pooling so that the output would be 2x2 regardless of the input shape.
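The suggested generalization can be sketched as follows: a fixed output grid whose bin boundaries adapt to the input size, in the spirit of SPP (a hypothetical NumPy illustration, not proposed layer code; `spatial_pool` and its floor/ceil bin boundaries are assumptions):

```python
import numpy as np

def spatial_pool(bottom, out_h=2, out_w=2):
    # "Spatial pooling": fixed out_h x out_w output regardless of input size.
    # Bin boundaries use floor for the start and ceil for the end of each bin.
    n, c, h, w = bottom.shape
    out = np.empty((n, c, out_h, out_w), dtype=bottom.dtype)
    for i in range(out_h):
        h0, h1 = (i * h) // out_h, -((-(i + 1) * h) // out_h)  # floor, ceil
        for j in range(out_w):
            w0, w1 = (j * w) // out_w, -((-(j + 1) * w) // out_w)
            out[:, :, i, j] = bottom[:, :, h0:h1, w0:w1].max(axis=(2, 3))
    return out

x = np.random.rand(1, 3, 5, 7)
print(spatial_pool(x).shape)  # (1, 3, 2, 2) regardless of the 5x7 input
```

Global pooling is then just the `out_h = out_w = 1` special case.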

@shelhamer
Copy link
Member

@Yangqing and anyone interested in binned / spatial pyramid pooling please see the discussion in #1210 starting with https://github.com//issues/1210#issuecomment-57723907.

I think it is best to start with this special case that's useful now, then come up with a plan for full-blown SPP, since everything else is just a case of that. We should also check whether the SPP authors will release their own Caffe layer code and include SPPnet in the model zoo, since there is plenty of interest.

@sguada
Copy link
Contributor Author

sguada commented Oct 3, 2014

@Yangqing we could do that, but based on the discussion at #1210 we decided to leave that for later.
There could definitely be a binning parameter that defines the number of bins independently of the size of the bottom, although there could be some rounding issues.

@Yangqing
Copy link
Member

Yangqing commented Oct 3, 2014

Sure, that sounds good :)

@sguada
Copy link
Contributor Author

sguada commented Oct 3, 2014

I think it would be better to create binned pooling first and then the multi-bin plus concat needed by SPP, instead of having it as a special case of SPP.
But let's merge this PR first and then follow the conversation at #1210.

Sergio


@shelhamer
Copy link
Member

@sguada this should work in most cases, but to be exactly right, shouldn't the kernel size take padding into account too?

@sguada
Copy link
Contributor Author

sguada commented Oct 3, 2014

@shelhamer I wasn't sure what to do with padding; I see three options:

  • Don't allow padding when doing global_pooling (since adding zeros doesn't add much)
  • Allow padding and add it to the kernel_size so the output is still 1x1
  • Allow padding but don't add it to kernel_size, so the output size will depend on the padding and stride (current behavior)

I don't mind which one we take.
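The three options can be compared with Caffe's pooled-size formula (ceil mode); a quick sketch, assuming the usual `ceil((h + 2*pad - kernel) / stride) + 1` output size, with illustrative numbers:

```python
import math

def pooled_size(h, kernel, pad=0, stride=1):
    # Caffe-style pooling output size (ceil mode).
    return int(math.ceil((h + 2 * pad - kernel) / float(stride))) + 1

h = 6
print(pooled_size(h, kernel=h))             # option 1: no padding       -> 1
print(pooled_size(h, kernel=h + 2, pad=1))  # option 2: kernel grows     -> 1
print(pooled_size(h, kernel=h, pad=1))      # option 3: pad, kernel = h  -> 3
```

Only options 1 and 2 keep the output at 1x1; option 3 lets padding leak into extra output elements.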

@shelhamer
Copy link
Member

Don't allow padding when doing global_pooling (since adding zeros don't add much)

Let's disallow padding and stride != 1 when global pooling: there's no point in padding or stride when the purpose is to cover the whole input in a single pool.

Please add checks for padding and stride then merge -- this looks good to me.
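The requested checks amount to rejecting pad and stride when global pooling is on; a hypothetical Python sketch of that validation logic (the actual layer would use Caffe's CHECK macros in C++, and `check_global_pooling_params` is an assumed name):

```python
def check_global_pooling_params(global_pooling, pad, stride):
    # With global_pooling, padding and a stride other than 1 are meaningless:
    # a single kernel already covers the whole input, so reject them outright.
    if global_pooling:
        if pad != 0:
            raise ValueError("With global_pooling: true, pad must be 0")
        if stride != 1:
            raise ValueError("With global_pooling: true, stride must be 1")

check_global_pooling_params(global_pooling=True, pad=0, stride=1)  # passes
```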

Added check for padding and stride with global_pooling
shelhamer added a commit that referenced this pull request Oct 3, 2014
Add global_pooling to cover the whole input by setting kernel size to bottom size
@shelhamer shelhamer merged commit 576931e into BVLC:dev Oct 3, 2014
@shelhamer
Copy link
Member

Thanks Sergio!

@sguada sguada deleted the global_pooling branch October 3, 2014 23:25
mitmul pushed a commit to mitmul/caffe that referenced this pull request Oct 11, 2014
Add global_pooling to cover the whole input by setting kernel size to bottom size
RazvanRanca pushed a commit to RazvanRanca/caffe that referenced this pull request Nov 4, 2014
Add global_pooling to cover the whole input by setting kernel size to bottom size
@shelhamer shelhamer mentioned this pull request Apr 15, 2015
@Coderx7
Copy link
Contributor

Coderx7 commented Sep 6, 2016

What's the use of global pooling? I couldn't find any meaningful comments in the source code!
