Rebase @ssafar's ReshapeLayer #2217
Conversation
+1 for this PR. I'm running it and it's working as advertised in my case.
@jyegerlehner Could you clarify? I just pulled master, which now includes the merge of #2295, but this patch is rebased on top of many other patches. How can I apply it to my local project?
Hmm, I think all I did was:
@jyegerlehner Thanks for your reply. I'm confused about what Jeff did, not about the git workflow. @ssafar's repo hasn't been maintained for six months, and Jeff says his implementation is a rebase of @ssafar's #1263. So here is my question: if I check out #2217, is jeffdonahue's ssafar-reshape-rebase branch enough?
Oh, sorry. I have a talent for misunderstanding people.
I tried it, and caffe still built and passed tests. And then I added a layer with type "Reshape" to my prototxt and it solved my problem. I just set the reshape_param according to the comments for ReshapeParameter in caffe.proto. They look like this after the pull:
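The original snippet did not survive here; a representative Reshape layer definition of the kind described (with hypothetical layer and blob names) would look something like:

```protobuf
# Illustrative sketch: flatten the trailing axes of an (N, C, H, W) blob
# into (N, C*H*W). Layer and blob names are made up for this example.
layer {
  name: "reshape"
  type: "Reshape"
  bottom: "conv_out"
  top: "reshaped"
  reshape_param {
    # dim: 0 copies the input's first dimension (N);
    # dim: -1 infers the remaining dimension (C*H*W).
    shape { dim: 0 dim: -1 }
  }
}
```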
Pleasantly explicit, I would say.
/// @brief the index of the axis whose dimension we infer, or -1 if none
int inferred_axis_;
/// @brief the product of the "constant" output dimensions
int64_t constant_count_;
TODO: change `int64_t` to `int` and merge (after discussion with @shelhamer and @longjon)
LGTM. @jeffdonahue will swap out the `int64_t`.
Force-pushed from 67266d0 to 21032b2
Rebase @ssafar's ReshapeLayer
This is a rebase of #1263 by @ssafar and replaces my simpler version in #2088. Thanks to @sguada's original suggestions and @ssafar's implementation in #1263, it supports all the fancy `dim: 0` (for copying input dimensions) and `dim: -1` (to infer a dimension) logic. I also added the fields `axis` and `num_axes` to specify reshaping only a portion of the input blob (equivalent to setting `dim: 0` for the remaining axes). E.g., `reshape_param { shape { dim: 1 } axis: 0 num_axes: 0 }` would simply insert a singleton axis at the beginning of the blob.
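The `dim: 0` / `dim: -1` / `axis` / `num_axes` semantics described above can be sketched in Python. This is a simplified model of the shape computation for illustration, not Caffe's actual C++ implementation, and it omits the error checking the real layer performs:

```python
def infer_reshape(bottom_shape, shape_dims, axis=0, num_axes=-1):
    """Compute an output shape from an input shape and reshape_param-style
    dims. dim == 0 copies the corresponding input dimension; dim == -1 is
    inferred so the total element count is preserved. axis/num_axes select
    which input axes are replaced (num_axes == -1 means through the last
    axis; num_axes == 0 replaces nothing, i.e. inserts new axes)."""
    end = len(bottom_shape) if num_axes == -1 else axis + num_axes
    head = bottom_shape[:axis]          # axes kept before the reshaped span
    mid = bottom_shape[axis:end]        # axes being replaced
    tail = bottom_shape[end:]           # axes kept after the reshaped span

    out = []
    inferred = None
    for i, d in enumerate(shape_dims):
        if d == 0:
            out.append(mid[i])          # copy the input dimension
        elif d == -1:
            inferred = len(out)         # remember position, fill in below
            out.append(1)
        else:
            out.append(d)

    if inferred is not None:
        known = 1
        for d in out:
            known *= d
        total = 1
        for d in mid:
            total *= d
        out[inferred] = total // known  # make counts match

    return head + out + tail
```

For example, `shape { dim: 0 dim: -1 }` flattens a `(2, 3, 4)` blob to `(2, 12)`, and the singleton-insertion example from the description above maps `(2, 8)` to `(1, 2, 8)`.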