Layer type is a string #1694
Conversation
Fine by me.
I like `layer` and `type` too.
Sergio
Hooray!
Really?
After some discussion with @longjon I've concluded that I too think case-insensitive type strings would be overly user-friendly. Sorry for that brief lapse in judgment!
Force-pushed from 39bfcf1 to d5839a9.
I also plan to clean up all the numbers so that everything is in a nice order (not that it will stay that way, but I can at least make it start that way).
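To illustrate what a cleanup like that could look like, here is a minimal sketch of a renumbered message, assuming caffe.proto-style definitions; the exact fields and numbers are illustrative, not quoted from this PR:

```protobuf
// Hypothetical excerpt: fields renumbered to read in declaration order.
// Names and numbers are an assumption for illustration, not the merged schema.
message LayerParameter {
  optional string name = 1;    // the layer name
  optional string type = 2;    // the layer type, now a string instead of an enum
  repeated string bottom = 3;  // names of the input (bottom) blobs
  repeated string top = 4;     // names of the output (top) blobs
}
```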
To me these are less meaningful than the current ones.
Re: naming:
Thanks for the well-thought-out suggestions @longjon. I had a thought on another option that would make the param/blob prefix redundant:
I think having a separate …
Force-pushed from 07aa37e to 958e372.
Other than needing to run the updated automagic upgrade tool (`upgrade_net_proto_text`), …
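For context, that tool takes an input path and an output path; a typical invocation would look something like `build/tools/upgrade_net_proto_text old_net.prototxt upgraded_net.prototxt`, where the `build/tools/` location assumes a standard out-of-tree build and the filenames are placeholders.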
Force-pushed from 06e8130 to f719850.
Took a pass, just the one issue noted. The upgrade code is nasty, but maybe it has to be that way, and it's not forever...
Thanks for looking over this @longjon!
Yeah, I realized this was kind of weird after doing it... But I rely on it for the …
Merged after discussion with @longjon and @shelhamer.
* master: (21 commits)
  Update docs for ND blobs (BVLC#1970) and layer type is a string (BVLC#1694)
  Add ReshapeParameter axis and num_axes to reshape only a particular span of the input shape
  basic tests (Forward, Gradient) for ReshapeLayer
  ReshapeLayer fixups for ND blobs
  Added a Reshape layer for copying-free modification of blob dimensions.
  Spatial Pyramid Pooling Layer
  remove bogus implementation of SigmoidCrossEntropyLossLayer::Forward_gpu
  remove superfluous empty destructors
  [pycaffe] use bp::object instead of PyObject* for self in Python layer
  python: PEP8; changed docstring documentation style to NumPyDoc style
  This imports the wrong io module in Python 3.
  check that count_ does not overflow in Blob::Reshape
  Modify for better readability regarding temporary buffer for backward computation
  Fix redundancy of parameter backward computation
  Added support for original implementation, using (margin - d^2), through the legacy_version parameter.
  added epsilon to prevent possible division by zero in gradient calculation
  Fixed contrastive loss layer to be the same as proposed in Hadsell et al 2006
  remove spurious net.hpp includes
  always call Layer::Reshape in Layer::Forward
  Increment iter_ before snapshotting, remove +1 logic -- fixes final snapshot being off by one
  ...
addresses #1685 (thanks for the reminder @shelhamer)
Mostly works, but I still need to implement automagic upgrading of net prototxts before this is merged. Which is actually kind of hard, because `type` is currently an `enum`, so the parse of an old prototxt will fail due to the lack of quotes around the enum value (which was why I'd suggested using a different name like `class`, but I didn't particularly like the name `class` either). One possible path that I think would probably work is to change the name of the `layers` field in `NetParameter`, but keep that field to mark the use of a "V1" prototxt (with this new one being "V2"). I've wanted to do this because it kind of annoys me that it's plural -- `layers` -- so if others agree this could be a good excuse to change the name to `layer`.

To do (a V1/V2 sketch follows the list):

- `layers` field name to `layer` as discussed above
- `type()` overrides
- `upgrade_net_proto_text` tool
- Make the type case insensitive
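As referenced in the to-do list above, here is a hedged before/after sketch of the V1 vs. V2 syntax; the layer name and the exact enum spelling are illustrative assumptions, not quotes from this PR:

```prototxt
# V1 (old): plural "layers" field, type is an unquoted enum value.
# Parsing this against the new schema fails because a string-typed
# "type" field requires quotes -- hence the upgrade tool.
layers {
  name: "loss"          # illustrative layer name
  type: SOFTMAX_LOSS    # enum value, no quotes
}

# V2 (new): singular "layer" field, type is a quoted string.
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
}
```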