
LogLayer #2090

Merged: jeffdonahue merged 1 commit into BVLC:master on Jun 3, 2015

Conversation

jeffdonahue (Contributor, Author)

This adds LogLayer, a NeuronLayer which by default takes the natural log of its inputs. It's designed analogously to ExpLayer and PowerLayer. (In general it computes log_{\gamma}(\alpha x + \beta), configured via log_param { base: \gamma scale: \alpha shift: \beta }.)
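To make the description concrete, here is a standalone sketch of the elementwise computation (an illustration only, not the layer's actual Blob-based code; the base/scale/shift names come from the log_param message above, and treating an unset base as the natural log is an assumption of this sketch):

```cpp
#include <cmath>
#include <cstddef>

// Sketch of LogLayer's forward pass: y = log_base(scale * x + shift).
// A positive base is handled by the change-of-base identity
// log_b(z) = ln(z) / ln(b); otherwise we fall back to the natural log
// (assumed default for this sketch). The real layer operates on Blobs
// and dispatches to vsLn/vdLn-style routines like the one in the diff
// discussed below.
void log_layer_forward(const float* x, float* y, std::size_t n,
                       float base, float scale, float shift) {
  const float inv_log_base = (base > 0.0f) ? 1.0f / std::log(base) : 1.0f;
  for (std::size_t i = 0; i < n; ++i) {
    y[i] = std::log(scale * x[i] + shift) * inv_log_base;
  }
}
```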

jeffdonahue changed the title from Log Layer to LogLayer on Mar 10, 2015
longjon (Contributor) commented Mar 10, 2015

@pannous I do think LogLayer is the best name here: we have ExpLayer, not ExponentiationLayer; log is the commonly used mathematical abbreviation; and Caffe has distinct mechanisms for logging and an existing nomenclature for output layers.

@@ -33,6 +33,7 @@ extern "C" {

DEFINE_VSL_UNARY_FUNC(Sqr, y[i] = a[i] * a[i]);
DEFINE_VSL_UNARY_FUNC(Exp, y[i] = exp(a[i]));
DEFINE_VSL_UNARY_FUNC(Ln, y[i] = log(a[i]));
Contributor (review comment on this diff):
I kinda prefer the name log here, sticking with the name of the elementwise call, but either one is okay...

jeffdonahue (Contributor, Author):

not my choice -- the MKL function is vsLn/vdLn.

Contributor:

Oh. :(
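For context, here is roughly what the macro stamps out (a sketch assuming the DEFINE_VSL_UNARY_FUNC pattern visible in the diff, not the verbatim header): the generated wrappers are named vsLn and vdLn after MKL's single- and double-precision VML routines, which is why the Ln spelling wins over log.

```cpp
#include <cmath>

// Approximate expansion of DEFINE_VSL_UNARY_FUNC(Ln, y[i] = log(a[i]))
// when MKL is not available: a generic elementwise loop plus float and
// double wrappers whose names mirror MKL's vsLn/vdLn, so call sites
// compile unchanged whether or not MKL is linked.
template <typename Dtype>
void vLn(const int n, const Dtype* a, Dtype* y) {
  for (int i = 0; i < n; ++i) { y[i] = std::log(a[i]); }
}
inline void vsLn(const int n, const float* a, float* y) {
  vLn<float>(n, a, y);
}
inline void vdLn(const int n, const double* a, double* y) {
  vLn<double>(n, a, y);
}
```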

longjon (Contributor) commented May 14, 2015

This looks good to me as a counterpart to ExpLayer.

As a more general issue affecting both of these, however, I wonder if this is the best way to resolve the tension between granularity and functionality. As far as I can tell, these layers could just implement simple exp and log, with scale and shift factored out (in-place, if desired), and with base changes factored out, as that's how the implementations work already...
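The factoring longjon alludes to is just the change-of-base identity plus an affine pre-transform; written out (a one-line derivation, not text from the PR):

```latex
% Only the bare natural log is essential: scale/shift form an affine
% pre-transform, and the base change is a constant post-scale.
\[
  \log_{\gamma}(\alpha x + \beta)
    \;=\; \frac{\ln(\alpha x + \beta)}{\ln \gamma}
\]
```

So a bare LogLayer, preceded by a shift/scale step (in-place, if desired) and followed by a constant 1/ln(γ) scaling, recovers the general form.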

jeffdonahue (Contributor, Author)

@longjon yeah, I put the shift/scale fields in to match the existing Exp and Power layers, and because in my own usage I often end up wanting them. But, especially with your Python layer specs making it more convenient to compose simpler layers, it might be time to factor them out (i.e., create ShiftLayer and ScaleLayer and remove those fields from Exp, Power, and Log). I guess my preference would be to merge this as it's consistent with what we have now, then follow up with (or let someone else follow up with...) that refactoring later, but maybe it's better not to add to the existing mess -- I'll leave that up to you.

shelhamer (Member)

Looks good to me.

> it might be time to factor them out (i.e. create ShiftLayer and ScaleLayer and remove those fields from Exp, Power, and Log). I guess my preference would be to merge this as it's consistent with what we have now, then follow up with (or let someone else follow up with...) that refactoring later

This is fine. It could make for a nice warm-up PR in the future.

@jeffdonahue rebase and merge away.

jeffdonahue added a commit that referenced this pull request on Jun 3, 2015

jeffdonahue merged commit 2d137e1 into BVLC:master on Jun 3, 2015
jeffdonahue (Contributor, Author)

Thanks for the review @shelhamer and @longjon!

jeffdonahue deleted the log-layer branch on June 3, 2015 at 02:36