some questions about the net.prototxt #17
The propagate_down flag affects the inputs to this layer; it does not stop this layer from learning: https://github.com/BVLC/caffe/blob/04ab089db018a292ae48d51732dd6c66766b36b6/src/caffe/proto/caffe.proto#L347-L354
Thanks for your reply.
By the way, the Caffe code is quite hard to learn.
I am trying to rewrite your Up-Down code in PyTorch.
On 03/06/2019 22:47, Peter Anderson wrote:
The propagate_down flag affects the inputs to this layer; it does not stop this layer from learning: https://github.com/BVLC/caffe/blob/04ab089db018a292ae48d51732dd6c66766b36b6/src/caffe/proto/caffe.proto#L347-L354
The embedding layer is not pretrained and the propagate_down flag is not actually necessary here. I think it is left over from some experiments I was doing that involved backpropagating through beam search or something.
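For the PyTorch rewrite mentioned above, here is a minimal sketch of this Embed layer (a rough equivalent assuming only the dimensions given in net.prototxt; the variable names are illustrative, not from the original code):

```python
import torch
import torch.nn as nn

# Rough equivalent of the "embedding" layer: input_dim 10010, num_output 1000,
# no bias (nn.Embedding has no bias term anyway), and a gaussian weight_filler
# with std 0.01.
embedding = nn.Embedding(num_embeddings=10010, embedding_dim=1000)
nn.init.normal_(embedding.weight, mean=0.0, std=0.01)

# The bottom blob is a tensor of word indices. Integer indices are not
# differentiable, so nothing like propagate_down is needed in PyTorch; the
# embedding weights themselves still receive gradients and are trained.
words = torch.randint(0, 10010, (16, 20))  # (batch, sequence) of word ids
vectors = embedding(words)                 # shape (16, 20, 1000)
```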
@peteanderson80 HDF5-DIAG: Error detected in HDF5 (1.8.16) thread 139667420518144:
I have the same problem. Have you solved it? @zkself
@GaoYifanGHB Hi, Gao.
I have the same problem. Did you solve it? @GaoYifanGHB @zkself
Hello! I really appreciate your nice work!
I have some questions about your net.prototxt.
First,
layer {
  name: "embedding"
  type: "Embed"
  bottom: "input"
  top: "embedding"
  param {
    name: "embed_param"
  }
  propagate_down: false
  embed_param {
    num_output: 1000
    input_dim: 10010
    bias_term: false
    weight_filler {
      type: "gaussian"
      std: 0.00999999977648
    }
  }
}
Why propagate_down: false?
Does that mean the embedding layer is pretrained? I have not found any clue of that in your code.
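(As the reply above explains, it does not mean pretraining: propagate_down: false only blocks gradients to the bottom blob, while freezing a layer's weights in Caffe would instead be done with lr_mult: 0 inside the param block. In a PyTorch rewrite, a genuinely frozen, pretrained embedding would look like the following hedged sketch:)

```python
import torch.nn as nn

embedding = nn.Embedding(10010, 1000)
# Only needed if the embedding really were pretrained and frozen; the actual
# model trains its embedding from scratch, so this line would be omitted.
embedding.weight.requires_grad_(False)
```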