Reverts #1557 #1561
Conversation
bors r+
I believe this is correct, other than the one test.
@@ -40,17 +40,16 @@ import Flux: activations
 @test Dense(rand(Float16, 100,10), true).bias isa Vector{Float16} # creates matching type
 @test_skip Dense(rand(Float16, 100,10), rand(100)).bias isa Vector{Float16} # converts to match

-@test_skip Dense(3,4; init=Base.randn, bias=true).bias isa Vector{Float64}
+@test Dense(3,4; init=Base.randn, bias=true).bias isa Vector{Float64}
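For context, a minimal sketch of the constructor call under dispute, assuming the keyword API shown in the diff and a weight field mirroring the .bias field the tests use:

using Flux

layer = Dense(3, 4; init=Base.randn, bias=true)

eltype(layer.weight)            # Float64, since Base.randn defaults to Float64
layer.bias isa Vector{Float64}  # exactly what the restored @test asserts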
This should be a @test_skip, right? The correct eltype passing is an accident here (because bias = true uses zeros, which happens to be Float64).
No. This was in #1558, and it uses Flux.zeros (not Base.zeros), which produces Float32 by default.

julia> Flux.zeros(4)
4-element Array{Float32,1}:
 0.0
 0.0
 0.0
 0.0
I agree that the test isn't explicit about its intent. This should be rewritten.
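For contrast, Base.zeros defaults to Float64, which is the source of the mix-up (same pre-1.6 display style as above):

julia> Base.zeros(4)
4-element Array{Float64,1}:
 0.0
 0.0
 0.0
 0.0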
Ah I see, okay we'll address it separately. Thanks!
Reverting here is good; we should produce a Float64 bias when the weights produced are Float64.
we should produce float64 bias when the weights produced are float64

No. We should do as we are told by the user. Flux is meant to be hackable. While it is valid to have defaults, it is required to have overrides at different levels.
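To make those override levels concrete, here is a sketch built only from constructor forms that already appear in this test file; the defaults themselves are the point under debate:

using Flux

Dense(3, 4)                               # library defaults for init and bias
Dense(3, 4; init=Base.randn, bias=true)   # override the weight initializer
Dense(rand(Float16, 100, 10), true)       # supply the weights yourself
Dense(rand(Float16, 100, 10), rand(100))  # supply both weights and bias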
I'm referring to the specific case tested by the line we are commenting on here. When doing Dense(3,4; init=Base.randn, bias=true), the bias should be Float64.
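A hypothetical helper sketching the behaviour argued for here, i.e. a bias=true that allocates zeros matching the weight eltype; make_bias is an illustrative name, not Flux API:

# Illustrative only: allocate a zero bias whose eltype follows the weights.
make_bias(weight::AbstractArray, bias::Bool) =
    bias ? fill!(similar(weight, size(weight, 1)), 0) : false

W = Base.randn(4, 3)    # Float64 weights, as with init=Base.randn
b = make_bias(W, true)  # 4-element Vector{Float64} of zeros, matching W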
bors r+
Okay so do I need to hit approve or is
Build succeeded: