
remove some unused Conv constructors #1394

Merged · 1 commit · Dec 31, 2020

Conversation

@CarloLucibello (Member) commented Nov 14, 2020

I think these constructors can go; I suspect no one is using them, and they weren't even tested.
The docstrings need some love, but I won't do that here, to avoid conflicts with #1391.

As a general rule, we consistently provide `Layer(W, b)` constructors, so there is no need to also have `Layer(; weight=W, bias=b)` for an arbitrary subset of the layers.

PR Checklist

  • [ ] Tests are added
  • [x] Entry in NEWS.md
  • [ ] Documentation, if applicable
  • [ ] Final review from @dhairyagandhi96 (for API changes).

@mcabbott (Member) commented:

To be clear, the constructors this PR deletes are the all-keyword ones, keeping `Conv((3,3), 5=>7)` and `Conv(w, b; kw...)`.
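For context, a short sketch of the two constructor forms that remain after this PR (assuming a post-PR Flux, roughly v0.12; the array shapes are illustrative):

```julia
using Flux

# Filter-size + channel-pair constructor (kept):
layer1 = Conv((3, 3), 5 => 7, relu)

# Explicit weight/bias constructor (kept):
w = randn(Float32, 3, 3, 5, 7)   # (kernel_h, kernel_w, in_channels, out_channels)
b = zeros(Float32, 7)
layer2 = Conv(w, b, relu; stride=1)

# Removed by this PR: the all-keyword form Conv(; weight=w, bias=b, σ=relu),
# which now routes through a @deprecate rule instead.
```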

@CarloLucibello (Member, Author) commented:

@DhairyaLGandhi this is waiting for approval

@DhairyaLGandhi (Member) commented:

This doesn't remove any of the Dense constructors, as far as I can tell, so the NEWS entry could be updated. It also changes style, which I would prefer to keep as in the original.

Other than that, are we gaining much by removing the constructors?

@CarloLucibello (Member, Author) commented:

> Other than that, are we gaining much by removing the constructors?

No big deal, just some polishing and a consistency improvement.

@CarloLucibello CarloLucibello added this to the v0.12 milestone Nov 27, 2020
```julia
@deprecate Conv(; weight, bias, σ=identity, kws...) Conv(weight, bias, σ; kws...)
@deprecate ConvTranspose(; weight, bias, σ=identity, kws...) ConvTranspose(weight, bias, σ; kws...)
@deprecate DepthwiseConv(; weight, bias, σ=identity, kws...) DepthwiseConv(weight, bias, σ; kws...)
```
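These rules use Julia's `Base.@deprecate`, which keeps the old call working while emitting a warning. A minimal self-contained sketch of the same pattern, with hypothetical function names:

```julia
# Hypothetical new positional API:
scale_new(x, factor; kws...) = x * factor

# Forward the old keyword-only signature to the positional one;
# the first call emits a deprecation warning, then dispatches through.
@deprecate scale_old(; x, factor, kws...) scale_new(x, factor; kws...)

scale_old(x=2, factor=3)  # warns, then returns 6
```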

Review comment (Member):

extra lines

```diff
-function Dense(in::Integer, out::Integer, σ = identity;
-               initW = glorot_uniform, initb = zeros, bias=true)
+function Dense(in::Integer, out::Integer, σ=identity;
+               initW=glorot_uniform, initb=zeros, bias=true)
```
Review comment (Member):

This changes style; please use the consistent styling for all of Flux/Zygote. I can appreciate having personal preferences, but I'd prefer to stay with how we have developed it thus far.

Review comment (Member, Author):

Notice that the spacing was inconsistent before. And no blanks around `=` is not just my personal preference; it is a widely adopted standard, see Julia Base or https://github.com/invenia/BlueStyle#whitespace. I'd like to align our codebase with that.
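For illustration, the whitespace convention under discussion (BlueStyle: no spaces around `=` in keyword arguments and default values; the function names here are made up):

```julia
# Preferred (BlueStyle):
f(x; tol=1e-6, maxiter=100) = x + tol * maxiter

# Discouraged:
f2(x; tol = 1e-6, maxiter = 100) = x + tol * maxiter
```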

Review comment (Member):

Even if it's not personal preference, it's unwieldy to change these things around when we have an established pattern we use.

Review comment (Member):

Unless there's a real reason to do so, we can move ahead with the established pattern and revisit if we ever need to change it, and then do it consistently.

Review comment (Member, Author):

It's not really clear there is an established pattern. At present there isn't much consistency; see the Losses module, the dropout function, normalize, ...

Review comment (Member):

We should fix it there as well, in that case.

@CarloLucibello (Member, Author) commented:

@DhairyaLGandhi are we good here?

@CarloLucibello (Member, Author) commented:

bump

Resolved (outdated) review threads on: src/layers/conv.jl, src/layers/basic.jl, NEWS.md
@DhairyaLGandhi (Member) commented:

It seems like we either need to fix the docs, or these constructors were actually used.

@CarloLucibello force-pushed the convt3 branch 2 times, most recently from 39a6a16 to c82dfc5 on December 18, 2020
@CarloLucibello (Member, Author) commented:

I fixed some rebase mess I created, squashed the commits, and fixed the documentation and deprecation issues.
This should finally be ready to merge once CI goes green.

@CarloLucibello (Member, Author) commented:

@DhairyaLGandhi it should be ready now :)

@CarloLucibello (Member, Author) commented:

@DhairyaLGandhi bump

@DhairyaLGandhi (Member) commented:

bors r+

bors bot added a commit that referenced this pull request Dec 30, 2020
1394: remove some unused Conv constructors r=DhairyaLGandhi a=CarloLucibello

Co-authored-by: Carlo Lucibello <carlo.lucibello@gmail.com>
@CarloLucibello (Member, Author) commented:

@DhairyaLGandhi this needs a review approval; otherwise bors won't be able to merge, given the current GitHub settings.

@CarloLucibello (Member, Author) commented:

rebased

@DhairyaLGandhi (Member) commented:

bors r+

@bors (Contributor) commented Dec 31, 2020

Build succeeded!

And happy new year! 🎉

@bors bors bot merged commit ef1aa49 into master Dec 31, 2020
@CarloLucibello (Member, Author) commented:

Ahahh ah, thank you bors
