Hi! While it is relatively straightforward to implement higher-level constructions like LSTM and GRU, it is also nice to have a layer provided by the library, as PyTorch does, with details like bidirectionality and dropout taken care of.
Are there any plans to include this in Lux as well?
Thanks!
Sure, I will accept PRs for this. The general layer policy we use is:
If a layer is present in PyTorch or Flax, we can have it in Lux (except in some cases, like Conv2D, which is just Conv, and such).
Boltz.jl is a better place for any other layers.
In this particular case, LSTM should return an AbstractLuxWrapperLayer containing the model (which can be written using Chain, I think), and we just define a custom show method for nicer printing.
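To illustrate the suggestion above, here is a hedged sketch of what such a wrapper might look like. This is not an actual Lux implementation; the `StackedLSTM` name, the constructor signature, and the layer-stacking choices are all hypothetical, and it omits the bidirectionality and dropout handling a real PR would need:

```julia
using Lux

# Hypothetical wrapper layer: holds a Chain of Recurrence-wrapped LSTMCells.
# AbstractLuxWrapperLayer{:model} forwards parameter/state handling to `model`.
struct StackedLSTM{M} <: Lux.AbstractLuxWrapperLayer{:model}
    model::M
    in_dims::Int
    hidden_dims::Int
    num_layers::Int
end

function StackedLSTM((in_dims, hidden_dims)::Pair{<:Int,<:Int}; num_layers::Int=1)
    # All layers except the last return the full sequence so the next
    # layer can consume it; the last returns only the final hidden state.
    layers = [Recurrence(LSTMCell(ifelse(i == 1, in_dims, hidden_dims) => hidden_dims);
                         return_sequence=(i != num_layers))
              for i in 1:num_layers]
    return StackedLSTM(Chain(layers...), in_dims, hidden_dims, num_layers)
end

# Custom show method for nicer printing, as suggested above.
function Base.show(io::IO, l::StackedLSTM)
    print(io, "StackedLSTM($(l.in_dims) => $(l.hidden_dims); num_layers=$(l.num_layers))")
end
```

Because the wrapper subtypes `AbstractLuxWrapperLayer`, `Lux.setup` and the forward pass should dispatch through the inner `Chain` without any extra code; only the `show` method is customized.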