
Support/examples for converting or embedding Keras RNNs #204

Open
arvoelke opened this issue Feb 19, 2021 · 1 comment

Comments

@arvoelke
Contributor

arvoelke commented Feb 19, 2021

Related:

In order to get a Keras RNN to work inside of NengoDL, one needs to use a work-around where time is removed from the Nengo model (#122 (comment)), which currently isn't a documented solution. It's also not a general solution, for instance if one wants to integrate the RNN with other Nengo models. As a use case, suppose I want to take a keras_lmu.LMUCell and hook it up to a Nengo subnetwork that filters and processes the outputs of the RNN one step at a time. One could rewrite the keras_lmu.LMUCell using Nengo primitives, as in the NengoDL LMU example. But I suspect it might be possible to embed the LMUCell within a TensorNode and call it one step at a time, or something similar. If so, an example of this would be especially nice, so that there is some support for this sort of use case.

@drasmuss
Member

It's not currently possible to use an RNN cell inside a TensorNode (because you need some kind of mechanism to manage the state, fulfilling the role usually played by the Keras RNN layer). But adding support for that is definitely on the TODO list!
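To make the state-management point concrete, here is a minimal sketch (plain NumPy, not NengoDL or keras_lmu code) of what "managing the state" of an RNN cell means when stepping it one timestep at a time — the bookkeeping normally performed by the Keras RNN layer wrapping the cell. All names, weights, and dimensions here are illustrative assumptions, not part of either library's API:

```python
import numpy as np

rng = np.random.RandomState(0)
n_input, n_hidden, n_steps = 3, 4, 5

# Illustrative weights for a vanilla RNN cell: h' = tanh(W x + U h + b).
# A real cell (e.g. an LMU cell) has its own parameterization, but the
# (input, state) -> (output, new state) interface is the same idea.
W = rng.randn(n_hidden, n_input) * 0.1
U = rng.randn(n_hidden, n_hidden) * 0.1
b = np.zeros(n_hidden)

def cell_step(x, h):
    """One application of the cell: maps (input, state) -> (output, new state)."""
    h_new = np.tanh(W @ x + U @ h + b)
    return h_new, h_new

# The caller owns and threads the state between steps. In Keras this loop
# is the RNN layer; a TensorNode would need an equivalent mechanism to
# carry `h` across simulation timesteps.
h = np.zeros(n_hidden)
outputs = []
for t in range(n_steps):
    x_t = rng.randn(n_input)
    y_t, h = cell_step(x_t, h)
    outputs.append(y_t)

outputs = np.stack(outputs)
print(outputs.shape)  # (5, 4)
```

The loop makes explicit why a stateless TensorNode can't host the cell on its own: something outside the per-step function has to persist `h` from one call to the next.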
