So, coincidentally, I've also done some experiments using Mamba in place of a Transformer backbone, i.e. Decision Mamba (most recent repo here: https://github.com/lmco/DecisionMamba). Have you tried using Mamba's `inference_params` during evals (essentially Mamba's 'recurrent' mode) instead of the parallel mode with the restricted context length? It's something I was testing, but I didn't get very far with it.
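For context, a minimal sketch of what recurrent-mode eval might look like, assuming the `mamba_ssm` package's `InferenceParams` and a model whose `forward` accepts `inference_params` (the `model`, `tokens`, and `max_seqlen` names here are placeholders, not code from either repo):

```python
import torch
from mamba_ssm.utils.generation import InferenceParams

@torch.no_grad()
def recurrent_eval(model, tokens, max_seqlen):
    """Feed tokens one step at a time, reusing Mamba's cached conv/SSM states."""
    inference_params = InferenceParams(max_seqlen=max_seqlen,
                                       max_batch_size=tokens.shape[0])
    logits = None
    for t in range(tokens.shape[1]):
        step = tokens[:, t:t + 1]                       # one token per step
        out = model(step, inference_params=inference_params)
        logits = out.logits if hasattr(out, "logits") else out
        inference_params.seqlen_offset += 1             # advance the recurrent state
    return logits
```

The idea being that the per-layer states carried in `inference_params` would stand in for the long context, so eval wouldn't need the truncated parallel window; whether that actually matches training-time behavior is exactly the part I hadn't worked out.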