# Layerwise Recurrent Router for Mixture-of-Experts

## Repo Overview

- `./analysis`: detailed analysis and code for comparing RMoE with other MoE methods.
- `./llama_moe_rnn`: a PyTorch implementation of RMoE built on the Llama architecture using the `transformers` library.
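To illustrate the core idea behind a layerwise recurrent router, here is a minimal NumPy sketch: the router's hidden state is carried across layers by a GRU-style recurrence, so each layer's expert-selection logits can depend on routing information from earlier layers. All class, weight, and parameter names below are illustrative assumptions, not the repo's actual API.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class RecurrentRouter:
    """Sketch of a layerwise recurrent router: expert logits at each
    layer come from a GRU hidden state carried across layers, rather
    than from an independent per-layer linear router."""

    def __init__(self, d_model, d_hidden, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.02
        # GRU parameters: update gate z, reset gate r, candidate n
        self.Wz = rng.normal(0, s, (d_model, d_hidden))
        self.Uz = rng.normal(0, s, (d_hidden, d_hidden))
        self.Wr = rng.normal(0, s, (d_model, d_hidden))
        self.Ur = rng.normal(0, s, (d_hidden, d_hidden))
        self.Wn = rng.normal(0, s, (d_model, d_hidden))
        self.Un = rng.normal(0, s, (d_hidden, d_hidden))
        # projection from the recurrent state to per-expert logits
        self.Wo = rng.normal(0, s, (d_hidden, n_experts))
        self.d_hidden = d_hidden

    def init_state(self):
        return np.zeros(self.d_hidden)

    def step(self, x, h):
        """One GRU step: x is the token representation entering the
        current layer, h is the router state from the previous layer."""
        z = sigmoid(x @ self.Wz + h @ self.Uz)
        r = sigmoid(x @ self.Wr + h @ self.Ur)
        n = np.tanh(x @ self.Wn + (r * h) @ self.Un)
        h_new = (1.0 - z) * n + z * h
        logits = h_new @ self.Wo
        return logits, h_new


# Route one token through 4 MoE layers, picking the top-1 expert per layer;
# note the same router state h threads through every layer.
router = RecurrentRouter(d_model=8, d_hidden=16, n_experts=4)
h = router.init_state()
x = np.ones(8)
choices = []
for layer in range(4):
    logits, h = router.step(x, h)
    choices.append(int(np.argmax(logits)))
print(choices)
```

The key design point this sketch highlights is that replacing independent per-layer routers with a shared recurrence lets later layers condition their expert choices on the routing history of earlier layers.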