Layerwise Recurrent Router for Mixture-of-Experts

Repo Overview

We provide detailed analysis and code for comparing RMoE with other MoE methods in the ./analysis folder.

The ./llama_moe_rnn folder contains a PyTorch implementation of RMoE built on the LLaMA architecture from the Hugging Face transformers library.
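For orientation, below is a minimal sketch of the core idea behind a layerwise recurrent router: a GRU hidden state threads the routing decisions from one MoE layer into the next, so expert assignment at layer l can depend on how the token was routed at earlier layers. All names here (RecurrentRouter, router_size, etc.) are illustrative, not this repo's actual API, and details such as whether the GRU is shared across layers follow the paper's general description rather than the exact code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentRouter(nn.Module):
    """Illustrative layerwise recurrent router (not the repo's exact API).

    A GRU cell carries a per-token routing state across layers, so each
    layer's top-k expert choice can condition on earlier routing decisions.
    """

    def __init__(self, hidden_size: int, router_size: int,
                 num_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.proj_in = nn.Linear(hidden_size, router_size)  # token state -> GRU input
        self.gru = nn.GRUCell(router_size, router_size)     # recurrence over layers
        self.gate = nn.Linear(router_size, num_experts)     # routing state -> expert logits

    def forward(self, x: torch.Tensor, h_prev: torch.Tensor):
        # x: (num_tokens, hidden_size); h_prev: (num_tokens, router_size),
        # the routing state produced at the previous MoE layer.
        h = self.gru(self.proj_in(x), h_prev)
        logits = self.gate(h)                                # (num_tokens, num_experts)
        weights, experts = torch.topk(F.softmax(logits, dim=-1), self.top_k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize over top-k
        return weights, experts, h                           # h feeds the next layer's router

# Usage: the routing state h is threaded through every MoE layer.
router = RecurrentRouter(hidden_size=512, router_size=128, num_experts=8)
tokens = torch.randn(4, 512)             # 4 token representations
h = torch.zeros(4, 128)                  # initial routing state
for _ in range(3):                       # three stacked MoE layers
    weights, experts, h = router(tokens, h)

A conventional MoE router would compute logits from x alone at each layer independently; the recurrence over h is what makes the router "layerwise recurrent".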

About

Official implementation of RMoE (Layerwise Recurrent Router for Mixture-of-Experts)
