aaronrockmenezes/Mixture-of-Experts-Implementation

This repo implements a Mixture of Experts (MoE) layer and compares the performance of a BiLSTM model with and without the MoE layer on the CoNLL 2003 dataset.
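For a rough idea of what such a layer looks like, here is a minimal PyTorch sketch of a token-level MoE layer with a softmax gating network over a few feed-forward experts. The class name, expert count, hidden sizes, and the 9-tag output (a common CoNLL 2003 NER label set) are illustrative assumptions, not the repo's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Illustrative token-level Mixture of Experts layer.

    Each expert is a small feed-forward network; a softmax gate mixes
    their outputs per token. Hyperparameters are placeholders, not the
    values used in this repo.
    """

    def __init__(self, input_dim, output_dim, num_experts=4, hidden_dim=128):
        super().__init__()
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(input_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, output_dim),
            )
            for _ in range(num_experts)
        ])
        self.gate = nn.Linear(input_dim, num_experts)

    def forward(self, x):
        # x: (batch, seq_len, input_dim), e.g. BiLSTM hidden states
        gate_scores = F.softmax(self.gate(x), dim=-1)                     # (B, T, E)
        expert_outs = torch.stack([e(x) for e in self.experts], dim=-2)   # (B, T, E, D_out)
        # Per-token weighted sum of the expert outputs
        return (gate_scores.unsqueeze(-1) * expert_outs).sum(dim=-2)      # (B, T, D_out)


# Example: plug the MoE layer on top of a BiLSTM encoder for sequence tagging
if __name__ == "__main__":
    bilstm = nn.LSTM(input_size=100, hidden_size=64, bidirectional=True, batch_first=True)
    moe = MoELayer(input_dim=128, output_dim=9)   # e.g. 9 BIO tags for CoNLL 2003 NER
    tokens = torch.randn(2, 10, 100)              # (batch=2, seq_len=10, emb_dim=100)
    encoded, _ = bilstm(tokens)                   # (2, 10, 128)
    logits = moe(encoded)                         # (2, 10, 9)
    print(logits.shape)
```

Replacing the plain linear classification head of a BiLSTM tagger with a layer like this is one way to run the "with MoE" vs. "without MoE" comparison the repo describes.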