Mixture-of-Experts-Implementation

This repo implements a Mixture of Experts (MoE) layer and compares the performance of a BiLSTM model with and without the MoE layer on the CoNLL 2003 named entity recognition dataset.
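The repository's own source isn't reproduced here, but the core idea of an MoE layer is a gating network that mixes the outputs of several small expert networks. Below is a minimal PyTorch sketch of such a layer; the class name `MoELayer`, the dense softmax gating, and all dimensions are illustrative assumptions, not this repo's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    """Minimal softmax-gated Mixture of Experts layer (illustrative sketch).

    Each expert is a small feed-forward network; a gating network produces
    per-token mixture weights over the experts. Dense gating (every expert
    runs on every token) is assumed here for simplicity; many MoE variants
    instead route each token to only the top-k experts.
    """

    def __init__(self, input_dim: int, hidden_dim: int, num_experts: int = 4):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(input_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, input_dim),
            )
            for _ in range(num_experts)
        )
        self.gate = nn.Linear(input_dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_dim), e.g. BiLSTM hidden states.
        gate_weights = F.softmax(self.gate(x), dim=-1)  # (B, T, E)
        # Run every expert and stack: (B, T, E, input_dim).
        expert_outs = torch.stack([expert(x) for expert in self.experts], dim=-2)
        # Per-token weighted sum of expert outputs: (B, T, input_dim).
        return (gate_weights.unsqueeze(-1) * expert_outs).sum(dim=-2)
```

For the comparison the repo describes, such a layer would sit between the BiLSTM's hidden states and the tag classifier. A hypothetical usage (all shapes assumed):

```python
lstm = nn.LSTM(100, 128, bidirectional=True, batch_first=True)
moe = MoELayer(input_dim=256, hidden_dim=512, num_experts=4)
tagger = nn.Linear(256, 9)  # 9 tags in the standard CoNLL 2003 NER BIO scheme

tokens = torch.randn(2, 16, 100)  # (batch, seq_len, embedding_dim)
hidden, _ = lstm(tokens)          # (2, 16, 256): 128 per direction
logits = tagger(moe(hidden))      # (2, 16, 9); the baseline skips moe()
```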
