MoE

Creator: Alan Jo
Created: 2023 Apr 12 14:20
Editor: Alan Jo
Edited: 2024 May 8 17:19

Mixture-of-Experts

Mixture-of-Experts (MoE) models improve efficiency by activating only a small subset of model weights for a given input, decoupling total model size from per-token inference cost.
MoEs have seen great success in LLMs. In a nutshell, MoEs pre-train faster and serve inference faster than dense models of comparable quality, but they require more memory and are harder to fine-tune.
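A minimal sketch of the sparse-routing idea, assuming a PyTorch setting: a learned gate scores the experts for each token, only the top-k experts are run, and their outputs are combined with the renormalized gate weights. The class name TopKMoE and all sizes (d_model, d_ff, num_experts, k) are illustrative assumptions, not taken from any particular model.

```python
# Illustrative sparse MoE layer with top-k routing (a sketch, not a reference implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, k=2):
        super().__init__()
        self.k = k
        # Router: one score per expert for each token.
        self.gate = nn.Linear(d_model, num_experts)
        # Each expert is an ordinary feed-forward block.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x):                                    # x: (num_tokens, d_model)
        scores = self.gate(x)                                # (num_tokens, num_experts)
        topk_scores, topk_idx = scores.topk(self.k, dim=-1)  # keep k experts per token
        weights = F.softmax(topk_scores, dim=-1)             # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            idx = topk_idx[:, slot]                          # chosen expert id per token
            w = weights[:, slot].unsqueeze(-1)
            for e, expert in enumerate(self.experts):
                mask = idx == e                              # tokens routed to expert e
                if mask.any():
                    out[mask] += w[mask] * expert(x[mask])   # only these experts' weights are used
        return out

# Only k of num_experts expert blocks run per token, so compute per token stays
# roughly constant while total parameter count grows with num_experts.
tokens = torch.randn(16, 512)
layer = TopKMoE()
print(layer(tokens).shape)  # torch.Size([16, 512])
```

Production MoE layers also add a load-balancing auxiliary loss and per-expert capacity limits so tokens spread evenly across experts; those are omitted here for brevity.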
MoE Notion

Structure

1991 — Adaptive Mixtures of Local Experts (Jacobs et al.), the original MoE formulation.
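For reference, the classic 1991 formulation is a dense (soft) mixture: every expert contributes, weighted by a softmax gate. Modern sparse MoE layers keep only the top-k gate values. The notation below (gate weights W_g, experts E_i) is generic rather than taken from a specific paper.

```latex
% Classic mixture-of-experts output: a gate g(x) mixes n expert outputs E_i(x).
y(x) = \sum_{i=1}^{n} g_i(x)\, E_i(x),
\qquad g(x) = \operatorname{softmax}(W_g x)
```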
