Small Mixture of Experts (MoE) trained on a pure language modelling task (WikiText-103, document level). This MoE was presented in [this paper](https://aclanthology.org/2023.findings-emnlp.49/).
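
A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub and loadable as a causal LM through `transformers`; the repository id below is a placeholder, and `trust_remote_code=True` may be needed if the MoE layers rely on custom modelling code:

```python
# Sketch only, not the official usage: assumes a causal-LM checkpoint on the Hub.
# "<org>/<this-moe-model>" is a placeholder id, substitute the actual repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<org>/<this-moe-model>"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Continue a WikiText-style prompt with plain language modelling.
inputs = tokenizer("The history of natural language processing", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```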