https://arxiv.org/abs/2112.05820

Building a great multi-lingual teacher with sparsely-gated mixture of experts for speech recognition (Kenichi Kumatani, Robert Gmyr, Felipe Cruz Salinas, Linquan Liu, Wei Zuo, Devang Patel, Eric Sun, Yu Shi)

Another result applying MoE to multilingual ASR has just appeared. MoE could be an efficient approach as a meeting point, or compromise, between language-specific models and a single multilingual model.
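For reference, a minimal sketch of a sparsely-gated MoE layer in the spirit of Shazeer et al. (2017), which this line of work builds on. This is a generic top-k routing illustration, not the paper's exact architecture; the class name `SparseMoE` and all hyperparameters here are assumptions for the example.

```python
# Generic sketch of a sparsely-gated mixture-of-experts layer
# (top-k routing); illustrative only, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model, d_hidden, num_experts=8, k=2):
        super().__init__()
        self.k = k
        # Gating network scores each expert per token.
        self.gate = nn.Linear(d_model, num_experts)
        # Each expert is a simple feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.ReLU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (batch, d_model)
        logits = self.gate(x)                       # (batch, num_experts)
        weights, idx = logits.topk(self.k, dim=-1)  # route each token to k experts
        weights = F.softmax(weights, dim=-1)        # renormalize over selected experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e            # tokens sent to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

moe = SparseMoE(d_model=256, d_hidden=1024)
y = moe(torch.randn(32, 256))  # (32, 256)
```

The key property is that only k of the experts run per token, so capacity scales with the number of experts while per-token compute stays roughly constant, which is what makes MoE attractive as a middle ground between one shared multilingual model and many per-language models.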

#moe #asr