
UniMax: Fairer and More Effective Language Sampling for Large-Scale Multilingual Pretraining.

ICLR 2023

Cited: 57 | Views: 235
Keywords: multilingual, pretraining, language models, language sampling, language distribution, low-resource languages, overfitting