Uni-MoE: Scaling Unified Multimodal LLMs with Mixture of Experts
IEEE Transactions on Pattern Analysis and Machine Intelligence (2025)
Keywords
Mixture of Experts, Multimodal Large Language Model, Unified Framework, Training Strategy, Benchmark