[Paper Review] Routing Manifold Alignment Improves Generalization of Mixture-of-Experts LLMs

A detailed review of the paper "Routing Manifold Alignment Improves Generalization of Mixture-of-Experts LLMs" by Ziyue Li, published on arXiv.

#Review #Mixture-of-Experts (MoE) #Large Language Models (LLMs) #Router Optimization #Manifold Regularization #Generalization #Post-training Fine-tuning #Task Embedding Alignment

November 10, 2025