[Paper Review] Coupling Experts and Routers in Mixture-of-Experts via an Auxiliary Loss
A detailed review of the paper 'Coupling Experts and Routers in Mixture-of-Experts via an Auxiliary Loss', posted on arXiv.
#Review #Mixture-of-Experts (MoE) #Router-Expert Coupling #Auxiliary Loss #Expert Specialization #Large Language Models (LLMs) #Computational Efficiency
December 29, 2025
[Paper Review] Expertise need not monopolize: Action-Specialized Mixture of Experts for Vision-Language-Action Learning
A detailed review of the paper 'Expertise need not monopolize: Action-Specialized Mixture of Experts for Vision-Language-Action Learning' by Sijia Gu, posted on arXiv.
#Review #Vision-Language-Action (VLA) #Mixture of Experts (MoE) #Robotic Manipulation #Expert Specialization #Decoupled Routing #Load Balancing #Transfer Learning
October 17, 2025