[Paper Review] Rewiring Experts on the Fly: Continuous Rerouting for Better Online Adaptation in Mixture-of-Expert Models
A detailed review of the paper "Rewiring Experts on the Fly: Continuous Rerouting for Better Online Adaptation in Mixture-of-Expert Models," posted on arXiv by Shiwei Liu.
#Review #Mixture-of-Experts (MoE) #Online Adaptation #Test-Time Adaptation (TTA) #Expert Routing #Large Language Models (LLMs) #Self-Supervision #Computational Efficiency #Context Shift Robustness
October 20, 2025
[Paper Review] Adapting Vision-Language Models Without Labels: A Comprehensive Survey
A detailed review of the paper "Adapting Vision-Language Models Without Labels: A Comprehensive Survey," posted on arXiv by Eleni Chatzi.
#Review #Vision-Language Models (VLMs) #Unsupervised Adaptation #Test-Time Adaptation (TTA) #Domain Transfer #Multimodal Learning #Label-Free Learning #Zero-Shot Learning
August 11, 2025