[Paper Review] Distribution-Aligned Sequence Distillation for Superior Long-CoT Reasoning

A detailed review of the paper "Distribution-Aligned Sequence Distillation for Superior Long-CoT Reasoning", published on arXiv.

#Review #Knowledge Distillation #Sequence-level Distillation #Chain-of-Thought Reasoning (CoT) #Large Language Models (LLMs) #Temperature-scheduled Learning #Divergence-aware Sampling #Mixed-policy Distillation #Open-source Models

January 14, 2026