[Paper Review] Teaching Pretrained Language Models to Think Deeper with Retrofitted Recurrence
A detailed review of the paper 'Teaching Pretrained Language Models to Think Deeper with Retrofitted Recurrence', published on arXiv.
#Review #Recurrent Language Models #Pretrained Models #Model Surgery #Curriculum Learning #Test-Time Compute Scaling #Mathematics Reasoning #Efficient Training #Depth Recurrence
November 10, 2025
[Paper Review] Chronos-2: From Univariate to Universal Forecasting
A detailed review of the paper 'Chronos-2: From Univariate to Universal Forecasting', published on arXiv.
#Review #Time Series Forecasting #Foundation Models #Pretrained Models #Transformer #In-Context Learning #Multivariate Forecasting #Covariates #Group Attention
October 21, 2025