[Paper Review] Higher-order Linear Attention
A detailed review of the paper 'Higher-order Linear Attention', posted on arXiv.
#Review #Linear Attention #Higher-order Interactions #Causal Streaming #Associative Scans #Prefix Summaries #Transformer Architectures #State Space Models
November 9, 2025
[Paper Review] Efficient Parallel Samplers for Recurrent-Depth Models and Their Connection to Diffusion Language Models
A detailed review of the paper 'Efficient Parallel Samplers for Recurrent-Depth Models and Their Connection to Diffusion Language Models', posted on arXiv.
#Review #Recurrent-Depth Models #Diffusion Forcing #Parallel Sampling #LLM Inference Acceleration #Transformer Architectures #Generative AI #Latent Space Diffusion
October 17, 2025
[Paper Review] Thinking While Listening: Simple Test Time Scaling For Audio Classification
A detailed review of the paper 'Thinking While Listening: Simple Test Time Scaling For Audio Classification', posted on arXiv by Mert Pilanci.
#Review #Audio Classification #Test-Time Scaling #Reasoning Traces #Large Language Models (LLMs) #Transformer Architectures #Zero-shot Reasoning #Computational Efficiency
September 26, 2025
[Paper Review] Beyond Memorization: Extending Reasoning Depth with Recurrence, Memory and Test-Time Compute Scaling
A detailed review of the paper 'Beyond Memorization: Extending Reasoning Depth with Recurrence, Memory and Test-Time Compute Scaling', posted on arXiv by Daniil Orel.
#Review #Reasoning Depth #Cellular Automata #Transformer Architectures #Recurrence #Adaptive Computation Time #Chain-of-Thought #Reinforcement Learning #Generalization
August 26, 2025
[Paper Review] On the Expressiveness of Softmax Attention: A Recurrent Neural Network Perspective
A detailed review of the paper 'On the Expressiveness of Softmax Attention: A Recurrent Neural Network Perspective', posted on arXiv by Eric C. Larson.
#Review #Softmax Attention #Linear Attention #Recurrent Neural Networks (RNNs) #Taylor Series Expansion #Attention Mechanisms #Expressiveness #Transformer Architectures
August 2, 2025