[Paper Review] Splannequin: Freezing Monocular Mannequin-Challenge Footage with Dual-Detection Splatting
A detailed review of the paper 'Splannequin: Freezing Monocular Mannequin-Challenge Footage with Dual-Detection Splatting', posted on arXiv by Yu-Lun Liu.
#Review · #Monocular 3D Reconstruction · #Mannequin Challenge · #Dynamic Gaussian Splatting · #Freeze-Time Video · #Temporal Consistency · #Artifact Suppression · #Regularization
December 4, 2025
[Paper Review] On GRPO Collapse in Search-R1: The Lazy Likelihood-Displacement Death Spiral
A detailed review of the paper 'On GRPO Collapse in Search-R1: The Lazy Likelihood-Displacement Death Spiral', posted on arXiv by Christos Thrampoulidis.
#Review · #Reinforcement Learning (RL) · #Large Language Models (LLMs) · #Tool-Integrated Reasoning (TIR) · #GRPO · #Training Stability · #Lazy Likelihood Displacement (LLD) · #Regularization · #Search-R1
December 4, 2025
[Paper Review] Decoupled DMD: CFG Augmentation as the Spear, Distribution Matching as the Shield
A detailed review of the paper 'Decoupled DMD: CFG Augmentation as the Spear, Distribution Matching as the Shield', posted on arXiv.
#Review · #Diffusion Models · #Model Distillation · #Classifier-Free Guidance (CFG) · #Distribution Matching · #Text-to-Image Generation · #Few-step Generation · #Regularization · #Score-based Models
November 30, 2025
[Paper Review] Frequency-Adaptive Sharpness Regularization for Improving 3D Gaussian Splatting Generalization
A detailed review of the paper 'Frequency-Adaptive Sharpness Regularization for Improving 3D Gaussian Splatting Generalization', posted on arXiv by Youngjung Uh.
#Review · #3D Gaussian Splatting · #Generalization · #Sharpness-Aware Minimization · #Regularization · #Novel View Synthesis · #Sparse View Reconstruction · #Loss Landscape · #Frequency-Adaptive
November 26, 2025
[Paper Review] Visual Representation Alignment for Multimodal Large Language Models
A detailed review of the paper 'Visual Representation Alignment for Multimodal Large Language Models', posted on arXiv by Heeseong Shin.
#Review · #Multimodal LLMs · #Visual Representation Alignment · #Foundation Models · #Regularization · #Fine-grained Visual Understanding · #Spatial Reasoning · #Object Counting · #Vision-Language Models
September 10, 2025
[Paper Review] Metis: Training Large Language Models with Advanced Low-Bit Quantization
A detailed review of the paper 'Metis: Training Large Language Models with Advanced Low-Bit Quantization', posted on arXiv by Hengjie Cao.
#Review · #Low-Bit Quantization · #LLMs · #Spectral Decomposition · #Anisotropy · #Adaptive Learning Rate · #Regularization · #FP8 Training · #FP4 Training
September 3, 2025
[Paper Review] DeCRED: Decoder-Centric Regularization for Encoder-Decoder Based Speech Recognition
A detailed review of the paper 'DeCRED: Decoder-Centric Regularization for Encoder-Decoder Based Speech Recognition', posted on arXiv by Lukáš Burget.
#Review · #Speech Recognition · #Encoder-Decoder · #Regularization · #Decoder-Centric · #Intermediate Supervision · #Out-of-Domain Generalization · #Internal Language Model
August 13, 2025