[Paper Review] QuantVLA: Scale-Calibrated Post-Training Quantization for Vision-Language-Action Models
A detailed review of the paper 'QuantVLA: Scale-Calibrated Post-Training Quantization for Vision-Language-Action Models', posted on arXiv by Xin Wang.
#Review · #Post-Training Quantization (PTQ) · #Vision-Language-Action (VLA) Models · #Diffusion Transformer (DiT) · #Scale Calibration · #Memory Efficiency · #Robotics · #Low-Bit Quantization
February 24, 2026

[Paper Review] BPDQ: Bit-Plane Decomposition Quantization on a Variable Grid for Large Language Models
A detailed review of the paper 'BPDQ: Bit-Plane Decomposition Quantization on a Variable Grid for Large Language Models', posted on arXiv.
#Review · #Quantization · #Large Language Models · #Post-Training Quantization · #Bit-Plane Decomposition · #Variable Quantization Grid · #Low-Bit Quantization · #Model Compression · #Hessian-Induced Geometry
February 15, 2026

[Paper Review] SignRoundV2: Closing the Performance Gap in Extremely Low-Bit Post-Training Quantization for LLMs
A detailed review of the paper 'SignRoundV2: Closing the Performance Gap in Extremely Low-Bit Post-Training Quantization for LLMs', posted on arXiv.
#Review · #Post-Training Quantization (PTQ) · #Large Language Models (LLMs) · #Low-Bit Quantization · #Mixed-Precision Quantization · #Sensitivity Metric · #Quantization Scale Initialization · #Accuracy Preservation
December 4, 2025

[Paper Review] Metis: Training Large Language Models with Advanced Low-Bit Quantization
A detailed review of the paper 'Metis: Training Large Language Models with Advanced Low-Bit Quantization', posted on arXiv by Hengjie Cao.
#Review · #Low-Bit Quantization · #LLMs · #Spectral Decomposition · #Anisotropy · #Adaptive Learning Rate · #Regularization · #FP8 Training · #FP4 Training
September 3, 2025