[Paper Review] Scalable Power Sampling: Unlocking Efficient, Training-Free Reasoning for LLMs via Distribution Sharpening
A detailed review of the paper 'Scalable Power Sampling: Unlocking Efficient, Training-Free Reasoning for LLMs via Distribution Sharpening', posted on arXiv by Haitham Bou Ammar.
#Review #LLM Reasoning #Distribution Sharpening #Power Sampling #Training-Free #Monte Carlo Estimation #Jackknife Correction #Autoregressive Generation #Inference Efficiency
January 29, 2026
[Paper Review] ReSWD: ReSTIR'd, not shaken. Combining Reservoir Sampling and Sliced Wasserstein Distance for Variance Reduction
A detailed review of the paper 'ReSWD: ReSTIR'd, not shaken. Combining Reservoir Sampling and Sliced Wasserstein Distance for Variance Reduction', posted on arXiv.
#Review #Sliced Wasserstein Distance #Reservoir Sampling #Variance Reduction #Distribution Matching #Diffusion Guidance #Color Correction #Monte Carlo Estimation
October 2, 2025