[Paper Review] Routing Matters in MoE: Scaling Diffusion Transformers with Explicit Routing Guidance
A detailed review of the paper "Routing Matters in MoE: Scaling Diffusion Transformers with Explicit Routing Guidance," posted on arXiv.
#Review #Mixture-of-Experts (MoE) #Diffusion Transformers (DiTs) #Routing Guidance #Semantic Specialization #Contrastive Learning #Image Generation #Flow Matching
October 29, 2025
[Paper Review] RefAM: Attention Magnets for Zero-Shot Referral Segmentation
A detailed review of the paper "RefAM: Attention Magnets for Zero-Shot Referral Segmentation," posted on arXiv by Federico Tombari.
#Review #Zero-Shot Segmentation #Referring Segmentation #Diffusion Transformers (DiTs) #Attention Mechanisms #Attention Sinks #Stop Words #Vision-Language Models #Training-Free Methods
September 29, 2025