Ga Wu
7 Papers · 47 Total Citations
Papers (7)
Representation Entanglement for Generation: Training Diffusion Transformers Is Much Easier Than You Think
NeurIPS 2025 · arXiv
27 citations
LayoutDETR: Detection Transformer Is a Good Multimodal Layout Designer
ECCV 2024 · arXiv
14 citations
GeoDynamics: A Geometric State‑Space Neural Network for Understanding Brain Dynamics on Riemannian Manifolds
NeurIPS 2025 · arXiv
2 citations
Data-centric Prediction Explanation via Kernelized Stein Discrepancy
ICLR 2025 · arXiv
2 citations
LEDiT: Your Length-Extrapolatable Diffusion Transformer without Positional Encoding
NeurIPS 2025 · arXiv
1 citation
Learning Robust Representations with Long-Term Information for Generalization in Visual Reinforcement Learning
ICLR 2025
1 citation
Accident Anticipation via Temporal Occurrence Prediction
NeurIPS 2025 · arXiv
0 citations