Xiaopeng Hong
22 Papers · 83 Total Citations

Papers (22)
GLAD: Towards Better Reconstruction with Global and Local Adaptive Diffusion Models for Unsupervised Anomaly Detection
ECCV 2024 (arXiv) · 57 citations
T2ICount: Enhancing Cross-modal Understanding for Zero-Shot Counting
CVPR 2025 · 8 citations
Reshaping the Online Data Buffering and Organizing Mechanism for Continual Test-Time Adaptation
ECCV 2024 · 7 citations
Specifying What You Know or Not for Multi-Label Class-Incremental Learning
AAAI 2025 · 5 citations
OpenSDI: Spotting Diffusion-Generated Images in the Open World
CVPR 2025 · 4 citations
DCA: Dividing and Conquering Amnesia in Incremental Object Detection
AAAI 2025 · 2 citations
Image-to-Image Translation via Hierarchical Style Disentanglement
CVPR 2021 (arXiv) · 0 citations
Boosting Crowd Counting via Multifaceted Attention
CVPR 2022 (arXiv) · 0 citations
Remote Heart Rate Measurement From Highly Compressed Facial Videos: An End-to-End Deep Learning Solution With Video Enhancement
ICCV 2019 · 0 citations
Structured Modeling of Joint Deep Feature and Prediction Refinement for Salient Object Detection
ICCV 2019 · 0 citations
Universal Perturbation Attack Against Image Retrieval
ICCV 2019 · 0 citations
Bayesian Loss for Crowd Count Estimation With Point Supervision
ICCV 2019 · 0 citations
Towards a Universal Model for Cross-Dataset Crowd Counting
ICCV 2021 · 0 citations
Aha! Adaptive History-Driven Attack for Decision-Based Black-Box Models
ICCV 2021 · 0 citations
Topology-Preserving Class-Incremental Learning
ECCV 2020 · 0 citations
Noise-Aware Fully Webly Supervised Object Detection
CVPR 2020 · 0 citations
Free Lunch Enhancements for Multi-modal Crowd Counting
CVPR 2025 · 0 citations
Few-Shot Audio-Visual Class-Incremental Learning with Temporal Prompting and Regularization
AAAI 2025 · 0 citations
ComprehendEdit: A Comprehensive Dataset and Evaluation Framework for Multimodal Knowledge Editing
AAAI 2025 · 0 citations
Gramformer: Learning Crowd Counting via Graph-Modulated Transformer
AAAI 2024 · 0 citations
Few-Shot Class-Incremental Learning
CVPR 2020 (arXiv) · 0 citations
S-Prompts Learning with Pre-trained Transformers: An Occam’s Razor for Domain Incremental Learning
NeurIPS 2022 · 0 citations