"llm hallucinations" Papers
2 papers found
Fictitious Synthetic Data Can Improve LLM Factuality via Prerequisite Learning
Yujian Liu, Shiyu Chang, Tommi Jaakkola et al.
ICLR 2025 (poster) · arXiv:2410.19290
5 citations
SECA: Semantically Equivalent and Coherent Attacks for Eliciting LLM Hallucinations
Buyun Liang, Liangzu Peng, Jinqi Luo et al.
NeurIPS 2025 (poster) · arXiv:2510.04398