ICML 2024 "llm alignment" Papers
2 papers found
Debating with More Persuasive LLMs Leads to More Truthful Answers
Akbir Khan, John Hughes, Dan Valentine et al.
ICML 2024 · poster · arXiv:2402.06782
Long Is More for Alignment: A Simple but Tough-to-Beat Baseline for Instruction Fine-Tuning
Hao Zhao, Maksym Andriushchenko, Francesco Croce et al.
ICML 2024 · poster · arXiv:2402.04833