2024 Poster "AI Alignment" Papers
2 papers found
AI Alignment with Changing and Influenceable Reward Functions
Micah Carroll, Davis Foote, Anand Siththaranjan et al.
ICML 2024 (Poster), arXiv:2405.17713
Position: Social Choice Should Guide AI Alignment in Dealing with Diverse Human Feedback
Vincent Conitzer, Rachel Freedman, Jobst Heitzig et al.
ICML 2024 (Poster)