"natural language processing" Papers
12 papers found
AdaptDel: Adaptable Deletion Rate Randomized Smoothing for Certified Robustness
Zhuoqun Huang, Neil Marchant, Olga Ohrimenko et al.
NeurIPS 2025 poster, arXiv:2511.09316
Zero-Shot Performance Prediction for Probabilistic Scaling Laws
Viktoria Schram, Markus Hiller, Daniel Beck et al.
NeurIPS 2025 poster, arXiv:2510.16743
Breaking through the learning plateaus of in-context learning in Transformer
Jingwen Fu, Tao Yang, Yuwang Wang et al.
ICML 2024 poster
Conformal Autoregressive Generation: Beam Search with Coverage Guarantees
Nicolas Deutschmann, Marvin Alberts, María Rodríguez Martínez
AAAI 2024 paper, arXiv:2309.03797
CurBench: Curriculum Learning Benchmark
Yuwei Zhou, Zirui Pan, Xin Wang et al.
ICML 2024 poster
Defense against Backdoor Attack on Pre-trained Language Models via Head Pruning and Attention Normalization
Xingyi Zhao, Depeng Xu, Shuhan Yuan
ICML 2024 poster
Dialogue for Prompting: A Policy-Gradient-Based Discrete Prompt Generation for Few-Shot Learning
Chengzhengxu Li, Xiaoming Liu, Yichen Wang et al.
AAAI 2024 paper, arXiv:2308.07272
7 citations
Few-Shot Character Understanding in Movies as an Assessment to Meta-Learning of Theory-of-Mind
Mo Yu, Qiujing Wang, Shunchi Zhang et al.
ICML 2024 poster
OptiMUS: Scalable Optimization Modeling with (MI)LP Solvers and Large Language Models
Ali AhmadiTeshnizi, Wenzhi Gao, Madeleine Udell
ICML 2024 poster
Removing Spurious Concepts from Neural Network Representations via Joint Subspace Estimation
Floris Holstege, Bram Wouters, Noud van Giersbergen et al.
ICML 2024 poster
Revisiting Character-level Adversarial Attacks for Language Models
Elias Abad Rocamora, Yongtao Wu, Fanghui Liu et al.
ICML 2024 poster
SpikeZIP-TF: Conversion is All You Need for Transformer-based SNN
Kang You, Zekai Xu, Chen Nie et al.
ICML 2024 poster