Bridging the Semantic Gap Between Text and Table: A Case Study on NL2SQL

4 citations · ranked #1634 of 3827 papers in ICLR 2025

Abstract

The rise of Large Language Models (LLMs) has revolutionized numerous domains, yet these models still exhibit weaknesses in understanding structured tabular data. Although the growing context window promises to accommodate a larger volume of table contents, it does not inherently improve the model's ability to understand the underlying structure and semantics of tabular data. To bridge the semantic gap between Text and Table, we propose TnT, a table-language model that features multimodal table representations to empower LLMs to effectively and efficiently abstract structure-enriched semantics from tabular data. TnT also introduces a scalable and efficient training pipeline, featuring novel self-supervised tasks, to integrate abstract tabular knowledge into the language modality. Extensive experimental results on NL2SQL demonstrate the much better table understanding of TnT, which achieves up to 14.4% higher execution accuracy compared with traditional text-based table representations.
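
For context, the sketch below illustrates the "traditional text-based table representation" baseline that the abstract contrasts TnT against: the table is flattened into plain text and embedded in an NL2SQL prompt, so the model sees no explicit row/column structure. This is a minimal, hypothetical illustration of the baseline setup, not code from the paper; the function names, sample table, and prompt wording are all assumptions.

```python
from typing import Sequence


def serialize_table(name: str,
                    columns: Sequence[str],
                    rows: Sequence[Sequence[object]],
                    max_rows: int = 3) -> str:
    """Flatten a table into a pipe-separated text block (a common text-only format)."""
    header = " | ".join(columns)
    body = "\n".join(" | ".join(str(v) for v in row) for row in rows[:max_rows])
    return f"Table: {name}\n{header}\n{body}"


def build_nl2sql_prompt(question: str, table_text: str) -> str:
    """Assemble a plain-text NL2SQL prompt around the serialized table."""
    return (
        f"{table_text}\n\n"
        f"Question: {question}\n"
        "Write a SQL query that answers the question.\nSQL:"
    )


if __name__ == "__main__":
    # Hypothetical example table; real NL2SQL benchmarks supply schemas and rows.
    table_text = serialize_table(
        name="employees",
        columns=["id", "name", "department", "salary"],
        rows=[(1, "Ada", "Research", 120000), (2, "Lin", "Sales", 90000)],
    )
    prompt = build_nl2sql_prompt(
        "Which department has the highest average salary?", table_text
    )
    # The flat string below is all a text-only LLM receives about the table.
    print(prompt)
```

The semantic gap the abstract refers to arises because this serialization discards the two-dimensional structure of the table; TnT's multimodal table representations are proposed as an alternative to feeding the model such flat text.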

Citation History

Jan 25, 2026: 4 citations