Contrastive Learning

Length is a Curse and a Blessing for Document-level Semantics

Contrastive learning, Semantic shift, Language models, Isotropy

At EMNLP 2023, Chenghao Xiao, Yizhi Li, G Hudson, Chenghua Lin, and Noura Al Moubayed question the length generalizability of contrastive learning-based models and introduce the LA(SER)³ document representation learning framework.

On Isotropy, Contextualization and Learning Dynamics of Contrastive-based Sentence Representation Learning

Contrastive learning, NLP, SRL, Semantics, Language models, Isotropy, Contextualization

At ACL 2023, Chenghao Xiao, Yang Long, and Noura Al Moubayed examine contrastive sentence representation learning (SRL) through the lens of isotropy, contextualization, and learning dynamics.