Language Models

Synthesizing Proteins on the Graphics Card. Protein Folding and the Limits of Critical AI Studies

Transformer architecture · Protein folding · Language models · LLMs

Fabian Offert, Paul Kim, and Qiaoyu Cai look at Meta's ESM-2 protein folding 'language model', asking what kind of knowledge the transformer architecture produces.

Length is a Curse and a Blessing for Document-level Semantics

Contrastive learning · Semantic shift · Language models · Isotropy

At EMNLP 2023, Chenghao Xiao, Yizhi Li, G Hudson, Chenghua Lin, and Noura Al Moubayed question the length generalizability of contrastive learning-based models and introduce the LA(SER)³ document representation learning framework.

On Isotropy, Contextualization and Learning Dynamics of Contrastive-based Sentence Representation Learning

Contrastive learning · NLP · SRL · Semantics · Language models · Isotropy · Contextualization

Chenghao Xiao, Yang Long, and Noura Al Moubayed examine contrastive sentence representation learning (SRL) through the lens of isotropy, contextualization, and learning dynamics at ACL 2023.