New survey paper on efficient methods for natural language processing published in TACL

Our new survey paper, “Efficient Methods for Natural Language Processing: A Survey,” has been published in Transactions of the Association for Computational Linguistics (TACL). It surveys the landscape of efficient techniques in NLP. The paper was co-authored with a large team of colleagues from IST/U. of Lisbon and Instituto de Telecomunicações, Technical University of Darmstadt, Stony Brook University, Berliner Hochschule für Technik, University of Washington, University of Southern California, The Hebrew University of Jerusalem, University of Edinburgh, Cohere For AI, University of North Carolina at Chapel Hill, Unbabel, University of Bristol, IBM Research, Allen Institute for AI, Carnegie Mellon University, and IT University of Copenhagen.

Marcos Treviso, Ji-Ung Lee, Tianchu Ji, Betty van Aken, Qingqing Cao, Manuel R. Ciosici, Michael Hassid, Kenneth Heafield, Sara Hooker, Colin Raffel, Pedro H. Martins, André F. T. Martins, Jessica Zosa Forde, Peter Milder, Edwin Simpson, Noam Slonim, Jesse Dodge, Emma Strubell, Niranjan Balasubramanian, Leon Derczynski, Iryna Gurevych, Roy Schwartz; Efficient Methods for Natural Language Processing: A Survey. Transactions of the Association for Computational Linguistics 2023; 11: 826–860. https://doi.org/10.1162/tacl_a_00577

You can read the paper here.

Abstract: Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows. Such resources include data, time, storage, or energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim to provide both guidance for conducting NLP under limited resources, and point towards promising research directions for developing more efficient methods.


This entry was posted on July 12, 2023.