Allyson Ettinger

Research

My ongoing research consists of two primary threads:

Analysis and evaluation of NLP models and representations

The first thread of research leverages insights and methodologies from cognitive science, linguistics, and neuroscience to better understand what information is captured by natural language processing (NLP) models. The goal of this work is to increase the transparency of these systems and to improve our capacity to evaluate "language understanding" of the kind that humans possess. The longer-term goal is to improve the robustness and efficiency of models' language acquisition.

Computational psycholinguistic modeling

The second thread of research employs computational cognitive modeling, drawing on psycholinguistic theories as well as computational tools from NLP, to investigate the mechanisms underlying human word and sentence comprehension. To date, I have focused primarily on modeling neural and behavioral responses associated with semantic processing at the word and sentence level.



Publications

Yu, L., Ettinger, A. (2020). Assessing Phrasal Representation and Composition in Transformers. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. [PDF]

Misra, K., Ettinger, A., Taylor Rayz, J. (2020). Exploring BERT's Sensitivity to Lexical Cues using Tests from Semantic Priming. Findings of the Association for Computational Linguistics: EMNLP 2020. [PDF]

Toshniwal, S., Wiseman, S., Ettinger, A., Gimpel, K., Livescu, K. (2020). Learning to Ignore: Long Document Coreference with Bounded Memory Neural Networks. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. [PDF]

Klafka, J., Ettinger, A. (2020). Spying on your neighbors: Fine-grained probing of contextual embeddings for information about surrounding words. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. [PDF] [Probing datasets and code]

Toshniwal, S., Ettinger, A., Gimpel, K., Livescu, K. (2020). PeTra: A Sparsely Supervised Memory Model for People Tracking. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. [PDF] [Colab notebook]

Ettinger, A. (2020). What BERT is not: Lessons from a new suite of psycholinguistic diagnostics for language models. Transactions of the Association for Computational Linguistics. [PDF] [Diagnostic tests and code]

Ettinger, A., Elgohary, A., Phillips, C., Resnik, P. (2018). Assessing Composition in Sentence Vector Representations. Proceedings of the 27th International Conference on Computational Linguistics. [PDF] [Supplementary material]

Ettinger, A., Rao, S., Daumé III, H., Bender, E. M. (2017). Towards Linguistically Generalizable NLP Systems: A Workshop and Shared Task. Proceedings of the First Workshop on Building Linguistically Generalizable NLP Systems. [PDF]

Ettinger, A., Elgohary, A., Resnik, P. (2016). Probing for semantic evidence of composition by means of simple classification tasks. Proceedings of the First Workshop on Evaluating Vector Space Representations for NLP, ACL 2016. Recipient of Best Proposal Award. [PDF]

Ettinger, A., Feldman, N.H., Resnik, P., Phillips, C. (2016). Modeling N400 amplitude using vector space models of word representation. Proceedings of the 38th Annual Conference of the Cognitive Science Society. [PDF]

Ettinger, A., Linzen, T. (2016). Evaluating vector space models using human semantic priming results. Proceedings of the First Workshop on Evaluating Vector Space Representations for NLP, ACL 2016. [PDF]

Ettinger, A., Resnik, P., Carpuat, M. (2016). Retrofitting sense-specific word vectors using parallel text. Proceedings of NAACL HLT 2016. [PDF] [Supplementary material]

Rao, S., Ettinger, A., Daumé III, H., Resnik, P. (2015). Dialogue focus tracking for zero pronoun resolution. Proceedings of NAACL HLT 2015. [PDF]

Ettinger, A., Linzen, T., Marantz, A. (2014). The role of morphology in phoneme prediction: Evidence from MEG. Brain and Language 129, 14-23. [PDF]

Ettinger, A., Malamud, S. (2014). Mandarin utterance-final particle ba in the conversational scoreboard. Proceedings of Sinn und Bedeutung. [PDF]

Conference Presentations

Ettinger, A., Elgohary, A., Resnik, P. (2016). Probing for semantic evidence of composition by means of simple classification tasks. Talk presented at the First Workshop on Evaluating Vector Space Representations for NLP, ACL 2016, Berlin, Germany.

Ettinger, A., Linzen, T. (2016). Evaluating vector space models using human semantic priming results. Poster presented at the First Workshop on Evaluating Vector Space Representations for NLP, ACL 2016, Berlin, Germany.

Ettinger, A., Feldman, N.H., Resnik, P., Phillips, C. (2016). Modeling N400 amplitude using vector space models of word representation. Poster presented at Annual Conference of the Cognitive Science Society, Philadelphia, PA.

Ettinger, A., Resnik, P., Carpuat, M. (2016). Retrofitting sense-specific word vectors using parallel text. Talk presented at NAACL HLT 2016, San Diego, CA.

Rao, S., Ettinger, A., Daumé III, H., Resnik, P. (2015). Dialogue focus tracking for zero pronoun resolution. Poster presented at NAACL HLT 2015, Denver, CO.

Ettinger, A., Malamud, S. (2013). Mandarin utterance-final particle ba in the conversational scoreboard. Talk presented at the Linguistic Society of America Annual Meeting, Boston, MA, and the 19th International Congress of Linguists, Geneva, Switzerland.

Ettinger, A., Linzen, T., Marantz, A. (2013). The role of morphological structure in phoneme prediction: Evidence from MEG. Poster presented at the Cognitive Neuroscience Society Annual Meeting and the CUNY Conference on Human Sentence Processing.

Ettinger, A., Malamud, S. (2012). Mandarin utterance-final particle ba and conversational goals. Talk presented at the NYU Semantics Discussion Group Meeting, New York, NY.

Ettinger, A., Moua, M. Y., Stanford, J. (2010). Linguistic construction of gender and generations in Hmong-American communities. Talk presented at the Linguistic Society of America Annual Meeting, Baltimore, MD.