Course Web Page, SoSe 2019
Word representations (word embeddings) based on distributional information are a key ingredient for state-of-the-art natural language processing applications. They represent similar words like ‘computer’ and ‘laptop’ as similar vectors in vector space. Composition models for distributional semantics extend the vector spaces by learning how to create representations for complex words (e.g. ‘apple tree’) and phrases (e.g. ‘black car’) from the representations of individual words. The course will cover several approaches for creating and composing distributional word representations.
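As a minimal sketch of the kind of composition functions discussed in the course (e.g. the additive and element-wise multiplicative models of Mitchell & Lapata 2010), the snippet below combines two toy word vectors into a phrase vector. The vectors here are made-up illustrative values, not embeddings from a trained model:

```python
import numpy as np

# Toy 4-dimensional "embeddings" (illustrative values, not from a trained model).
vectors = {
    "black": np.array([0.2, 0.9, 0.1, 0.4]),
    "car":   np.array([0.7, 0.1, 0.8, 0.3]),
}

def compose_additive(u, v):
    """Additive composition: p = u + v."""
    return u + v

def compose_multiplicative(u, v):
    """Element-wise multiplicative composition: p = u * v."""
    return u * v

def cosine(u, v):
    """Cosine similarity, the standard way to compare vectors in these models."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Phrase vector for "black car" under each composition function.
phrase_add = compose_additive(vectors["black"], vectors["car"])
phrase_mul = compose_multiplicative(vectors["black"], vectors["car"])
```

Real composition models learn more expressive functions (matrices, tensors, neural networks) instead of these fixed operations; several of the papers in the schedule below compare such alternatives.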
To register, you’ll need to complete an introductory assignment. Please register by Monday, April 29th.
Date | Materials | Discussion leader |
---|---|---|
April 23 | DS intro (slides) | Corina |
April 25 | Comp intro (slides) | Corina |
April 30 | Tomas Mikolov, Kai Chen, Greg Corrado, Jeffrey Dean. 2013. Efficient Estimation of Word Representations in Vector Space | Corina |
May 2 | Jeff Mitchell and Mirella Lapata. 2010. Composition in Distributional Models of Semantics | Corina |
May 7 | (1) Kenneth Church and Patrick Hanks. 1990. Word Association Norms, Mutual Information, and Lexicography | Alla |
May 9 | (2) Marco Baroni and Roberto Zamparelli. 2010. Nouns are vectors, adjectives are matrices: Representing adjective-noun constructions in semantic space | Eva |
May 14 | (3) Hinrich Schütze. 1992. Dimensions of Meaning | Nazanin |
May 16 | (4) Richard Socher, Christopher Manning and Andrew Ng. 2010. Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks | Nianheng |
May 21 | (5) Jeffrey Pennington, Richard Socher, Christopher Manning. 2014. GloVe: Global Vectors for Word Representation | Haemanth |
May 23 | (6) Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean. 2013. Distributed Representations of Words and Phrases and their Compositionality | Pia |
May 28 | (7) Omer Levy, Yoav Goldberg, Ido Dagan. 2015. Improving Distributional Similarity with Lessons Learned from Word Embeddings | Kai |
May 30 | no course - Ascension Day (Christi Himmelfahrt) | |
June 4 | (8) Piotr Bojanowski, Edouard Grave, Armand Joulin, Tomas Mikolov. 2017. Enriching Word Vectors with Subword Information | Stanislav |
June 6 | Project discussion | Corina |
June 11 | no course - Pentecost break (Pfingstpause) | |
June 13 | no course - Pentecost break (Pfingstpause) | |
June 18 | (9) Richard Socher, Brody Huval, Christopher Manning and Andrew Ng. 2012. Semantic Compositionality through Recursive Matrix-Vector Spaces | Julia |
June 20 | no course - Corpus Christi (Fronleichnam) | |
June 25 | (11) Manaal Faruqui, Jesse Dodge, Sujay Kumar Jauhar, Chris Dyer, Eduard Hovy, Noah A. Smith. 2015. Retrofitting Word Vectors to Semantic Lexicons | Van |
June 27 | (12) Rémi Lebret and Ronan Collobert. 2015. “The Sum of Its Parts”: Joint Learning of Word and Phrase Representations With Autoencoders | Himanshu |
July 2 | In-class practical assignment | |
July 4 | Karl Moritz Hermann and Phil Blunsom. 2013. The Role of Syntax in Vector Space Models of Compositional Semantics | Corina |
July 9 | In-class practical assignment - continued | |
July 11 | Corina Dima, Daniël de Kok, Neele Witte, Erhard Hinrichs. 2019. No Word is an Island: A Transformation Weighting Model for Semantic Composition | Corina |
July 16 | 10 min progress reports (Projects 5, 6, 7, 9, 11, 12, 13, KG) | |
July 18 | 10 min progress reports (Projects 8, 2, 17, 18, 19, 20, SP) | |
July 23 | In-class practical assignment | |
July 25 | In-class practical assignment | |