Automatic Extraction of Compositional Matrix-Space Models of Language

From International Center for Computational Logic
Talk by Shima Asaadi
Learning word representations in distributional semantic models that capture the semantics and compositionality of natural language is a central research area in computational linguistics. Compositional Matrix-Space Models (CMSMs) offer a novel word representation that serves as an alternative to Vector Space Models (VSMs). This talk presents results on learning CMSMs to capture semantics and compositionality in natural language processing tasks, including sentiment analysis and compositionality detection for short phrases. It then introduces a new dataset for examining compositional distributional semantic models and presents benchmark experiments that use this dataset as a testbed for evaluating semantic composition in distributional semantic models.
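The core idea behind CMSMs can be illustrated with a toy sketch: each word is represented by a square matrix rather than a vector, and a phrase is composed by multiplying the matrices of its words in order. The lexicon, dimensionality, and random matrices below are purely illustrative assumptions, not the models learned in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy CMSM: each word maps to a d x d matrix (randomly initialized here;
# in practice these matrices would be learned from data).
d = 3
lexicon = {w: rng.standard_normal((d, d)) for w in ["not", "good"]}

def compose(words):
    """Compose a phrase by multiplying its word matrices left to right."""
    m = np.eye(d)
    for w in words:
        m = m @ lexicon[w]
    return m

# Unlike additive vector-space composition, matrix multiplication is
# non-commutative, so word order affects the phrase representation.
ab = compose(["not", "good"])
ba = compose(["good", "not"])
print(np.allclose(ab, ba))
```

Because matrix multiplication is associative but not commutative, this composition is order-sensitive, which is one motivation for matrix-space models over simple additive vector composition.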