Learning Word Representation in Compositional Matrix-Space Models

Talk by Shima Asaadi
Learning word representations that capture the semantics and compositionality of language has received much research interest in natural language processing. Beyond the popular vector space models of word representation and compositionality, Compositional Matrix-Space Models (CMSMs) have been proposed. In this talk, I introduce the principal idea of CMSMs and their application to NLP tasks. I then address the problem of learning matrix representations of words in CMSMs for the task of fine-grained sentiment analysis. I show that our approach, which learns the word matrices gradually in two steps, outperforms other CMSM approaches in terms of both quality and computational cost.
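
To make the compositional principle concrete, the following is a minimal sketch (not the speaker's implementation) of how a CMSM composes a phrase: each word is assigned a d x d matrix, the representation of a word sequence is the ordered product of its word matrices, and a scalar prediction such as a sentiment score can be read off via two fixed vectors. The vocabulary, dimension, random initialization, and readout vectors below are illustrative assumptions; in practice the word matrices are learned from data.

import numpy as np

d = 4  # dimensionality of the matrix space (illustrative choice)
rng = np.random.default_rng(0)

# Hypothetical vocabulary: each word is mapped to a d x d matrix.
# Here the matrices are random placeholders near the identity.
word_matrices = {
    w: np.eye(d) + 0.1 * rng.standard_normal((d, d))
    for w in ["not", "very", "good", "bad"]
}

def compose(phrase):
    """Compose a phrase by multiplying its word matrices in order.

    Matrix multiplication is associative but not commutative, so
    word order matters -- the key property of CMSMs.
    """
    result = np.eye(d)
    for word in phrase.split():
        result = result @ word_matrices[word]
    return result

# Illustrative scalar readout: project the phrase matrix onto fixed
# vectors, e.g. to obtain a fine-grained sentiment score.
alpha = np.ones(d) / d
beta = np.ones(d) / d

for phrase in ["very good", "not very good"]:
    score = alpha @ compose(phrase) @ beta
    print(f"{phrase!r}: {score:.3f}")

Because the product is order-sensitive, "not very good" and "very good" receive different phrase matrices and hence different scores, which is what makes matrix-space models attractive for compositional sentiment analysis.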