Learning Word Representation in Compositional Matrix-Space Models
Talk by Shima Asaadi
- Location: APB 3027
- Start: 12 April 2018 at 1:00 pm
- End: 12 April 2018 at 2:30 pm
- Research group: Computational Logic
- Event series: KBS Seminar
Learning word representations that capture the semantics and compositionality of language has received much research interest in natural language processing. Beyond the popular vector space models for word representation and compositionality, Compositional Matrix-Space Models (CMSMs) have been proposed. In this talk, I introduce the principal idea of CMSMs and their application in NLP tasks. I then address the problem of learning matrix representations of words in CMSMs for the task of fine-grained sentiment analysis. I show that our approach, which learns the word matrices gradually in two steps, outperforms other CMSM-based approaches in terms of quality and computational cost.
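To illustrate the compositional principle behind CMSMs, the following minimal Python sketch represents each word as a square matrix and composes a phrase by matrix multiplication, reading off a scalar sentiment score via two projection vectors. The dimensionality, the random placeholder matrices, and the projection-based scoring are assumptions for illustration only, not the learning procedure presented in the talk.

```python
import numpy as np

# Sketch of composition in a Compositional Matrix-Space Model (CMSM):
# each word is a d x d matrix; a phrase is the ordered product of its
# word matrices. Dimensions and matrices below are illustrative placeholders.

d = 3  # dimensionality of the matrix space (assumption for this sketch)
rng = np.random.default_rng(0)

# Hypothetical "learned" word matrices (random stand-ins here).
word_matrices = {
    "not": rng.standard_normal((d, d)),
    "very": rng.standard_normal((d, d)),
    "good": rng.standard_normal((d, d)),
}

def compose(phrase):
    """Compose a phrase by multiplying its word matrices left to right."""
    result = np.eye(d)
    for word in phrase:
        result = result @ word_matrices[word]
    return result

# One possible way to obtain a scalar sentiment score from the composed
# matrix: project it with fixed vectors (an assumption, not the talk's model).
alpha = np.zeros(d); alpha[0] = 1.0
beta = np.zeros(d); beta[-1] = 1.0

phrase_matrix = compose(["not", "very", "good"])
score = alpha @ phrase_matrix @ beta
print(f"sentiment score: {score:.3f}")
```

Because matrix multiplication is associative but not commutative, the composed representation is sensitive to word order, which is one of the motivations for CMSMs over simple additive vector composition.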