Compositional matrix-space models of language: Definitions, properties, and learning methods



Shima Asaadi, Eugenie Giesbrecht, Sebastian Rudolph
Natural Language Engineering, 29(1):1-49, January 2023
  • Abstract
    We give an in-depth account of compositional matrix-space models (CMSMs), a type of generic model for natural language, wherein compositionality is realized via matrix multiplication. We argue for the structural plausibility of this model and show that it is able to cover and combine various common compositional natural language processing approaches. Then, we consider efficient task-specific learning methods for training CMSMs and evaluate their performance in compositionality prediction and sentiment analysis.
  • Further Information: Link
  • Projects: QuantLA, ScaDS.AI
  • Research Group: Computational Logic
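To illustrate the core idea from the abstract, the toy sketch below assigns each word a square matrix and composes a phrase by multiplying the word matrices in order. The matrices are made-up illustrative values, not the learned models from the paper; only the composition mechanism (matrix multiplication) follows the CMSM definition.

```python
import numpy as np

# Toy word-to-matrix assignment (illustrative values only).
word_matrices = {
    "not":  np.array([[-1.0, 0.0], [0.0, 1.0]]),
    "very": np.array([[2.0, 0.0], [0.0, 1.0]]),
    "good": np.array([[0.8, 0.0], [0.1, 1.0]]),
}

def compose(phrase):
    """Meaning of a phrase = ordered product of its word matrices."""
    result = np.eye(2)
    for word in phrase.split():
        result = result @ word_matrices[word]
    return result

# Matrix multiplication is associative but generally not commutative,
# so CMSM composition is sensitive to word order.
m_phrase = compose("not very good")
```

Because the product is order-sensitive, `compose("not good")` and `compose("good not")` generally yield different matrices, which is the structural argument the paper makes for matrices over additive vector composition.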
@article{AGR2023,
  author    = {Shima Asaadi and Eugenie Giesbrecht and Sebastian Rudolph},
  title     = {Compositional matrix-space models of language: Definitions,
               properties, and learning methods},
  journal   = {Natural Language Engineering},
  volume    = {29},
  number    = {1},
  publisher = {Cambridge University Press},
  year      = {2023},
  month     = {January},
  pages     = {1-49},
  doi       = {10.1017/S1351324921000206}
}