Publication details
Part of: Proceedings of the 21st Nordic Conference on Computational Linguistics (NoDaLiDa) (Linköping University Electronic Press, 2017)
Pages: 284–288
Year: 2017
Links:
FULL TEXT: http://www.ep.liu.se/ecp/131/039/ecp17131039.pdf
ARCHIVE: http://hdl.handle.net/10852/65204
DESCRIPTION: http://publications.nr.no/1509090148/Redefining-Context-Windows-for-Word-Embedding-Models_Lison.pdf
DATA: http://www.ep.liu.se/ecp/article.asp?issue=131&article=039&volume=
Distributional semantic models learn vector representations of words through the contexts they occur in. Although the choice of context (which often takes the form of a sliding window) has a direct influence on the resulting embeddings, the exact role of this model component is still not fully understood. This paper presents a systematic analysis of context windows based on a set of four distinct hyperparameters. We train continuous Skip-Gram models on two English-language corpora for various combinations of these hyperparameters, and evaluate them on both lexical similarity and analogy tasks. Notable experimental results are the positive impact of cross-sentential contexts and the surprisingly good performance of right-context windows.
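
For readers who want to reproduce the basic setup, below is a minimal sketch (an assumed setup, not the authors' released code) of training a continuous Skip-Gram model with the gensim library while varying the maximum window size, one of the context-window hyperparameters. Note that gensim's built-in window is symmetric and stops at sentence boundaries; the right-context and cross-sentential variants examined in the paper would require a modified training loop or corpus preprocessing (e.g. concatenating adjacent sentences to allow cross-sentential contexts).

# Minimal sketch: Skip-Gram training with varying maximum window size.
from gensim.models import Word2Vec

# Toy corpus; the paper trains on two English-language corpora.
sentences = [
    ["distributional", "semantic", "models", "learn", "vector",
     "representations", "of", "words"],
    ["the", "choice", "of", "context", "window", "influences",
     "the", "resulting", "embeddings"],
]

for window_size in (2, 5, 10):
    model = Word2Vec(
        sentences,
        sg=1,                # continuous Skip-Gram (sg=0 would be CBOW)
        vector_size=100,     # embedding dimensionality
        window=window_size,  # maximum distance to context words
        min_count=1,         # keep all words in this toy corpus
        epochs=5,
    )
    # Inspect one word's embedding under this window setting.
    print(window_size, model.wv["context"][:3])

Each trained model could then be scored on lexical similarity and analogy benchmarks as in the paper; gensim provides evaluate_word_pairs and evaluate_word_analogies helpers on the resulting word vectors for such comparisons.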