
SubGram: Extending Skip-gram Word Representation with Substrings

Publication at Faculty of Mathematics and Physics |
2016

Abstract

Skip-gram (word2vec) is a recent method for creating vector representations of words ("distributed word representations") using a neural network. The representation has gained popularity in various areas of natural language processing because it appears to capture syntactic and semantic information about words without any explicit supervision in this respect.
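To make the training setup concrete, a minimal sketch of how Skip-gram derives its training examples: each word is paired with the words in a small symmetric context window around it. The window size and example sentence here are illustrative choices, not taken from the paper.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) training pairs within a symmetric window."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the cat sat on the mat".split()
pairs = skipgram_pairs(sentence, window=1)
```

The neural network is then trained to predict the context word from the center word, and the hidden-layer weights become the word vectors.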

We propose SubGram, a refinement of the Skip-gram model that also considers word structure during training, achieving large gains on the original Skip-gram test set.
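One common way to expose word structure to such a model is to decompose each word into its character substrings (n-grams); the sketch below shows this decomposition with boundary markers. The exact substring scheme (n-gram lengths, boundary symbols) is an illustrative assumption, not necessarily the one used by SubGram.

```python
def substrings(word, n_min=2, n_max=4):
    """Return character n-grams of the word, padded with ^ and $
    so that word-initial and word-final grams are distinguishable."""
    padded = f"^{word}$"
    grams = set()
    for n in range(n_min, n_max + 1):
        for i in range(len(padded) - n + 1):
            grams.add(padded[i:i + n])
    return grams
```

Feeding these substrings into training alongside (or instead of) the whole word lets morphologically related forms share parameters, which is the kind of structural signal the abstract refers to.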