Abstract: The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require a known number of word meanings or learn them using greedy heuristic approaches. In this paper we propose the Adaptive Skip-gram model, a nonparametric Bayesian extension of Skip-gram capable of automatically learning the required number of representations for all words at the desired semantic resolution. We derive an efficient online variational learning algorithm for the model and empirically demonstrate its effectiveness on the word-sense induction task.