WordWeb is designed to visualize word vectors for multiple languages. Over the last five years, the deep learning community has developed effective unsupervised techniques for word representation. In 2016, Facebook introduced another, called fastText1), and subsequently published pre-trained word vectors for 294 languages [here]. The mission of this simple web page is to make those word vectors accessible not only to researchers but also to the public. You can see which words are related to a given word in any language in a 2-D projected space. For the projection we used t-SNE2), a well-known dimensionality reduction algorithm.
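The "related words" lookup described above can be sketched as a nearest-neighbor search by cosine similarity over the word vectors. The snippet below is a minimal illustration, not WordWeb's actual implementation: the tiny 3-d vectors are invented for the example (real fastText vectors are 300-d), and the t-SNE projection to 2-D is omitted for brevity.

```python
import math

# Toy word vectors for illustration only; in practice these would be
# loaded from the pre-trained fastText files.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.2],
    "apple": [0.1, 0.2, 0.9],
    "pear":  [0.2, 0.1, 0.8],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def neighbors(word, k=2):
    """Return the k words most similar to `word`, best first."""
    scores = [(w, cosine(vectors[word], v))
              for w, v in vectors.items() if w != word]
    return [w for w, _ in sorted(scores, key=lambda s: -s[1])[:k]]

print(neighbors("king"))  # "queen" ranks first among these toy vectors
```

On the site, the top neighbors found this way are then laid out in the plane with t-SNE so that similar words appear close together.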
We thank Facebook AI Research for kindly releasing the fastText library under the BSD-3-Clause license. The visualization web interface is Kakao Brain’s software, built on top of fastText.
1) P. Bojanowski*, E. Grave*, A. Joulin, T. Mikolov. Enriching Word Vectors with Subword Information. Transactions of the Association for Computational Linguistics 5:135-146, 2017.
2) L.J.P. van der Maaten. Accelerating t-SNE using Tree-Based Algorithms. Journal of Machine Learning Research 15(Oct):3221-3245, 2014.
Open Source