word2vec part 2: graph building and training
Posted on Tue 03 April 2018 in blog • Tagged with python, machine learning, tensorflow, nlp, prediction, word2vec
In the last post we built the data preprocessing required for word2vec training. In this post we will build the network and perform the training on the text8 dataset (source), a Wikipedia dump of ~17 million tokens.
Note that we are implementing the skip-gram version of word2vec, since it has superior performance.
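As a quick illustration of the skip-gram setup (a minimal sketch, not the post's actual code — the helper name and window size are assumptions), each center word is paired with the words in a small window around it, and these (center, context) pairs become the training examples:

```python
# Sketch of skip-gram training-pair generation: for each center word,
# emit (center, context) pairs within a fixed window (hypothetical helper).
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["the", "quick", "brown", "fox"], window=1)
# with window=1, "quick" is paired with "the" and "brown"
```

The network is then trained to predict the context word given the center word.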
Continue reading
word2vec part 1: exploration and defining data flow
Posted on Mon 02 April 2018 in blog • Tagged with python, machine learning, tensorflow, neural networks, nlp
One of the trends I've been seeing is the use of embeddings for similarity/recommendation in non-NLP domains (this talk and many others). While I've used/thought about word2vec for a while, I wanted to implement this to really get a sense of how it works and ways it can be extended to other domains. Intuitively, word2vec makes sense, and there are a lot of packages that will let you compute it without thinking much about the implementation (e.g. gensim).
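The similarity lookup that embeddings enable boils down to cosine similarity between vectors. A minimal sketch (the toy embedding values are made up for illustration, not taken from a trained model):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 2-d embeddings (hypothetical values, for illustration only).
emb = {
    "king":  [0.90, 0.10],
    "queen": [0.85, 0.20],
    "apple": [0.10, 0.95],
}

# Related words end up closer in embedding space:
# cosine(king, queen) is larger than cosine(king, apple).
```

The same idea carries over to non-NLP domains: embed items (products, songs, users) and recommend nearest neighbors by cosine distance.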
Continue reading
Exploring neural networks for text classification
Posted on Fri 17 November 2017 in blog • Tagged with python, machine learning, keras, nlp, deep learning, classification
I've been working on text classification recently. I've found keras to be a quite good high-level library and great for learning different neural network architectures. In this notebook I will examine Tweet classification using CNN and LSTM model architectures. While CNNs are widely used in Computer Vision, I saw a paper
Continue reading