07/18/2016 – Sentiments!

For someone like me who’s always been interested in practical machine learning, it was a real delight to discover Word2vec, a shallow neural network that takes in raw text and outputs numerical vectors. It maps each word to a vector of numbers that can be used to predict which words are likely to appear nearby, and because words used in similar contexts end up with similar vectors, relationships between words can be explored with simple vector arithmetic. In other words, the method never needs an explicit definition of any word, and with enough data it can surface relationships between words that the average reader would miss.
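To make that concrete, here’s a minimal sketch (not from the original post) of training Word2vec with the gensim library, assuming gensim 4.x; the toy corpus and parameters are purely illustrative.

```python
# A minimal Word2vec sketch with gensim (assumes gensim 4.x is installed).
from gensim.models import Word2Vec

# Each "sentence" is just a list of tokens; a real corpus would be far larger.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "dog"],
]

# Map every word in the vocabulary to a 50-dimensional vector.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=100)

# Words that appear in similar contexts end up with similar vectors,
# so we can ask for nearest neighbours without any dictionary definitions.
print(model.wv.most_similar("cat", topn=3))

# Vector arithmetic over the learned embeddings (only meaningful on a much
# bigger corpus, e.g. king - man + woman ≈ queen):
# print(model.wv.most_similar(positive=["king", "woman"], negative=["man"]))
```

With a tiny corpus like this the neighbours are mostly noise; the point is just the shape of the workflow, not the quality of the vectors.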

Of course, that’s not all Word2vec can do. Word2vec is really two distinct model architectures, continuous bag-of-words (CBOW) and skip-gram, and its applications go well beyond simply predicting which words appear next to each other.
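For what it’s worth, gensim exposes the two architectures through a single flag; this is just a sketch with an illustrative corpus, not anything from the post.

```python
# Sketch only: selecting between the two Word2vec architectures in gensim.
from gensim.models import Word2Vec

toy_corpus = [["word", "vectors", "are", "fun"], ["vectors", "capture", "context"]]

cbow = Word2Vec(toy_corpus, vector_size=50, min_count=1, sg=0)       # CBOW: predict a word from its context
skip_gram = Word2Vec(toy_corpus, vector_size=50, min_count=1, sg=1)  # skip-gram: predict the context from a word
```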

If you’re interested in learning more about Word2vec and its intricacies, check out:

  1. This document published by an Israeli computer science PhD (with a link to the PDF).
  2. A publication by the guys at GOOG on sentence and phrase compositions.