Department of Energy, not to be confused with Department of Defense. Granted, there is a lot of nuclear weapons work at DOE, but LBNL mostly does big-science work up the hill from Berkeley.
This is an awesome explanation of those papers! Does anyone have any cool examples of word2vec being used in a project? I'd be interested in seeing what people could make with it.
Document type classification. We wanted to predict which of k classes a new text document belonged to.
We trained 100-dim word vectors on all the text content we currently have, plus some 30,000 wiki articles related to the business. When new content comes in, we convert the words to vectors, average them, and use the resulting vector as the input to a basic classifier (rough sketch below).
For how simple that is, the method is unreasonably good. Widely applicable too.
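A minimal sketch of that averaged-word-vector approach, assuming gensim and scikit-learn; the toy corpus, labels, and library choices here are my own assumptions, not the commenter's actual pipeline:

```python
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

# Toy corpus: each document is a list of tokens with a class label.
docs = [["invoice", "payment", "due"], ["meeting", "agenda", "notes"]]
labels = ["billing", "minutes"]

# Train 100-dim word vectors on the tokenized text (gensim 4.x API).
w2v = Word2Vec(sentences=docs, vector_size=100, window=5, min_count=1)

def doc_vector(tokens, model, dim=100):
    """Average the vectors of the tokens that are in the vocabulary."""
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

# Averaged document vectors become the features for a basic classifier.
X = np.vstack([doc_vector(d, w2v) for d in docs])
clf = LogisticRegression().fit(X, labels)

# New content: convert words to vectors, average, classify.
new_doc = ["payment", "overdue"]
print(clf.predict([doc_vector(new_doc, w2v)]))
```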
For anyone looking for a simple JavaScript explorable explanation of this that you can quickly download and run in a browser, I just found the following GitHub project.
Word vectors are great. We've also written about them at length.[0] But anyone interested in word vectors should also be looking at newer ways of applying neural nets to text. Specifically, convolutional nets with max-pooling over time are producing great results for clustering and classification.
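For reference, here is a minimal sketch of that kind of architecture (convolve over word vectors, then max-pool over time); this uses PyTorch and made-up sizes as assumptions, not the commenter's actual model:

```python
import torch
import torch.nn as nn

class ConvTextClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, num_filters=64,
                 kernel_size=3, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)  # word vectors
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size)
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):                  # (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        x = torch.relu(self.conv(x))               # convolve over time
        x = x.max(dim=2).values                    # max-pool over time
        return self.fc(x)                          # class scores

# Example: score a batch of two 7-token documents.
logits = ConvTextClassifier()(torch.randint(0, 10000, (2, 7)))
print(logits.shape)  # torch.Size([2, 5])
```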