Explores word embeddings, including models such as CBOW, Skip-gram, fastText, and GloVe, subword embeddings, and their applications in document search and classification.
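As a small illustration of the Skip-gram setup mentioned above, the sketch below generates the (center, context) training pairs that the model is trained on; the function name and window size are illustrative, not taken from any particular library.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs within a symmetric window around each token."""
    pairs = []
    for i, center in enumerate(tokens):
        # Context positions: up to `window` tokens on either side of position i.
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

# Each pair becomes one training example: predict the context word from the center word.
print(skipgram_pairs(["the", "cat", "sat"], window=1))
```

CBOW simply reverses the direction of prediction: the context words jointly predict the center word, so the same pair generation applies with the roles swapped.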
Explores computing the density of states and performing Bayesian inference with importance sampling, showcasing the lower variance and parallelizability of the proposed method.
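To make the importance-sampling idea concrete, here is a minimal generic sketch (not the specific method the summary refers to): to estimate an expectation under a target density p, draw samples from a proposal q and reweight each sample by p/q. The distributions and sample count below are illustrative assumptions.

```python
import math
import random

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_estimate(f, n=100_000, seed=0):
    """Estimate E_p[f(X)] for X ~ N(0, 1) by sampling from a wider proposal N(0, 4)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, 2.0)                          # sample from proposal q
        w = normal_pdf(x) / normal_pdf(x, 0.0, 2.0)      # importance weight p(x)/q(x)
        total += w * f(x)
    return total / n

# E_p[X^2] = 1 for a standard normal; the estimate should be close to that.
print(importance_estimate(lambda x: x * x))
```

Because each weighted sample is computed independently, the inner loop parallelizes trivially, which is one reason importance sampling is attractive for the applications the summary mentions.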
Explores optimizing word embedding models, including loss-function minimization via gradient descent, and introduces techniques such as fastText and Byte Pair Encoding.
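The core of Byte Pair Encoding is easy to sketch: repeatedly find the most frequent adjacent symbol pair in the corpus and merge it into a single symbol. The following is a minimal, illustrative version (function names are assumptions, and real tokenizers add frequency-weighted word counts and end-of-word markers).

```python
from collections import Counter

def merge_pair(symbols, pair, merged):
    """Replace every adjacent occurrence of `pair` in `symbols` with `merged`."""
    out, i = [], 0
    while i < len(symbols):
        if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == pair:
            out.append(merged)
            i += 2
        else:
            out.append(symbols[i])
            i += 1
    return out

def learn_bpe(words, num_merges):
    """Learn a sequence of BPE merges from a list of words (each split into characters)."""
    vocab = [list(w) for w in words]
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for symbols in vocab:
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += 1
        if not pairs:
            break
        best = max(pairs, key=pairs.get)      # most frequent adjacent pair
        merges.append(best)
        vocab = [merge_pair(s, best, best[0] + best[1]) for s in vocab]
    return merges

print(learn_bpe(["low", "low", "lower"], 2))
```

fastText takes a related subword view: instead of learned merges, it represents each word as a bag of fixed-length character n-grams, so rare and unseen words still get embeddings.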