Word Embeddings: Modeling Word Context and Similarity
Related lectures (29)
Optimization without Constraints: Gradient Method
Covers unconstrained optimization using the gradient method to find a function's minimum.
Optimization Methods: Theory Discussion
Explores optimization methods, including unconstrained problems, linear programming, and heuristic approaches.
Deep Learning for NLP
Introduces deep learning concepts for NLP, covering word embeddings, RNNs, and Transformers, emphasizing self-attention and multi-head attention.
Proximal Gradient Descent: Optimization Techniques in Machine Learning
Discusses proximal gradient descent and its applications in optimizing machine learning algorithms.
Deep Learning for NLP
Delves into Deep Learning for Natural Language Processing, exploring Neural Word Embeddings, Recurrent Neural Networks, and Attentive Neural Modeling with Transformers.
Gradient Descent
Covers the concept of gradient descent, an iterative algorithm for finding a local minimum of a function.
Optimization with Constraints: KKT Conditions
Covers the KKT conditions for optimization with constraints, essential for solving constrained optimization problems efficiently.
Latent Semantic Indexing
Covers Latent Semantic Indexing, word embeddings, and the skip-gram model with negative sampling.
Word Embeddings: GloVe and Semantic Relationships
Explores word embeddings, the GloVe model, semantic relationships, subword embeddings, and syntactic relationships.