Lecture: Model Analysis
Related lectures (32)
Introduction to Natural Language Processing
Covers the basics of Natural Language Processing, from traditional to modern approaches, highlighting the challenges and importance of studying both methods.
Neural Networks: Two Layers Neural Network
Covers the basics of neural networks, focusing on the progression from two-layer networks to deep neural networks.
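The two-layer network covered in this lecture can be sketched in a few lines of plain Python. This is a hypothetical minimal example, not material from the lecture itself: one hidden unit with a sigmoid activation and a linear output, trained by hand-derived gradient descent on a single input/target pair.

```python
import math

# Minimal two-layer network sketch (illustrative, not from the lecture):
# hidden layer: h = sigmoid(w1 * x); output layer: y = w2 * h.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w1, w2):
    h = sigmoid(w1 * x)  # hidden activation
    y = w2 * h           # linear output
    return h, y

def train_step(x, target, w1, w2, lr=0.5):
    # Squared-error loss L = 0.5 * (y - target)^2, gradients via the chain rule.
    h, y = forward(x, w1, w2)
    err = y - target                       # dL/dy
    grad_w2 = err * h                      # dL/dw2
    grad_w1 = err * w2 * h * (1 - h) * x   # dL/dw1 through the sigmoid
    return w1 - lr * grad_w1, w2 - lr * grad_w2

w1, w2 = 0.5, -0.3
for _ in range(200):
    w1, w2 = train_step(1.0, 0.8, w1, w2)

_, y = forward(1.0, w1, w2)
print(y)  # the output approaches the target 0.8
```

The same chain-rule bookkeeping, vectorized over weight matrices, is what backpropagation automates in deep networks.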
Transformer Architecture: Subquadratic Attention Mechanisms
Covers transformer architecture, focusing on encoder-decoder models and subquadratic attention mechanisms for efficient processing of input sequences.
Pre-Training: BiLSTM and Transformer
Delves into pre-training BiLSTM and Transformer models for NLP tasks, showcasing their effectiveness and applications.
Neural Networks Optimization
Explores neural networks optimization, including backpropagation, batch normalization, weight initialization, and hyperparameter search strategies.
Transformers: Unifying Machine Learning Communities
Covers the role of Transformers in unifying various machine learning fields.
Compositional Representations and Systematic Generalization
Examines systematicity, compositionality, neural network challenges, and unsupervised learning in NLP.
Neural Networks: Training and Activation
Explores neural networks, activation functions, backpropagation, and PyTorch implementation.
Introduction to Modern Natural Language Processing
Introduces the course on Modern Natural Language Processing, covering its significance, applications, challenges, and advancements in technology.
Deep Learning: Convolutional Neural Networks
Covers Convolutional Neural Networks, standard architectures, training techniques, and adversarial examples in deep learning.
Deep Generative Models: Part 2
Explores deep generative models, including mixtures of multinomials, PCA, deep autoencoders, convolutional autoencoders, and GANs.
Neural Word Embeddings: Learning Representations for Natural Language
Covers neural word embeddings and methods for learning word representations in natural language processing.