CS-456: Deep reinforcement learning
Lectures in this course (96)
The statistical view: generative models
Explores interpreting neural network output as a probability through statistical generative models.
The Likelihood of Data under a Model
Explores the likelihood of data under a model and the concept of maximum likelihood.
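A minimal sketch of the connection this lecture develops (toy numbers, illustrative only, not course code): for binary targets under a Bernoulli output model, the cross-entropy error is exactly the negative log-likelihood of the data, so maximizing the likelihood and minimizing cross-entropy coincide.

```python
import math

def log_likelihood(ys, ts):
    # Log-likelihood of binary targets ts under Bernoulli outputs ys:
    # sum over samples of t*log(y) + (1-t)*log(1-y).
    return sum(t * math.log(y) + (1 - t) * math.log(1 - y)
               for y, t in zip(ys, ts))

def cross_entropy(ys, ts):
    # The cross-entropy error is the negative log-likelihood.
    return -log_likelihood(ys, ts)

ys = [0.9, 0.2, 0.7]   # model outputs interpreted as P(t = 1)
ts = [1, 0, 1]         # observed binary targets
assert abs(cross_entropy(ys, ts) + log_likelihood(ys, ts)) < 1e-12
```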
Interpreting Output as Probability
Delves into interpreting neural network output as probabilities based on the cross-entropy error function.
Statistical Interpretation of Artificial Neural Networks
Delves into the statistical interpretation of artificial neural networks, exploring the likelihood of data under the model and its maximization during training.
Sigmoidal Units: Natural Output Functions
Delves into using sigmoidal units as natural output functions in deep learning, focusing on their statistical interpretation and their derivation as the optimal choice.
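One reason the sigmoid is the "natural" output function here, sketched with a toy scalar example (illustrative only, not course code): paired with the cross-entropy loss, the gradient with respect to the pre-activation collapses to the simple residual y - t, which can be checked numerically.

```python
import math

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def grad_wrt_preactivation(a, t):
    # With y = sigmoid(a) and cross-entropy loss
    # E = -(t*log(y) + (1-t)*log(1-y)),
    # dE/da simplifies to the residual y - t.
    return sigmoid(a) - t

def loss(a, t):
    y = sigmoid(a)
    return -(t * math.log(y) + (1 - t) * math.log(1 - y))

# Numerical check of the analytic gradient by central differences:
a, t, eps = 0.3, 1.0, 1e-6
num = (loss(a + eps, t) - loss(a - eps, t)) / (2 * eps)
assert abs(num - grad_wrt_preactivation(a, t)) < 1e-6
```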
Multi-class Problems: One-hot Coding
Explores one-hot coding for exclusive classes and the cross-entropy loss function in multi-class problems.
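A small sketch of the setup this lecture covers (hypothetical logits, illustrative only): with mutually exclusive classes encoded one-hot and a softmax output, the cross-entropy loss reduces to minus the log-probability assigned to the true class.

```python
import math

def softmax(zs):
    m = max(zs)  # shift logits for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy_onehot(zs, onehot):
    # With one-hot targets, only the true class contributes:
    # the loss is -log of its softmax probability.
    ps = softmax(zs)
    return -sum(t * math.log(p) for t, p in zip(onehot, ps))

zs = [2.0, 0.5, -1.0]   # logits for 3 exclusive classes
onehot = [1, 0, 0]      # class 0 is the true class
assert abs(cross_entropy_onehot(zs, onehot)
           - (-math.log(softmax(zs)[0]))) < 1e-12
```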
Statistical Approach: Summary and Quiz
Discusses statistical view of neural networks, classification tasks, and cross-entropy loss functions.
Inductive Bias in Machine Learning
Explores the concept of inductive bias in machine learning, emphasizing the role of prior knowledge in designing effective neural networks.
Convolutional filters as inductive bias for images
Delves into convolutional filters as an inductive bias for images in neural networks, emphasizing invariance to translation and local feature detectors.
MaxPooling as inductive bias for images
Explores how MaxPooling enforces an inductive bias towards local translation invariance in convolutional neural networks.
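The local translation invariance mentioned here can be seen in a 1-D toy example (illustrative only, not course code): shifting a feature within its pooling window leaves the pooled output unchanged.

```python
def maxpool1d(xs, size=2):
    # Non-overlapping 1-D max pooling over windows of the given size.
    return [max(xs[i:i + size]) for i in range(0, len(xs), size)]

# Shifting each feature by one position within its pooling window
# leaves the pooled output unchanged (local translation invariance).
a = [0, 7, 0, 3, 0, 0]
b = [7, 0, 3, 0, 0, 0]   # same features, shifted by one position
assert maxpool1d(a) == maxpool1d(b) == [7, 3, 0]
```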
Convolutional Layers: Quiz
Presents a quiz on convolutional layers, exploring weights, invariance, and pooling methods.
Convolutional Layer: The Gradient
Explores the optimization of filters in a convolutional layer and the backpropagation process for MaxPooling.
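The MaxPooling part of this lecture's backpropagation story can be sketched in a few lines (illustrative only, not course code): in the backward pass, the upstream gradient is routed entirely to the input position that attained the maximum, and all other positions receive zero.

```python
def maxpool_backward(window, upstream_grad):
    # Max pooling is a selection in the forward pass, so its
    # backward pass routes the full upstream gradient to the
    # arg-max position and zero everywhere else.
    k = window.index(max(window))
    return [upstream_grad if i == k else 0.0
            for i in range(len(window))]

assert maxpool_backward([1.0, 4.0, 2.0], 0.5) == [0.0, 0.5, 0.0]
```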
Automatic Differentiation: BackProp revisited
Discusses automatic differentiation, emphasizing reverse mode differentiation for optimizing convolutional layer filters by gradient descent.
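A minimal reverse-mode automatic differentiation sketch on scalars (a toy illustration of the mechanism, not the course's implementation): each node records its parents with local gradients, and a reverse sweep accumulates exact derivatives, which is what makes gradient descent on filter weights practical.

```python
class Node:
    # Minimal scalar reverse-mode automatic differentiation.
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # (parent_node, local_gradient) pairs
        self.grad = 0.0

    def __mul__(self, other):
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Node(self.value + other.value,
                    [(self, 1.0), (other, 1.0)])

    def backward(self, upstream=1.0):
        # Reverse sweep: accumulate gradients from output to inputs.
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

x = Node(3.0)
y = Node(2.0)
z = x * y + x            # dz/dx = y + 1 = 3, dz/dy = x = 3
z.backward()
assert x.grad == 3.0 and y.grad == 3.0
```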
Reducing the number of parameters: Outer-product representation
Covers reducing parameters in convolutional filters through outer-product representation.
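The parameter saving behind the outer-product representation can be checked on a toy 3x3 patch (illustrative only, not course code): a rank-1 filter K = u v^T stores 2k numbers instead of k*k, and its 2-D correlation with a patch factors into two cheaper 1-D passes.

```python
def outer(u, v):
    # Rank-1 (outer-product) filter: K[i][j] = u[i] * v[j],
    # parameterized by 2k numbers instead of k*k.
    return [[ui * vj for vj in v] for ui in u]

u = [1.0, 2.0, 3.0]
v = [0.5, -1.0, 2.0]
K = outer(u, v)

# Correlating a patch with a separable filter factors into 1-D passes:
# sum_ij K[i][j]*patch[i][j] == sum_i u[i] * (sum_j v[j]*patch[i][j])
patch = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
full = sum(K[i][j] * patch[i][j] for i in range(3) for j in range(3))
factored = sum(u[i] * sum(v[j] * patch[i][j] for j in range(3))
               for i in range(3))
assert abs(full - factored) < 1e-12
```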
Modern Convolutional Networks and Image Recognition
Explores the evolution of deep convolutional networks and their impact on image recognition accuracy.
Convolutional Networks: Applications beyond Object Recognition
Delves into the applications of convolutional networks beyond object recognition, emphasizing their impact on neuroscience, brain sciences, and art.