Convergence Analysis: Stochastic Gradient Algorithms
Related lectures (29)
Deep Learning: Data Representations and Neural Networks
Covers data representations, Bag of Words, histograms, data pre-processing, and neural networks.
Introduction to Learning by Stochastic Gradient Descent: Simple Perceptron
Covers the derivation of the stochastic gradient descent formula for a simple perceptron and explores the geometric interpretation of classification.
Deep Learning: Multilayer Perceptron and Training
Covers deep learning fundamentals, focusing on multilayer perceptrons and their training processes.
Sampling Distributions: Theory and Applications
Explores sampling distributions, estimators' properties, and statistical measures for data science applications.
Optimization Methods in Machine Learning
Explores optimization methods in machine learning, emphasizing gradients, costs, and computational efforts for efficient model training.
Linear Regression Basics
Covers the basics of linear regression in machine learning, including model training, loss functions, and evaluation metrics.
Gradient Descent: Linear Regression
Covers the concept of gradient descent for linear regression, explaining the iterative process of updating parameters.
Gradient Descent: Optimization Techniques
Explores gradient descent, loss functions, and optimization techniques in neural network training.
Feed-forward Networks
Introduces feed-forward networks, covering neural network structure, training, activation functions, and optimization, with applications in forecasting and finance.
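Several of the lectures above center on (stochastic) gradient descent for linear regression and its iterative parameter updates. As a minimal sketch of the idea, here is one-sample SGD on a squared-error loss; the function name, learning rate, and toy data are illustrative assumptions, not taken from any lecture:

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.1, epochs=50, seed=0):
    """Fit y ~ X @ w + b by stochastic gradient descent on squared error.

    One randomly ordered sample per update, as in plain SGD.
    (Illustrative sketch; hyperparameters are arbitrary choices.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):      # visit samples in random order
            err = X[i] @ w + b - y[i]     # prediction error on sample i
            w -= lr * err * X[i]          # gradient of 0.5 * err**2 w.r.t. w
            b -= lr * err                 # gradient w.r.t. the bias
    return w, b

# Toy data: y = 2x + 1 plus small noise
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = 2 * X[:, 0] + 1 + 0.01 * rng.normal(size=200)
w, b = sgd_linear_regression(X, y)
```

With a small constant learning rate the iterates hover near the least-squares solution (here roughly w ≈ 2, b ≈ 1) rather than converging exactly, which is the kind of behavior a convergence analysis of stochastic gradient algorithms quantifies.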