Kernel Ridge Regression: Equivalent Formulations and Representer Theorem
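The two ideas named in the title can be illustrated with a short sketch. By the representer theorem, the kernel ridge regression minimizer lies in the span of the kernel functions at the training points, so the dual solution is α = (K + nλI)⁻¹y; with a linear kernel this yields exactly the same predictions as the primal ridge solution w = (XᵀX + nλI)⁻¹Xᵀy, which is the "equivalent formulations" point. A minimal NumPy sketch, assuming squared-error loss averaged over n samples with regularization weight λ (the lecture's exact scaling conventions may differ):

```python
import numpy as np

# Kernel ridge regression via the representer theorem:
# the minimizer of (1/n)||y - f(X)||^2 + lam * ||f||_H^2 can be written
# f(x) = sum_i alpha_i k(x_i, x), with alpha = (K + n*lam*I)^{-1} y.

rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)
lam = 0.1

# Dual (kernel) formulation with a linear kernel K = X X^T
K = X @ X.T
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)

# Equivalent primal (ridge) formulation: w = (X^T X + n*lam*I)^{-1} X^T y
w = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)

# Both formulations give identical predictions on new points
X_test = rng.normal(size=(5, d))
pred_dual = (X_test @ X.T) @ alpha    # f(x) = sum_i alpha_i <x_i, x>
pred_primal = X_test @ w
print(np.allclose(pred_dual, pred_primal))  # True
```

The equivalence follows from the matrix identity Xᵀ(XXᵀ + cI)⁻¹ = (XᵀX + cI)⁻¹Xᵀ, so w = Xᵀα; the dual form is preferred when n < d or when a nonlinear kernel replaces the inner product.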
Related lectures (30)
Support Vector Machines: Theory and Applications
Explores Support Vector Machine theory, parameters, solution uniqueness, and applications in machine learning.
Linear Algebra in 3D: Images and Kernels
Explores linear maps in 3D, emphasizing images, kernels, and uniqueness of solutions to linear systems.
Support Vector Machines
Introduces Support Vector Machines, covering Hinge Loss, hyperplane separation, and non-linear classification using kernels.
Introduction to Machine Learning: Linear Models
Introduces linear models for supervised learning, covering overfitting, regularization, and kernels, with applications in machine learning tasks.
Feature Expansion: Kernels and KNN
Covers feature expansion, kernels, and K-nearest neighbors, including non-linearity, SVM, and Gaussian kernels.
Linear Algebra: Systems and Subspaces
Covers linear systems, vector subspaces, and the kernel and image of linear maps.
Kernel Trick: Understanding Machine Learning
Explores the kernel trick in machine learning, enabling high-dimensional operations without explicit coordinate calculations.
Kernel Regression
Covers the concept of kernel regression and making data linearly separable by adding features and using local methods.
Support Vector Machine Overview
Gives an overview of Support Vector Machines, comparing advantages and disadvantages of SVM with other classifiers.
Kernel Methods in Machine Learning: Kernel Regression and SVM
Discusses kernel methods in machine learning, focusing on kernel regression and support vector machines, including their formulations and applications.