
Machine Learning Foundations/Techniques (NTU, Fall 2020)

  • Course website
  • Abstract of HWs
    • hw1
      • The Learning Problem
      • Perceptron Learning Algorithm (PLA)
      • Off-Training-Set Error
      • Hoeffding Inequality
      • Bad Data
      • Multiple-Bin Sampling
      • Experiments with Perceptron Learning Algorithm (a code sketch follows this list)
    • hw2
      • Perceptrons
      • Ring Hypothesis Set
      • Deviation from Optimal Hypothesis
      • The VC Dimension
      • Noise and Error
      • Decision Stump
    • hw3
      • Linear Regression
      • Likelihood and Maximum Likelihood
      • Gradient and Stochastic Gradient Descent
      • Hessian and Newton Method
      • Multinomial Logistic Regression
      • Nonlinear Transformation
      • Experiments with Linear and Nonlinear Models (a code sketch follows this list)
    • hw4
      • Deterministic Noise
      • Learning Curve
      • Noisy Virtual Examples
      • Regularization
      • Leave-one-out
      • Learning Principles
      • Experiments with Regularized Logistic Regression (a code sketch follows this list)
    • hw5
      • Hard-Margin SVM and Large Margin
      • Dual Problem of Quadratic Programming
      • Properties of Kernels
      • Kernel Perceptron Learning Algorithm
      • Soft-Margin SVM
      • Experiments with Soft-Margin SVM (a code sketch follows this list)
    • hw6
      • Neural Networks
      • Matrix Factorization
      • Aggregation
      • Adaptive Boosting
      • Decision Tree
      • Experiments with Decision Tree and Random Forest (a code sketch follows this list)
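
hw1's PLA experiments amount to running the cyclic perceptron update on a dataset and counting how many corrections are made before the algorithm halts. Below is a minimal NumPy sketch of that procedure; the random toy data, the fixed visiting order, and the pass cap are illustrative assumptions, not the homework's actual data or protocol.

```python
import numpy as np

def pla(X, y, max_passes=1000):
    """Cyclic PLA: sweep the data repeatedly, correcting each mistake,
    until one full pass makes no mistakes (assumes separable data).
    Labels y must be in {-1, +1}."""
    N = X.shape[0]
    X = np.hstack([np.ones((N, 1)), X])  # prepend x0 = 1 for the bias term
    w = np.zeros(X.shape[1])
    updates = 0
    for _ in range(max_passes):
        mistakes = 0
        for xi, yi in zip(X, y):
            if np.sign(xi @ w) != yi:    # sign(0) also counts as a mistake
                w += yi * xi             # PLA update: w <- w + y_n * x_n
                updates += 1
                mistakes += 1
        if mistakes == 0:
            break
    return w, updates

# Tiny usage example on randomly generated separable data (not the homework's dataset).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 2))
y = np.sign(X @ np.array([1.0, -2.0]) + 0.1)
w, updates = pla(X, y)
print("updates made:", updates)
```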
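
hw3's linear/nonlinear experiments compare a plain linear model against one trained on transformed features. The sketch below solves linear regression with the pseudo-inverse and contrasts raw features with a simple second-order transform; the circle-like target and the particular transform are assumptions chosen for illustration.

```python
import numpy as np

def lin_reg(Z, y):
    """Linear regression via the pseudo-inverse: w_lin = Z^+ y."""
    return np.linalg.pinv(Z) @ y

def with_bias(Z):
    return np.hstack([np.ones((len(Z), 1)), Z])

# Toy target: a circle-like boundary that a purely linear model cannot fit well.
# This data is a placeholder, not the homework's dataset.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(500, 2))
y = np.sign(X[:, 0] ** 2 + X[:, 1] ** 2 - 0.6)

# Compare raw linear features with a second-order feature transform
# (1, x1, x2, x1*x2, x1^2, x2^2).
Z_lin = with_bias(X)
Z_quad = with_bias(np.column_stack(
    [X, X[:, 0] * X[:, 1], X[:, 0] ** 2, X[:, 1] ** 2]))

for name, Z in [("linear", Z_lin), ("2nd-order", Z_quad)]:
    w = lin_reg(Z, y)
    e_in = np.mean(np.sign(Z @ w) != y)
    print(f"{name}: E_in = {e_in:.3f}")
```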
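
hw4's experiments train logistic regression with an L2 (ridge-style) penalty. The sketch below implements the regularized objective with batch gradient descent; the lambda grid, step size, iteration count, and placeholder data are assumptions, and the homework may prescribe a different optimizer or dataset.

```python
import numpy as np

def sigmoid(s):
    # Clip to avoid overflow warnings in exp for large inputs.
    return 1.0 / (1.0 + np.exp(-np.clip(s, -500, 500)))

def reg_logreg_gd(X, y, lam=0.01, eta=0.1, iters=2000):
    """L2-regularized logistic regression by batch gradient descent.

    Minimizes (lam / N) * ||w||^2 + (1 / N) * sum_n log(1 + exp(-y_n w^T x_n)),
    with y_n in {-1, +1}. X should already include a bias column if desired."""
    N, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        # Gradient of the cross-entropy error plus the L2 penalty term.
        grad = X.T @ (-y * sigmoid(-y * (X @ w))) / N + 2.0 * lam / N * w
        w -= eta * grad
    return w

# Placeholder data; the homework instead specifies its own training/test files
# and a grid of lambda values to compare.
rng = np.random.default_rng(2)
X = np.hstack([np.ones((300, 1)), rng.normal(size=(300, 3))])
y = np.where(X[:, 1] - X[:, 2] > 0, 1.0, -1.0)
for lam in [0.0, 0.01, 1.0]:
    w = reg_logreg_gd(X, y, lam=lam)
    print(f"lambda={lam}: E_in = {np.mean(np.sign(X @ w) != y):.3f}")
```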
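
hw5's soft-margin SVM experiments typically sweep the trade-off parameter C (and kernel parameters) and inspect quantities such as the number of support vectors. The sketch below uses scikit-learn's SVC, which wraps LIBSVM; the toy data, the Gaussian-kernel choice, and the (C, gamma) values are placeholders rather than the assignment's actual settings.

```python
import numpy as np
from sklearn.svm import SVC

# Placeholder data; the homework experiments use the dataset and parameter
# grid specified in the assignment instead.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] ** 2 + X[:, 1] ** 2 > 1.2, 1, -1)

# Sweep the soft-margin parameter C with a Gaussian (RBF) kernel and report
# the number of support vectors and the training error for each setting.
for C in [0.01, 0.1, 1, 10, 100]:
    clf = SVC(C=C, kernel="rbf", gamma=1.0)
    clf.fit(X, y)
    e_in = np.mean(clf.predict(X) != y)
    print(f"C={C:>6}: #SV = {len(clf.support_)}, E_in = {e_in:.3f}")
```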
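
hw6's final experiments compare a single decision tree against a random forest. The sketch below leans on scikit-learn's implementations to show that comparison; the synthetic train/test split and the forest size are assumptions, and the assignment may require implementing C&RT and bagging from scratch rather than calling a library.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

# Placeholder train/test split standing in for the homework's data files.
rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, size=(400, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)
X_tr, y_tr, X_te, y_te = X[:300], y[:300], X[300:], y[300:]

# A single fully grown tree versus a forest of bagged trees.
tree = DecisionTreeClassifier(criterion="gini", random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

for name, model in [("single tree", tree), ("random forest", forest)]:
    e_out = np.mean(model.predict(X_te) != y_te)
    print(f"{name}: E_out = {e_out:.3f}")
```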
