## 6.4 Decision tree learning algorithm

Slides

## Notes

This lesson first reviews topics from the previous lesson: training a decision tree using sklearn, and handling a model that fails to generalize because it overfits the data.
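As a quick recap, here is a minimal sketch (using synthetic, made-up data) of training a decision tree with sklearn and capping `max_depth` so the tree does not simply memorize the training set:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data for illustration only.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=1)

# An unconstrained tree overfits; limiting the depth regularizes it.
dt = DecisionTreeClassifier(max_depth=3, random_state=1)
dt.fit(X_train, y_train)
print(dt.score(X_train, y_train), dt.score(X_val, y_val))
```

Comparing train and validation accuracy is the usual way to spot overfitting: a deep tree scores near 1.0 on train but much lower on validation.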

In this lesson, we learn how to find the best split for a decision tree and the different classification criteria that can be used to split a node. We work through an example in detail, splitting the tree using the misclassification rate. We also discuss the stopping criteria that terminate the iterative splitting process.
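The split search can be sketched in plain Python (this is a toy example, not the one from the video): for each candidate threshold, partition the labels and pick the threshold with the lowest weighted misclassification rate of the two children.

```python
# Toy 1-D dataset: one feature, binary labels.
xs = [1, 2, 3, 5, 6, 8]
ys = [0, 0, 0, 1, 1, 1]

def misclass_rate(labels):
    """Fraction of labels that disagree with the node's majority class."""
    if not labels:
        return 0.0
    ones = sum(labels)
    return min(ones, len(labels) - ones) / len(labels)

best = None
for t in sorted(set(xs)):
    left = [y for x, y in zip(xs, ys) if x <= t]
    right = [y for x, y in zip(xs, ys) if x > t]
    # Weighted average impurity of the two child nodes.
    rate = (len(left) * misclass_rate(left)
            + len(right) * misclass_rate(right)) / len(xs)
    if best is None or rate < best[1]:
        best = (t, rate)

print(best)  # -> (3, 0.0): the threshold x <= 3 separates the classes perfectly
```

A real implementation repeats this search over every feature, recurses into the children, and stops when a stopping criterion fires (pure node, maximum depth reached, or too few samples to split).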

Add notes from the video (PRs are welcome)

  • structure of a decision tree: nodes & leaves
  • depth of a decision tree & levels
  • rules & conditions, thresholds
  • misclassification rate
  • impurity criteria (e.g. MSE for regression)
  • decision trees can be used to solve regression problems
⚠️ The notes are written by the community.
If you see an error here, please create a PR with a fix.

Navigation