Course Information
Course Overview
Learn to build decision trees for applied machine learning from scratch in Python.
Decision trees are one of the hottest topics in Machine Learning. They dominate many Kaggle competitions nowadays. Empower yourself for challenges.
This course covers the fundamentals of decision tree algorithms such as CHAID, ID3, C4.5, CART, and regression trees, together with their hands-on practical applications. We will also cover bagging and boosting methods such as Random Forest and Gradient Boosting, which increase decision tree accuracy. Finally, we will focus on tree-based frameworks such as LightGBM, XGBoost, and Chefboost.
We will create our own decision tree framework from scratch in Python, and step-by-step exercises will guide you to a clear understanding of the concepts.
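To give a sense of what "from scratch" means here, the following is a minimal sketch of an ID3-style tree for categorical features: compute Shannon entropy, pick the split with the highest information gain, and recurse. The function and variable names are illustrative, not taken from the course's own framework.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Entropy reduction from splitting on one categorical feature index."""
    total = len(labels)
    splits = {}
    for row, label in zip(rows, labels):
        splits.setdefault(row[feature], []).append(label)
    weighted = sum(len(part) / total * entropy(part)
                   for part in splits.values())
    return entropy(labels) - weighted

def build_tree(rows, labels, features):
    """ID3-style recursion: leaves are labels, nodes are (feature, branches)."""
    if len(set(labels)) == 1:             # pure node -> leaf
        return labels[0]
    if not features:                      # no features left -> majority vote
        return Counter(labels).most_common(1)[0][0]
    best = max(features, key=lambda f: information_gain(rows, labels, f))
    branches = {}
    for value in {row[best] for row in rows}:
        idx = [i for i, r in enumerate(rows) if r[best] == value]
        branches[value] = build_tree([rows[i] for i in idx],
                                     [labels[i] for i in idx],
                                     [f for f in features if f != best])
    return (best, branches)

def predict(tree, row):
    """Walk the tree until a leaf label is reached."""
    while isinstance(tree, tuple):
        feature, branches = tree
        tree = branches[row[feature]]
    return tree
```

The same greedy split-selection idea carries over to C4.5 (gain ratio) and CART (Gini impurity) by swapping the scoring function.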
This course appeals to anyone interested in Machine Learning, Data Science, and Data Mining.
Course Content
- 8 sections
- 24 lectures
- Section 1 Introduction
- Section 2 ID3 Decision Tree Algorithm
- Section 3 C4.5 Decision Tree Algorithm
- Section 4 Classification and Regression Trees (CART)
- Section 5 CHAID Decision Trees
- Section 6 Random Forest
- Section 7 Gradient Boosting Machines
- Section 8 Decision Tree Based Frameworks
What You’ll Learn
- The most common decision tree algorithms
- The core idea behind decision trees
- Developing code from scratch
- Applying machine learning to practical problems
- Bagging and boosting
- Random Forest and Gradient Boosting
Reviews
- Suhas Bhat
The course starts on the promise of exploring the contents of the 'algorithm black box', and it definitely succeeds to some extent, but does not uncover it completely. The trainer spends 70% of the time explaining how to construct decision trees based on non-numerical features. Yes, this explanation is helpful in understanding the different probability measurements. But when it comes to solving real-world data, all the previous learning is discarded and the learner is told how some 'Custom Frameworks' are used to solve problems using numerical data to achieve algorithm convergence. The last mile of the course is totally unrelated to the rest of the course. Hard-coding gradient descent for decision trees is perhaps cumbersome, hence the trainer's choice; however, some explanation regarding that choice would have been helpful. Overall, anyone who takes this course will learn new things. It is better than many courses out there. But it would certainly help if the smaller parts of the course fit into a larger canvas.
- Muhammet İkbal ELEK
It was an excellent experience to understand the decision tree algorithms and the relationships between them through practice.
- Anand S
Yes, but the explanation given is very poor. It would have been better if there were an overview of what he is trying to do before showing the code.
- Gamelihle Sibanda
Whilst the content was fine, a number of videos were blurred and it was not possible to see what was being referred to.