This week, you will learn about decision trees and about ensemble techniques, which are often built on tree algorithms to produce more robust machine learning predictions. The two main ensemble approaches we will cover are bagging and boosting, which differ in how their weak learners are created and combined. The random forest algorithm is an example of a bagging algorithm and is extremely popular due to its flexibility and ease of use. The gradient boosted tree algorithm (along with the similar AdaBoost algorithm) is an example of a popular boosting technique. This week, you will learn how to create decision trees and how to leverage them to build bagged and boosted ensemble learners; a brief code preview of these techniques follows the objectives below.
- Understand the decision tree algorithm
- Understand the basic concept behind ensemble techniques
- Know the difference between bagging and boosting
- Understand the random forest algorithm
- Understand the gradient boosted tree algorithm
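As a preview of the distinction between bagging and boosting, the minimal sketch below fits a single decision tree, a bagging ensemble (random forest), and a boosting ensemble (gradient boosted trees) on a toy dataset. It assumes scikit-learn is installed; the lessons this week may use different libraries, datasets, and parameter choices.

```python
# A minimal preview, assuming scikit-learn; the toy dataset and
# parameters here are illustrative, not the course's own examples.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical classification data standing in for this week's datasets.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single decision tree: the weak learner both ensembles build on.
tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

# Bagging: many trees fit independently on bootstrap samples,
# with their predictions averaged (random forest).
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Boosting: trees fit sequentially, each one correcting the
# errors of the ensemble built so far (gradient boosted trees).
boosted = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

for name, model in [("tree", tree), ("forest", forest), ("boosted", boosted)]:
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

On most runs, both ensembles outperform the single tree, which is the motivation for the bagging and boosting lessons this week.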
Activities and Assignments | Time Estimate | Deadline* | Points |
---|---|---|---|
Week 4 Introduction Video | 10 Minutes | Tuesday | N/A |
Week 4 Lesson 1: Introduction to Decision Trees | 2 Hours | Thursday | 20 |
Week 4 Lesson 2: Ensemble Techniques: Bagging | 2 Hours | Thursday | 20 |
Week 4 Lesson 3: Ensemble Techniques: Boosting | 2 Hours | Thursday | 20 |
Week 4 Quiz | 45 Minutes | Friday | 70 |
Week 4 Assignment Submission | 4 Hours | The following Monday | 125 (Instructor), 10 (Peers) |
Week 4 Completion of Peer Review | 2 Hours | The following Saturday | 15 |
*Please note that unless otherwise noted, the due time is 6:00 PM Central Time.