Question
Introduction to Machine Learning & AI - DAT-5329 - BMBAN2 In-Class Knowledge Check #2 (Remotely Proctored)
Multiple-choice question (single answer)
When should you prefer XGBoost over a standard decision tree?
Options
A. When the dataset is very small
B. When interpretability is the primary concern
C. When the dataset is unstructured (e.g., images or text)
D. When the dataset has many features and requires high accuracy
Correct answer
D. When the dataset has many features and requires high accuracy
Analysis
In evaluating when to prefer XGBoost over a standard decision tree, we should consider the strengths and limitations of each method.
Option A, 'When the dataset is very small': While a simple decision tree can work on small datasets, XGBoost is a more complex ensemble method that can overfit very limited data if not carefully regularized. For tiny datasets, a single tree or a simpler model often suffices.
Option B, 'When interpretability is the primary concern': A single decision tree can be drawn and explained rule by rule, whereas XGBoost aggregates hundreds of trees and is far harder to interpret directly.
Option C, 'When the dataset is unstructured (e.g., images or text)': Tree-based methods operate on tabular features; unstructured data is usually better served by deep learning models such as CNNs for images or transformers for text.
Option D, 'When the dataset has many features and requires high accuracy': This is where XGBoost shines. Boosting combines many weak trees, handles high-dimensional tabular data well, and typically delivers substantially higher accuracy than a single decision tree, which is why D is the correct choice.
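The comparison above can be sketched in code. This is a minimal illustration using scikit-learn's `GradientBoostingClassifier` as a stand-in for XGBoost (the `xgboost` package's `XGBClassifier` exposes a similar `fit`/`predict` API but may not be installed); the synthetic dataset, sizes, and hyperparameters are assumptions chosen only to show a many-feature tabular setting, not values from the question.

```python
# Sketch: single decision tree vs. a gradient-boosted ensemble on a
# dataset with many features. GradientBoostingClassifier stands in for
# XGBoost here; both are additive ensembles of shallow trees.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic tabular data: 50 features, only 15 of them informative.
X, y = make_classification(n_samples=2000, n_features=50,
                           n_informative=15, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

# A single, fully grown decision tree (high variance, easy to read).
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

# A boosted ensemble of shallow trees (lower variance, higher accuracy).
boost = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1,
                                   random_state=0).fit(X_tr, y_tr)

print(f"single tree accuracy:      {tree.score(X_te, y_te):.3f}")
print(f"boosted ensemble accuracy: {boost.score(X_te, y_te):.3f}")
```

On high-dimensional tabular data like this, the boosted ensemble usually scores noticeably higher on the held-out split, matching the reasoning for option D.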
Similar questions
Question 22: Suppose that you want to build a gradient boosting model with the loss function [math]. Assuming that [math] is the actual output and [math] is the prediction for point [math], which of the following statements is correct? Each option takes the form "You train a weak learner and at each step, you estimate [math]."
If your XGBoost model is overfitting, what should you do first?