Question
Introduction to Machine Learning & AI - DAT-5329 - BMBAN2 In-Class Knowledge Check #2 (Remotely Proctored)
Single-Choice Question
If your XGBoost model is overfitting, what should you do first?
Options
A. Increase n_estimators
B. Reduce max_depth
C. Reduce min_samples_split
Answer
B. Reduce max_depth
Analysis
The question asks about the first step to take when an XGBoost model is overfitting, so we need to evaluate what each option would do to model complexity and generalization.
Option 1: Increase n_estimators. In XGBoost, adding more trees (estimators) can improve performance on the training data, but it also increases model capacity, so when the model is already overfitting, adding more trees typically makes the problem worse rather than better.
Option 2: Reduce max_depth. Shallower trees capture fewer high-order feature interactions, which directly constrains model complexity. Lowering max_depth is one of the most common and effective first steps against overfitting in gradient-boosted trees.
Option 3: Reduce min_samples_split. This parameter belongs to scikit-learn's tree implementations rather than XGBoost (the closest XGBoost analogue is min_child_weight), and reducing it would permit even finer splits, increasing rather than decreasing overfitting.
Therefore, the first step should be to reduce max_depth (Option B).
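As a quick illustration of why shallower trees generalize better here, the sketch below compares the train/test accuracy gap at depth 8 versus depth 2 on noisy synthetic data. It uses scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost (both expose the same max_depth knob); the dataset and parameter values are illustrative assumptions, not from the original question.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Noisy synthetic data: flip_y=0.2 mislabels ~20% of points,
# so a model that fits the training set perfectly must be memorizing noise.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def train_test_gap(max_depth):
    """Train accuracy minus test accuracy; a large gap signals overfitting."""
    m = GradientBoostingClassifier(max_depth=max_depth, n_estimators=200,
                                   random_state=0).fit(X_tr, y_tr)
    return m.score(X_tr, y_tr) - m.score(X_te, y_te)

deep_gap = train_test_gap(8)     # deep trees: larger train/test gap
shallow_gap = train_test_gap(2)  # shallow trees: smaller gap
print(f"depth=8 gap: {deep_gap:.3f}, depth=2 gap: {shallow_gap:.3f}")
```

With XGBoost itself the equivalent would be `xgboost.XGBClassifier(max_depth=...)`; if reducing max_depth alone is not enough, typical follow-ups are lowering learning_rate, using subsample/colsample_bytree, or raising min_child_weight and reg_lambda.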
Similar Questions
Question 22: Suppose that you want to build a gradient boosting model with the following loss function: [math]. Assuming that [math] is the actual output and [math] is the prediction for point [math], which of the following statements is correct? Select one alternative (each reads "You train a weak learner and at each step, you estimate [math]"). Maximum marks: 2.
When should you prefer XGBoost over a standard decision tree?