Questions
Introduction to Machine Learning & AI - DAT-5329 - BMBAN2 In-Class Knowledge Check #2 (Remotely Proctored)
Single choice
If your XGBoost model is overfitting, what should you do first?
Options
A. Increase n_estimators
B. Reduce max_depth
C. Reduce min_samples_split
Verified Answer
B. Reduce max_depth
Step-by-Step Analysis
The question asks about the first step to take when an XGBoost model is overfitting, so we need to evaluate what each option would do to model complexity and generalization.
Option A: Increase n_estimators. Adding more boosting rounds increases model capacity, so the ensemble fits the training data ever more closely; if the model is already overfitting, this makes the problem worse.
Option B: Reduce max_depth. Shallower trees capture fewer data-specific interactions, which directly lowers model complexity and typically narrows the gap between training and validation performance. This is the standard first lever for an overfitting XGBoost model, often combined with a lower learning rate or stronger regularization.
Option C: Reduce min_samples_split. This parameter belongs to scikit-learn's tree models rather than XGBoost (the closest XGBoost analogue is min_child_weight), and reducing it would permit more splits, increasing overfitting rather than curbing it.
The correct first step is therefore B: reduce max_depth.
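The effect of shrinking max_depth can be sketched with a small experiment. This is a minimal illustration, not the exam's own material: it uses scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost (which may not be installed everywhere), and the synthetic dataset and parameter values are arbitrary choices; max_depth plays the same complexity-limiting role in both libraries.

```python
# Compare the train/test accuracy gap of a deep vs. a shallow boosted model.
# A larger gap indicates more overfitting.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification task (illustrative only)
X, y = make_classification(n_samples=600, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for depth in (8, 2):  # deep trees first, then shallow trees
    model = GradientBoostingClassifier(max_depth=depth, n_estimators=100,
                                       random_state=0).fit(X_tr, y_tr)
    gap = model.score(X_tr, y_tr) - model.score(X_te, y_te)
    print(f"max_depth={depth}: train-test accuracy gap = {gap:.3f}")
```

With deep trees the model typically fits the training split almost perfectly while test accuracy lags; lowering max_depth usually shrinks that gap, which is exactly the behaviour the question is probing.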
Similar Questions
Question 22
Suppose that you want to build a gradient boosting model with the following loss function: [math]. Assuming that [math] is the actual output and [math] is the prediction for point [math], which of the following statements is correct? Select one alternative:
- You train a weak learner and at each step, you estimate [math]
- You train a weak learner and at each step, you estimate [math]
- You train a weak learner and at each step, you estimate [math]
- You train a weak learner and at each step, you estimate [math]
Maximum marks: 2
When should you prefer XGBoost over a standard decision tree?