Question
25S-STATS-102B-LEC-3 S25 Midterm Exam- Requires Respondus LockDown Browser
Single-choice question
Please select the incorrect statement about k-fold cross-validation and Leave-One-Out Cross-Validation (LOOCV):
Options
A. Increasing k (closer to the number of observations) typically reduces the variance of the estimate but increases the bias.
B. LOOCV can be computationally expensive for large datasets since it requires fitting the model n times.
C. In k-fold cross-validation, using a smaller k generally leads to a lower computational cost.
D. LOOCV is a special case of k-fold cross-validation where k equals the number of observations.
Answer
A
Solution approach
When considering statements about k-fold cross-validation and LOOCV, it helps to recall core properties of how these methods behave with respect to bias, variance, and computational cost.
Option A: 'Increasing k (closer to the number of observations) typically reduces the variance of the estimate but increases the bias.' This statement is incorrect: it has the trade-off backwards. As k grows, each training fold contains more of the data, so each fitted model is closer to the model trained on the full dataset and the bias of the error estimate decreases. At the same time, the training sets overlap more heavily, so the per-fold estimates are highly correlated and the variance of the overall CV estimate typically increases. LOOCV (k = n) is the extreme case: approximately unbiased but high variance. The remaining options are correct: LOOCV fits the model n times, which is expensive for large n (B); a smaller k means fewer model fits and lower computational cost (C); and LOOCV is exactly k-fold cross-validation with k = n (D).
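The computational-cost point in options B and C, and the fact that LOOCV is just k-fold CV with k = n (option D), can be made concrete with a small sketch. This is a minimal illustration in plain NumPy; the helper name kfold_cv_mse and the toy linear model are assumptions, not part of the exam.

```python
import numpy as np

def kfold_cv_mse(x, y, k, seed=0):
    """Estimate the test MSE of a simple linear fit via k-fold CV.

    Setting k = len(x) gives LOOCV: each fold holds out one observation,
    so the model is fit n times (option B's cost).
    """
    n = len(x)
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n), k)  # k roughly equal folds
    errs = []
    for fold in folds:
        test = np.zeros(n, dtype=bool)
        test[fold] = True
        # Fit y = a*x + b on the other k-1 folds (one model fit per fold)
        a, b = np.polyfit(x[~test], y[~test], 1)
        pred = a * x[test] + b
        errs.append(np.mean((y[test] - pred) ** 2))
    return float(np.mean(errs))

# Toy data: y = 2x + Gaussian noise
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, 100)
y = 2 * x + rng.normal(0, 1, 100)

mse_5 = kfold_cv_mse(x, y, k=5)      # 5 model fits
mse_loo = kfold_cv_mse(x, y, k=100)  # LOOCV: 100 model fits
```

Both calls estimate the same quantity, but the k = 100 call fits 20 times as many models, which is the cost difference options B and C describe.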
Similar questions
In a 75%/25% Cross-Validation
Assuming doing a 10-fold cross validation for linear regression (Y = AX + B). It is possible that 10 different As and Bs would be generated, one for each model learned.
Cross-validation is a special case of the validation set approach.
Suppose we are going to perform K-fold cross-validation. If we make 5 folds from the data set that has 100 observations. Which of the following are true statements?