Question
Single-choice question
Refer to the feature importance graph below. If you want to simplify the model with minimal loss in predictive power, which features could you REMOVE so that the remaining features still account for at least 80% of the total importance? [2 marks]
Approach
The question asks, based on a feature importance graph, which features could be removed so that the remaining features still account for at least 80% of the total importance, the goal being to simplify the model with minimal loss in predictive power.
No answer options are provided, so specific candidate choices cannot be evaluated or compared. The general approach is: rank the features by importance, accumulate importance from the top of the ranking, and keep features until the cumulative importance of those kept reaches at least 80% of the total; the lower-ranked features below that cut are the ones that can be removed.
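The following sketch illustrates that cut-off logic with hypothetical importance values (the actual numbers would be read off the graph in the question, which is not reproduced here); the feature names are placeholders.

import operator

# Hypothetical importances read off a feature-importance graph.
importances = {
    "feature_a": 0.40,
    "feature_b": 0.25,
    "feature_c": 0.18,
    "feature_d": 0.10,
    "feature_e": 0.07,
}

total = sum(importances.values())
# Rank features from most to least important.
ranked = sorted(importances.items(), key=operator.itemgetter(1), reverse=True)

kept, cumulative = [], 0.0
for name, score in ranked:
    kept.append(name)
    cumulative += score
    # Stop once the kept features cover at least 80% of the total importance.
    if cumulative / total >= 0.80:
        break

removable = [name for name, _ in ranked if name not in kept]
print("Keep:", kept)             # ['feature_a', 'feature_b', 'feature_c']
print("Can remove:", removable)  # ['feature_d', 'feature_e']

With these example values the top three features already cover 83% of the total importance, so the two lowest-ranked features could be removed with minimal loss in predictive power.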
Similar questions
If a feature has the same value for all records, then:
Consider the following summary of an October 2019 article published in Science by Obermeyer et al. (2019): "The U.S. health care system uses commercial algorithms to guide health decisions. Obermeyer et al. find evidence of racial bias in one widely used algorithm, such that Black patients assigned the same level of risk by the algorithm are sicker than White patients (see the Perspective by Benjamin). The authors estimated that this racial bias reduces the number of Black patients identified for extra care by more than half. Bias occurs because the algorithm uses health costs as a proxy for health needs. Less money is spent on Black patients who have the same level of need, and the algorithm thus falsely concludes that Black patients are healthier than equally sick White patients. Reformulating the algorithm so that it no longer uses costs as a proxy for needs eliminates the racial bias in predicting who needs extra care." Which option below identifies the issue this case is referring to?
Feature (variable) selection issue
Labeling issue
None of the above
Data collection issue
True/False question: Forward selection is faster than backward selection if few features are relevant to prediction.
True
False
Select all correct answers: Which of the following are common methods of feature selection?
kNN
Principal Component Analysis (PCA)
Ridge Regression
Min Max Scaling
Backward Elimination
Lasso Regression
Forward Selection