Question
2025FallB-X-CSE571-78760 Module 1: Intro to Machine Learning Knowledge Check
Single-answer multiple choice
How do we update the parameters in gradient descent?
Options
A. By sampling steps in each direction and taking the step opposite to the one with the steepest descent
B. By taking a step in the negative direction of steepest descent
C. By taking a step in the direction of steepest descent
D. By sampling steps in each direction and taking the one with the steepest descent
Answer
B. By taking a step in the negative direction of steepest descent
Analysis
The core principle of the gradient descent update is to move the parameters in the negative direction of the gradient, which decreases the value of the objective function. The options are analyzed below to show which is correct and why the others fail.
Option A: "By sampling steps in each direction and taking the step opposite to the one with the steepest descent." This sounds like a search over candidate steps in multiple directions, keeping a step opposite to one of them. However, standard gradient descent does not sample steps in every direction: it computes the gradient analytically and takes a single step directly opposite to it. Option B states exactly this rule and is correct; option C moves in the ascent direction, which increases the objective, and option D combines unnecessary sampling with an ascent step.
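The update rule described above, θ ← θ − η∇f(θ), can be sketched in a few lines. This is a minimal illustrative example (the function names and the quadratic toy loss are my own, not from the question):

```python
import numpy as np

def gradient_descent_step(params, grad, lr):
    """One gradient descent update: step opposite to the gradient."""
    return params - lr * grad

# Toy example: minimize f(theta) = theta^2, whose gradient is 2*theta.
theta = np.array([3.0])
for _ in range(50):
    grad = 2 * theta                    # gradient of f at the current point
    theta = gradient_descent_step(theta, grad, lr=0.1)

print(theta)  # approaches the minimizer at 0
```

Note that the step is taken against the gradient; flipping the sign (option C) would drive θ away from the minimum instead.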
Similar questions
Which of the following statements about gradient descent and learning rate is true?
Which statement is correct?
Suppose that you are training a network with parameters [4.5, 2.5, 1.2, 0.6], a learning rate of 0.2, and a gradient of [-1, 9, 2, 5]. After one update step of gradient descent, what would your network's parameters be equal to?
Which of the following best describes the role of the gradient in gradient descent?
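The numeric question in the list above can be checked with the same update rule θ ← θ − η∇. A quick sketch (the answer is not given by the page; the computation below is my own):

```python
import numpy as np

params = np.array([4.5, 2.5, 1.2, 0.6])
grad = np.array([-1.0, 9.0, 2.0, 5.0])
lr = 0.2

# Gradient descent update: subtract the scaled gradient.
new_params = params - lr * grad
print(new_params)  # approximately [4.7, 0.7, 0.8, -0.4]
```

Note the first parameter increases because its gradient component is negative, so subtracting it adds to the parameter.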