Question

BU.330.775.T2.FA25 Final - Requires Respondus LockDown Browser

Multiple choice (single answer)

Which statement is correct?  

Options
A. Stochastic Gradient Descent (SGD) computes the gradients using the whole training set to update the model parameters once.
B. Batch Gradient Descent (BGD) computes the gradients using one data point to update the model's parameters once.
C. Mini-batch Gradient Descent has the most bouncing behavior compared to SGD and BGD.
D. 10 training epochs mean each data point has the opportunity to update the model parameters 10 times.

Standard answer
D
Analysis
Question restatement: Which statement is correct? Option A: 'Stochastic Gradient Descent (SGD) computes the gradients using the whole training set to update the model parameters once.' This is incorrect: SGD updates the parameters using a single sample at a time, not the entire training set; computing the gradient over the whole set is the defining feature of Batch Gradient Descent. Option B: 'Batch Gradient Descent (BGD) computes the gradients using one data point to update the model's parameters once.' This is also incorrect and in fact describes SGD; BGD uses the full training set for every update. Option C: 'Mini-batch Gradient Descent has the most bouncing behavior compared to SGD and BGD.' Incorrect: SGD's single-sample updates produce the noisiest (most bouncing) trajectory, BGD the smoothest, and mini-batch falls in between. Option D: '10 training epochs mean each data point has the opportunity to update the model parameters 10 times.' Correct: an epoch is one full pass over the training set, so after 10 epochs every data point has been seen, and has been able to contribute to a parameter update, 10 times. The answer is D.
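To make the contrast concrete, below is a minimal NumPy sketch (added for illustration; it is not part of the exam question). It fits least-squares linear regression and switches between BGD, mini-batch, and SGD purely by changing batch_size; the data, learning rate, and helper names (gradient, train) are illustrative assumptions.

```python
# A minimal sketch contrasting the three gradient-descent variants on
# least-squares linear regression. Data, learning rate, and names are
# illustrative assumptions, not course-provided code.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

def gradient(w, X_batch, y_batch):
    # Gradient of mean squared error over the given batch.
    err = X_batch @ w - y_batch
    return 2 * X_batch.T @ err / len(y_batch)

def train(batch_size, epochs=10, lr=0.05):
    w = np.zeros(3)
    n = len(y)
    for _ in range(epochs):              # one epoch = one full pass over the data,
        idx = rng.permutation(n)         # so every point updates w once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            w -= lr * gradient(w, X[batch], y[batch])
    return w

w_bgd  = train(batch_size=100)  # BGD: whole set -> 1 smooth update per epoch
w_mini = train(batch_size=16)   # mini-batch: moderate noise, several updates per epoch
w_sgd  = train(batch_size=1)    # SGD: one point per update -> noisiest path
print(w_bgd, w_mini, w_sgd)
```

Note how the epoch loop matches option D: each of the 10 epochs shuffles the data and passes over every point exactly once, so each data point contributes to a parameter update 10 times. With batch_size=1 each of those contributions is its own noisy SGD step, which is why SGD bounces the most, while BGD averages the whole set into one smooth step per epoch.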
