Question
2025FallB-X-CSE571-78760 Final Exam
Multiple Choice (single answer)
Which steps *correctly* describe the backpropagation algorithm?
Options
A. First the input values are propagated forward through the network to calculate the difference between output and target values. Then the loss function is differentiated with respect to the weights, and the chain rule is used to calculate the changes to the weights in each layer.
B. First the input values are propagated forward through the network to calculate the difference between output and target values. Then the weights are differentiated with respect to the loss function to calculate the changes to the weights in each layer.
C. First the input values are propagated forward through the network to calculate the difference between output and input values. Then the loss function is differentiated with respect to the inputs to calculate the changes to the weights in each layer.
D. First the input values are propagated forward through the network to calculate the difference between output and input values. Then the loss function is differentiated with respect to the target values to calculate the changes to the weights in each layer.
Answer
A
Analysis
When analyzing the steps of the backpropagation algorithm, we need to keep the forward pass separate from the gradient computation and the weight update.
Option A: "First the input values are propagated forward through the network to calculate the difference between output and target values. Then the loss function is differentiated with respect to the weights, and the chain rule is used to calculate the changes to the weights in each layer." This description matches the core flow of backpropagation: the forward pass produces the error between the output and the target; the loss function is then differentiated with respect to the weights, and the chain rule propagates those gradients backward through the network to determine the change to each layer's weights. The other options either measure the error against the inputs rather than the targets, or differentiate the wrong quantity (the weights with respect to the loss, or the loss with respect to the inputs or targets) instead of the loss with respect to the weights.
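To make the correct sequence concrete, here is a minimal NumPy sketch of those steps, assuming a tiny two-layer network with sigmoid hidden units and a mean-squared-error loss. All names and shapes (`W1`, `W2`, `lr`) are illustrative assumptions and are not part of the exam question.

```python
import numpy as np

# Minimal sketch of the steps in option A for a tiny two-layer network:
# forward pass -> error between output and target -> dLoss/dWeights via the chain rule.
rng = np.random.default_rng(0)

x = rng.normal(size=(4, 3))           # 4 example inputs, 3 features each
t = rng.normal(size=(4, 1))           # target values
W1 = rng.normal(size=(3, 5))          # hidden-layer weights (illustrative shapes)
W2 = rng.normal(size=(5, 1))          # output-layer weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Forward pass: propagate the inputs through the network ---
h = sigmoid(x @ W1)                   # hidden activations
y = h @ W2                            # network output (linear output layer)
loss = 0.5 * np.mean((y - t) ** 2)    # mean-squared error between output and target

# --- Backward pass: differentiate the loss with respect to the weights (chain rule) ---
dL_dy = (y - t) / y.shape[0]          # dLoss/dOutput
dL_dW2 = h.T @ dL_dy                  # chain rule through the output layer
dL_dh = dL_dy @ W2.T                  # gradient propagated back to the hidden layer
dL_dW1 = x.T @ (dL_dh * h * (1 - h))  # chain rule through the sigmoid nonlinearity

# --- Weight update: change each layer's weights using the computed gradients ---
lr = 0.1
W1 -= lr * dL_dW1
W2 -= lr * dL_dW2
print(f"loss = {loss:.4f}")
```

Note that the gradients are taken of the loss with respect to the weights (`dL_dW1`, `dL_dW2`), exactly as option A states; options B, C, and D describe differentiating the wrong quantities.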
Similar Questions
[Multiple choice] Which of the following options is the key factor that led to the return of neural networks?
In the training process of a feedforward neural network (either shallow or deep), forward and backward passes play crucial roles. Which of the following statements about forward and backward passes are correct?
I. The forward pass computes the output of the network by propagating inputs through the layers.
II. During the backward pass, gradients are calculated and propagated from the output layer back to the input layer to update the weights.
III. The forward pass adjusts the network's weights and biases to minimize the loss function.
IV. The backward pass involves calculating partial derivatives of the loss with respect to each weight and bias in the network.
What does backpropagation refer to in neural networks?
Computing gradients for all parameters
Mapping inputs to outputs
Normalizing data
Updating weights in the forward direction