Question
2025FallB-X-CSE571-78760 Final Exam
Multiple-choice question (single answer)
Suppose you are training a neural network. One of the neurons has an input vector [12, 35, -80, -0.4, 21], weights [0.1, -0.6, 0.2, -0.01, 2], bias 0.45, and a ReLU activation function. Which value is the output of the ReLU activation function?
Options
A. -0.54
B. 0
C. 1
D. 6.654
Answer
D. 6.654
Solution
In this problem, we first compute the neuron's linear combination (the weighted sum of the inputs plus the bias) and then apply the ReLU activation.
Step 1: compute the weighted sum by multiplying each input by its corresponding weight and summing the products.
12 × 0.1 = 1.2;
35 × (-0.6) = -21;
(-80) × 0.2 = -16;
(-0.4) × (-0.01) = 0.004;
21 × 2 = 42.
Weighted sum: 1.2 + (-21) + (-16) + 0.004 + 42 = 6.204.
Step 2: add the bias: 6.204 + 0.45 = 6.654.
Step 3: apply ReLU: ReLU(6.654) = max(0, 6.654) = 6.654.
The output is therefore 6.654, which corresponds to option D.
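For readers who want to verify the arithmetic, here is a minimal Python sketch of the computation. NumPy is used for the dot product as an assumed convenience (plain Python sums would work equally well), and the variable names x, w, and b are illustrative, not from the original question:

```python
import numpy as np

# Values taken from the question
x = np.array([12, 35, -80, -0.4, 21])     # input vector
w = np.array([0.1, -0.6, 0.2, -0.01, 2])  # weights
b = 0.45                                  # bias

# Steps 1-2: weighted sum plus bias (the neuron's pre-activation)
z = np.dot(x, w) + b

# Step 3: ReLU activation, max(0, z)
output = np.maximum(0.0, z)

print(output)  # ≈ 6.654 (up to floating-point rounding)
```

Since the pre-activation z is positive here, ReLU passes it through unchanged; had it been negative, the output would have been 0 (option B).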
Similar Questions
Which of the following are commonly used activation functions for neural networks? I Exponential function. II Rectified Linear Unit (ReLU) function. III Sigmoid function. IV Logarithm function.
Which activation function maintains a constant derivative?
Which of these plots represents a ReLU activation function?
Regarding the role of activation functions in neural networks, which of the following statements is correct? I Activation functions introduce non-linearity, allowing neural networks to model complex relationships in data. II Without activation functions, a neural network with one hidden layer can only discover linear functions of the inputs. III Activation functions are applied component-wise to all neurons of a layer. IV An activation function is always required at the output layer level.