Question
2025FallB-X-CSE571-78760 Final Exam
Single-answer multiple-choice question
Which activation function maintains constant derivatives?
Options
A. SoftMax
B. Manifold
C. Sigmoid
D. ReLU
Answer
D. ReLU
Analysis
The question asks us to pick, from the given options, the activation function that maintains a constant derivative. Examine the derivative of each option in turn to see whether it satisfies the "constant derivative" condition.
Option A, SoftMax: SoftMax is not a pointwise activation applied to a single scalar input; it normalizes an entire input vector into a probability distribution. Its Jacobian matrix (its derivative) changes with the input values, so it is not constant. Option B, Manifold: a manifold is a geometric concept, not an activation function. Option C, Sigmoid: its derivative sigmoid(x)(1 - sigmoid(x)) varies continuously with x, peaking at 0.25 at x = 0 and vanishing in the tails. Option D, ReLU: its derivative is piecewise constant, 0 for negative inputs and 1 for positive inputs, so among the options it is the one that maintains constant derivatives. The answer is D.
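To make the comparison concrete, here is a minimal sketch (Python with NumPy assumed; not part of the original explanation) that evaluates the sigmoid and ReLU derivatives at a few points and shows that only ReLU's derivative stays constant on each side of zero:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # varies with x: peaks at 0.25, vanishes in the tails

def relu_grad(x):
    return (x > 0).astype(float)  # piecewise constant: 0 for x < 0, 1 for x > 0

xs = np.array([-5.0, -1.0, 1.0, 5.0])
print("sigmoid'(x):", sigmoid_grad(xs))  # ~[0.0066 0.1966 0.1966 0.0066] -> not constant
print("relu'(x):   ", relu_grad(xs))     # [0. 0. 1. 1.] -> constant on each side of 0
```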
Similar questions
Which of the following are commonly used activation functions for neural networks? I Exponential function. II Rectified Linear Unit (ReLU) function. III Sigmoid function. IV Logarithm function.
Suppose you are training a neural network. One of the neurons has an input vector [12, 35, -80, -0.4, 21], weights [0.1, -0.6, 0.2, -0.01, 2], bias 0.45, and a ReLU activation function. Which value is the output of the ReLU activation function? (A worked sketch follows this list.)
Which of these plots represents a ReLU activation function?
Regarding the role of activation functions in neural networks, which of the following statements is correct? I Activation functions introduce non-linearity, allowing neural networks to model complex relationships in data. II Without activation functions, a neural network with one hidden layer can only discover linear functions of the inputs. III Activation functions are applied component-wise to all neurons of a layer. IV An activation function is always required at the output layer.
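For the neuron question above, here is a short sketch of the computation (Python with NumPy assumed; the numeric result is worked out here and is not taken from an official answer key):

```python
import numpy as np

# Values from the question: input vector, weights, and bias.
x = np.array([12, 35, -80, -0.4, 21], dtype=float)
w = np.array([0.1, -0.6, 0.2, -0.01, 2], dtype=float)
b = 0.45

z = np.dot(w, x) + b   # 1.2 - 21 - 16 + 0.004 + 42 + 0.45 ≈ 6.654
out = max(0.0, z)      # ReLU passes positive pre-activations through unchanged
print(z, out)          # ≈ 6.654 6.654
```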