Questions
2025FallB-X-CSE571-78760 Final Exam
Single choice
Which activation function maintains constant derivatives?
Options
A. SoftMax
B. Manifold
C. Sigmoid
D. ReLU
Verified Answer
D. ReLU
Step-by-Step Analysis
The question asks us to select, from the given options, an activation function that maintains a constant derivative. We analyze the derivative of each option to determine whether it satisfies the "constant derivative" condition.
Option SoftMax: SoftMax is not a pointwise activation applied to a single input; it normalizes a vector of inputs into a probability distribution. Its Jacobian (its derivative) changes with the input values, so it is not constant.
Option Manifold: a manifold is a concept from geometry and topology, not an activation function, so it can be ruled out immediately.
Option Sigmoid: the sigmoid σ(x) = 1 / (1 + e^(-x)) has derivative σ(x)(1 - σ(x)), which varies with x (it peaks at 0.25 at x = 0 and approaches 0 as |x| grows), so it is not constant.
Option ReLU: ReLU(x) = max(0, x) has derivative 1 for x > 0 and 0 for x < 0. The slope is a fixed constant on each side of the origin, which is the "constant derivative" behavior the question asks about. The answer is D (ReLU).
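You can also see this numerically. The minimal Python sketch below (the helper names are our own, not from any particular library) compares central-difference derivatives of ReLU and Sigmoid at several positive inputs: ReLU's slope stays at 1, while Sigmoid's shrinks as the input grows.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x), applied elementwise
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def numerical_derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2.0 * h)

xs = np.array([0.5, 1.0, 5.0, 50.0])      # points on the positive side
print(numerical_derivative(relu, xs))     # ~[1. 1. 1. 1.]  -> constant slope
print(numerical_derivative(sigmoid, xs))  # values shrink toward 0 -> not constant
```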
Similar Questions
Which of the following are commonly used activation functions for neural networks? I Exponential function. II Rectified Linear Unit (ReLU) function. III Sigmoid function. IV Logarithm function.
Suppose you are training a neural network. One of the neurons has an input vector [12, 35, -80, -0.4, 21], weights [0.1, -0.6, 0.2, -0.01, 2], bias 0.45, and a ReLU activation function. Which value is the output of the ReLU activation function? (A worked sketch of this computation appears after this list.)
Which of these plots represents a ReLU activation function?
Regarding the role of activation functions in neural networks, which of the following statements is correct? I Activation functions introduce non-linearity, allowing neural networks to model complex relationships in data. II Without activation functions, a neural network with one hidden layer can only discover linear functions of the inputs. III Activation functions are applied component-wise to all neurons of a layer. IV An activation function is always required at the output layer level. (Statement II is illustrated in the second sketch after this list.)
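For the neuron question above, the pre-activation is the dot product of the inputs and weights plus the bias, and ReLU passes any positive result through unchanged. A minimal Python sketch of the arithmetic (variable names are our own):

```python
# Worked computation for the ReLU-neuron question above.
inputs  = [12, 35, -80, -0.4, 21]
weights = [0.1, -0.6, 0.2, -0.01, 2]
bias = 0.45

# Pre-activation: weighted sum of inputs plus bias.
z = sum(x * w for x, w in zip(inputs, weights)) + bias
# 1.2 - 21 - 16 + 0.004 + 42 + 0.45 = 6.654

# ReLU passes positive values through unchanged.
output = max(0.0, z)
print(output)  # ~6.654
```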
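Statement II in the last question follows from the fact that, with no activation function in between, two stacked linear layers collapse into a single linear map: W2(W1 x + b1) + b2 = (W2 W1) x + (W2 b1 + b2). A minimal NumPy sketch (the matrices here are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)  # hidden layer
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)  # output layer
x = rng.normal(size=3)

# Two linear layers with no activation in between...
two_layer = W2 @ (W1 @ x + b1) + b2
# ...equal one linear layer with merged weights and bias.
one_layer = (W2 @ W1) @ x + (W2 @ b1 + b2)
print(np.allclose(two_layer, one_layer))  # True
```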