Question
Multiple-select question
What activation functions are commonly used in hidden layers of neural networks? (Select all that apply.)
Options
A. Sigmoid
B. ReLU
C. Softmax
D. Tanh
E. Linear
F. Unit Step
Answer
A, B, D (Sigmoid, ReLU, Tanh)
Analysis
To tackle this question, I will examine each option in the given list of possible activation functions and assess whether it is commonly used in the hidden layers of neural networks.
Option: Sigmoid
Reasoning: The sigmoid function maps inputs to the (0, 1) range and has historically been used in hidden layers, especially in earlier networks. It introduces nonlinearity, but it suffers from vanishing gradients in deep networks. It is nevertheless a commonly cited hidden-layer activation in some architectures, so it counts.
Option: ReLU
Reasoning: ReLU, max(0, x), is the default hidden-layer activation in most modern networks: it is cheap to compute and mitigates the vanishing-gradient problem for positive inputs.
Option: Softmax
Reasoning: Softmax normalizes a vector into a probability distribution and is used in the output layer of classifiers, not as a hidden-layer activation.
Option: Tanh
Reasoning: Tanh maps inputs to (-1, 1) and is zero-centered, which often makes optimization easier than with sigmoid; it is commonly used in hidden layers, for example in recurrent networks.
Option: Linear
Reasoning: A linear activation adds no nonlinearity; stacking linear layers collapses into a single linear map, so it is not used in hidden layers.
Option: Unit Step
Reasoning: The unit step has zero gradient almost everywhere, so it cannot be trained with backpropagation and is not used in modern hidden layers.
Therefore the commonly used hidden-layer activations are Sigmoid, ReLU, and Tanh.
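As a concrete illustration, here is a minimal sketch (assuming NumPy is installed; the function names are illustrative, not from the question) of the three hidden-layer activations identified in the answer:

import numpy as np

def sigmoid(x):
    # Maps inputs to (0, 1); historically common in hidden layers.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # max(0, x); the default hidden-layer activation in modern networks.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("sigmoid:", sigmoid(x))  # ~[0.119 0.378 0.5 0.622 0.881]
print("relu:   ", relu(x))     # [0. 0. 0. 0.5 2.]
print("tanh:   ", np.tanh(x))  # ~[-0.964 -0.462 0. 0.462 0.964]

Note how ReLU zeroes out negative inputs while sigmoid and tanh squash them smoothly, which is why ReLU avoids saturation (and hence vanishing gradients) on the positive side.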
Similar Questions
Which of the following are commonly used activation functions for neural networks? I. Exponential function. II. Rectified Linear Unit (ReLU) function. III. Sigmoid function. IV. Logarithm function.
Suppose you are training a neural network. One of the neurons has input vector [12, 35, -80, -0.4, 21], weights [0.1, -0.6, 0.2, -0.01, 2], bias 0.45, and a ReLU activation function. Which value is the output of the ReLU activation? (A worked sketch follows this list.)
Which activation function maintains a constant derivative?
Which of these plots represents a ReLU activation function?
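For the ReLU question above, here is a minimal worked sketch (assuming NumPy; all values come from the question itself):

import numpy as np

x = np.array([12, 35, -80, -0.4, 21])     # input vector from the question
w = np.array([0.1, -0.6, 0.2, -0.01, 2])  # weights
b = 0.45                                  # bias

z = np.dot(w, x) + b  # pre-activation: 1.2 - 21 - 16 + 0.004 + 42 + 0.45 = 6.654
out = max(0.0, z)     # ReLU: z is positive, so it passes through unchanged
print(out)            # ~6.654

Since the pre-activation is positive, ReLU returns it unchanged; a negative pre-activation would have produced 0.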