Question
AMATH 482 A Checkpoint 4 quiz
True/False
In solving multi-class classification problems with n classes using a neural network, the output of the network will be the outputs of n neurons (logits), which correspond to probabilities of the output belonging to the corresponding classes (one-hot encoding). To compute the loss (cross-entropy loss), the output of the neurons is evaluated by one of the nonlinear activation functions tanh, sigmoid, ReLU, softmax before the loss is computed.
Answer
False.
Approach
Begin by unpacking the statement and the standard approach to multiclass classification with neural networks.
Option analysis:
Option: The statement claims that for n-class classification the network outputs are n neurons (logits) that correspond to probabilities of belonging to each class (one-hot encoding).
- This is partially misleading: those n outputs are typically called logits, which are unnormalized scores, not probabilities. Probabilities are obtained by applying a softmax (or an equivalent) to the logits, and the multi-class cross-entropy loss is then computed from those probabilities. tanh, sigmoid, and ReLU do not produce a valid probability distribution over n classes, so softmax (not "one of tanh, sigmoid, ReLU, softmax") is the activation used before the cross-entropy loss. The statement is therefore false.
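The distinction above can be made concrete with a minimal sketch (the logit values are made up for illustration): softmax turns raw logits into a valid probability distribution, and cross-entropy is the negative log-probability of the true class.

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(probs, true_class):
    # Negative log-likelihood of the correct class (one-hot target).
    return -math.log(probs[true_class])

logits = [2.0, 1.0, 0.1]        # raw network outputs: NOT probabilities
probs = softmax(logits)         # a valid distribution: nonnegative, sums to 1
loss = cross_entropy(probs, 0)  # loss when the true class is class 0
```

Note that the raw logits need not lie in [0, 1] or sum to 1; only after softmax do they form the distribution that the cross-entropy loss expects.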
Similar questions
Which of the following are commonly used activation functions for neural networks? I Exponential function. II Rectified Linear Unit (ReLU) function. III Sigmoid function. IV Logarithm function.
Suppose you are training a neural network. One of the neurons has an input vector [12, 35, -80, -0.4, 21], weights [0.1, -0.6, 0.2, -0.01, 2], bias 0.45, and a ReLU activation function. Which value is the output of the ReLU activation function?
Which activation function maintains constant derivatives?
Which of these plots represents a ReLU activation function?
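The ReLU-neuron question in the list above can be checked with a short computation, a sketch of the standard weighted-sum-plus-bias neuron:

```python
# Inputs, weights, and bias from the question above.
x = [12, 35, -80, -0.4, 21]
w = [0.1, -0.6, 0.2, -0.01, 2]
b = 0.45

# Pre-activation: dot product of inputs and weights, plus the bias.
pre = sum(xi * wi for xi, wi in zip(x, w)) + b

# ReLU passes positive values through and clips negatives to zero.
out = max(0.0, pre)
```

Here the pre-activation works out to about 6.654, which is positive, so ReLU passes it through unchanged.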