Questions
Learning AI Through Visualization: Module 4 Quiz
Single choice
Which of the following is an example of a commonly used activation function in deep learning?
Options
A. Linear function
B. Sigmoid function
C. Polynomial function
Verified Answer
B. Sigmoid function
Step-by-Step Analysis
Question restatement: Which of the following is an example of a commonly used activation function in deep learning?
Option A: Linear function. While a linear activation preserves linearity, stacking linear activations across layers still yields a linear model and cannot capture nonlinear patterns, limiting the network's expressive power. This makes linear activations unsuitable as the main activation in a deep network.
Option B: Sigmoid function. The sigmoid, σ(x) = 1 / (1 + e^(-x)), squashes any real input into the range (0, 1) and introduces the nonlinearity a deep network needs; it has long been one of the most commonly used activation functions, especially in output layers for binary classification. This is the correct answer.
Option C: Polynomial function. Polynomial activations are not standard in deep learning; they grow without bound, can destabilize gradients, and offer no practical advantage over established choices such as sigmoid, tanh, or ReLU.
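The point about stacked linear layers can be sketched numerically. The sketch below (hypothetical weights, not from the question) shows that two linear layers collapse into one linear map, while inserting a sigmoid breaks that collapse:

```python
import math

def sigmoid(x):
    # Sigmoid squashes any real input into (0, 1), introducing nonlinearity.
    return 1.0 / (1.0 + math.exp(-x))

def linear(x):
    # A linear "activation" leaves its input unchanged.
    return x

# Two stacked linear layers: w2*(w1*x + b1) + b2 is still linear in x.
w1, b1, w2, b2 = 2.0, 1.0, 3.0, -1.0
x = 0.5
two_linear = w2 * linear(w1 * x + b1) + b2      # 3*(2*0.5 + 1) - 1 = 5.0
one_linear = (w2 * w1) * x + (w2 * b1 + b2)     # equivalent single layer: 6*x + 2
print(two_linear, one_linear)                   # identical: the depth added nothing

# With a sigmoid between the layers, the composition is no longer linear.
nonlinear = w2 * sigmoid(w1 * x + b1) + b2
print(nonlinear)
```

Because the two-layer linear network equals a single linear layer for every input, depth without a nonlinear activation adds no expressive power.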
Similar Questions
Which of the following are commonly used activation functions for neural networks? I. Exponential function. II. Rectified Linear Unit (ReLU) function. III. Sigmoid function. IV. Logarithm function.
Suppose you are training a neural network. One of the neurons has an input vector [12, 35, -80, -0.4, 21], weights [0.1, -0.6, 0.2, -0.01, 2], bias 0.45, and a ReLU activation function. Which value is the output of the ReLU activation function?
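Using the numbers given in the question, the neuron's output can be checked directly: compute the weighted sum z = w · x + b, then apply ReLU.

```python
def relu(x):
    # ReLU returns x for positive inputs and 0 otherwise.
    return max(0.0, x)

inputs  = [12, 35, -80, -0.4, 21]
weights = [0.1, -0.6, 0.2, -0.01, 2]
bias = 0.45

# Weighted sum: 1.2 - 21 - 16 + 0.004 + 42 + 0.45 = 6.654
z = sum(w * x for w, x in zip(weights, inputs)) + bias
output = relu(z)
print(output)   # z is positive, so ReLU passes it through: approximately 6.654
```

Since z is positive, ReLU leaves it unchanged, so the answer is 6.654.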
Which activation function maintains a constant derivative?
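The activation with a constant derivative is the linear (identity) function, f(x) = x, whose slope is 1 everywhere. A minimal sketch using a central-difference estimate confirms this numerically:

```python
def identity(x):
    # Linear/identity activation: f(x) = x, so f'(x) = 1 for every x.
    return x

def numerical_derivative(f, x, h=1e-6):
    # Central-difference estimate of f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# The estimated slope is the same constant at every point,
# unlike sigmoid (slope varies) or ReLU (slope jumps at 0).
for x in (-10.0, 0.0, 3.5):
    print(numerical_derivative(identity, x))
```

By contrast, sigmoid's derivative peaks at 0.25 near the origin and vanishes in the tails, and ReLU's derivative switches between 0 and 1.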
Which of these plots represents a ReLU activation function?
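The correct plot is flat at 0 for all negative inputs and the line y = x for non-negative inputs, with a "hinge" at the origin. A few sample points make the shape explicit:

```python
def relu(x):
    # ReLU: 0 for negative inputs, x otherwise.
    return max(0.0, x)

# Negative inputs map to 0; non-negative inputs pass through unchanged,
# which is exactly the hinge shape seen in the ReLU plot.
samples = [(x, relu(x)) for x in (-2, -1, 0, 1, 2)]
print(samples)
```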