Questions

2025FallB-X-CSE571-78760 Final Exam

Single choice

Suppose you are training a neural network. One of the neurons has input vector [12, 35, -80, -0.4, 21], weights [0.1, -0.6, 0.2, -0.01, 2], bias 0.45, and a ReLU activation function. Which value is the output of the ReLU activation function?

Options
A. -0.54
B. 0
C. 1
D. 6.654

Verified Answer
D. 6.654
Step-by-Step Analysis
First compute the neuron's linear combination, then apply the ReLU activation. The weighted sum multiplies each input by its corresponding weight and adds the results: 12 × 0.1 = 1.2; 35 × (-0.6) = -21; (-80) × 0.2 = -16; (-0.4) × (-0.01) = 0.004; 21 × 2 = 42. The sum is 1.2 - 21 - 16 + 0.004 + 42 = 6.204. Adding the bias gives 6.204 + 0.45 = 6.654. Since ReLU(z) = max(0, z) and 6.654 > 0, the output is 6.654, which is option D.
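
As a quick check, here is a minimal Python sketch of this single-neuron forward pass (plain Python, no particular framework; the variable names are illustrative):

```python
# Single neuron forward pass with a ReLU activation.
inputs = [12, 35, -80, -0.4, 21]
weights = [0.1, -0.6, 0.2, -0.01, 2]
bias = 0.45

# Weighted sum of inputs plus bias (the pre-activation value).
pre_activation = sum(x * w for x, w in zip(inputs, weights)) + bias

# ReLU keeps positive values and clips negatives to zero.
output = max(0.0, pre_activation)

print(round(output, 3))  # 6.654 -> option D
```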
