Questions

Fall 2025.FIN.5321.02 Final Exam

Single choice

Which innovation is at the core of the transformer architecture and enables modeling long-range dependencies effectively?

Options
A. Recurrent hidden states.
B. The attention mechanism.
C. Backpropagation.
D. Reinforcement learning.

Verified Answer
B. The attention mechanism.
Step-by-Step Analysis
The question asks which innovation sits at the core of the transformer architecture and enables modeling long-range dependencies effectively.

Option A: Recurrent hidden states. While recurrent networks rely on hidden states to carry information across timesteps, transformers eschew recurrence entirely, so recurrent hidden states cannot be the core transformer innovation.

Option B: The attention mechanism. Self-attention lets every token attend directly to every other token in the sequence, so a long-range dependency is captured in a single step rather than propagated through a long chain of hidden states. This is the defining innovation of the transformer ("Attention Is All You Need", Vaswani et al., 2017) and the correct answer.

Option C: Backpropagation. Backpropagation is the standard training algorithm for virtually all neural networks and predates transformers; it is not specific to this architecture.

Option D: Reinforcement learning. Reinforcement learning is a separate training paradigm; it is not a component of the transformer architecture itself.

The correct answer is B.
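For intuition, here is a minimal sketch of scaled dot-product attention, the operation option B refers to, written in NumPy. The function name, the toy shapes, and the random inputs are illustrative assumptions for this explanation, not part of the exam material or any particular library's API.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: shape (seq_len, d_k). Each row is one token's query/key/value.
    d_k = Q.shape[-1]
    # Every query scores against every key at once, so the first token can
    # attend to the last token directly -- no recurrent chain to traverse.
    scores = Q @ K.T / np.sqrt(d_k)                 # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted mix of values

# Toy usage: three tokens with 4-dimensional representations.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)

The key contrast with option A is visible in the matrix product: the attention weights connect all token pairs in one operation, whereas a recurrent network would have to carry information stepwise through every intermediate hidden state.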
