Question

Fall 2025.FIN.5321.02 Final Exam

Single-answer multiple choice

Which innovation is at the core of the transformer architecture and enables modeling long-range dependencies effectively?

Options
A. Recurrent hidden states.
B. The attention mechanism.
C. Backpropagation.
D. Reinforcement learning.
Explanation

Standard answer
B. The attention mechanism.
Analysis
The question asks which innovation sits at the core of the transformer architecture and enables modeling long-range dependencies effectively. Option A: recurrent hidden states. While recurrent networks rely on hidden states to carry information across timesteps, transformers eschew recurrence entirely; recurrent hidden states also struggle with long-range dependencies, because information must survive passage through every intermediate step. Option B: the attention mechanism. Self-attention lets every position attend directly to every other position in the sequence, so dependencies of any distance are bridged in a single step; this is the defining innovation of the transformer. Option C: backpropagation is the general training algorithm for neural networks and long predates transformers, so it is not an innovation specific to the architecture. Option D: reinforcement learning is a training paradigm, not a component of the transformer architecture. The correct answer is B.
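The point above, that attention lets every position interact with every other in one step, can be sketched concretely. Below is a minimal NumPy implementation of scaled dot-product attention; the function name, shapes, and random inputs are illustrative assumptions, not part of the exam material.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Every query attends to every key, so any two sequence positions
    interact in a single step, regardless of how far apart they are."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over keys
    return weights @ V                                   # weighted sum of value vectors

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
X = rng.normal(size=(seq_len, d_model))
# Self-attention: queries, keys, and values all come from the same sequence.
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # one contextualized vector per input position
```

Contrast this with a recurrent network, where information from position 0 must pass through four hidden-state updates to reach position 4; here the two positions are connected directly by one attention weight.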
