Question

11785/11685/11485 Quiz-09

Multiple-choice question

As explained in the lecture, in sequence-to-sequence learning, attention is a probability distribution over:

- Encoder's input values
- Encoder's hidden state values
- Decoder's hidden state values
- Decoder's input values
- Encoder's last state value
- None of the above

(Select all that apply.) Hint: Lec 18, Slides 20-62

Standard Answer
Encoder's hidden state values
Analysis
In sequence-to-sequence learning with attention, the model learns to align each step of the decoder with specific parts of the input sequence processed by the encoder. This alignment is represented as a probability distribution over the encoder-side representations, i.e., the encoder's hidden state values, which the decoder attends to during generation. Because attention weights are computed by applying a softmax to the alignment scores, they are non-negative and sum to one, which is exactly what makes them a probability distribution over the encoder's hidden states.
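For concreteness, here is a minimal sketch (not from the lecture; the tensor shapes and variable names are hypothetical) of basic dot-product attention in PyTorch, showing how a softmax over alignment scores produces a probability distribution over the encoder's hidden states:

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes: T encoder time steps, hidden dimension d.
T, d = 5, 8
encoder_states = torch.randn(T, d)  # encoder hidden states h_1 ... h_T
decoder_state = torch.randn(d)      # current decoder hidden state s_t

# Dot-product alignment scores: one score per encoder position.
scores = encoder_states @ decoder_state  # shape (T,)

# Softmax turns the scores into attention weights that are
# non-negative and sum to 1, i.e., a probability distribution
# over the encoder's hidden states.
attn_weights = F.softmax(scores, dim=0)  # shape (T,)

# The context vector is the weighted average of the encoder
# states under that distribution.
context = attn_weights @ encoder_states  # shape (d,)

print(attn_weights.sum())  # tensor(1.0000), up to floating-point error
```

In practice the scores are often computed with a learned function (e.g., scaled dot-product or a small MLP), but the softmax step is what makes attention a probability distribution regardless of the scoring function used.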
