Questions

11785/11685/11485 Quiz-09

Single choice

As explained in the lecture, in sequence-to-sequence learning, attention is a probability distribution over:

- Encoder's input values
- Encoder's hidden state values
- Decoder's hidden state values
- Decoder's input values
- Encoder's last state value
- None of the above

(Select all that apply)

Hint: Lec 18, Slide 20-62

Step-by-Step Analysis
In sequence-to-sequence learning with attention, the model learns to align each step of the decoder with specific parts of the input sequence processed by the encoder. This alignment is represented as a probability distribution over the encoder-side representations that the decoder attends to during generation. Because attention weights are produced by a softmax, they are non-negative and sum to one, forming a probability distribution over the encoder's hidden state values.
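
For illustration, here is a minimal sketch of this mechanism in NumPy. It is not taken from the lecture slides: the dot-product scoring function, array shapes, and variable names are assumptions chosen for the example, but it shows why the attention weights form a probability distribution over the encoder's hidden states.

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """Compute attention weights and a context vector.

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, d) encoder hidden states, one per input step
    """
    # Dot-product scores between the decoder state and each encoder state
    # (one of several possible scoring functions; assumed here for simplicity)
    scores = encoder_states @ decoder_state          # shape (T,)
    # Softmax turns the scores into a probability distribution
    # over the T encoder hidden states (non-negative, sums to 1)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                         # shape (T,)
    # Context vector: attention-weighted sum of encoder hidden states
    context = weights @ encoder_states               # shape (d,)
    return weights, context

# Example: 5 input steps, hidden size 4
rng = np.random.default_rng(0)
enc = rng.standard_normal((5, 4))
dec = rng.standard_normal(4)
w, ctx = attention(dec, enc)
print(w, w.sum())  # the weights sum to 1.0
```

Note that the distribution is over the encoder's hidden states, not over the raw input tokens or the decoder's states: the softmax is taken across the T encoder time steps.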
