Question

BU.520.710.T1.SP25 Quiz 4

Single-answer multiple choice

The attention mechanism improves word embeddings by:

Options
A. Replacing traditional word embeddings like Word2Vec and GloVe with fixed vector representations
B. Increasing the number of parameters in a neural network without improving contextual understanding
C. Ignoring the sequential nature of text and treating words as independent entities
D. Dynamically adjusting word embeddings by weighting the relevance of surrounding words in context

Answer
D
Explanation
Examining attention mechanisms in word embeddings requires comparing how each option describes the role of attention in contextual representation. Option A claims that attention replaces fixed embeddings like Word2Vec and GloVe with other fixed vector representations. In reality, attention does not replace fixed embeddings; it operates on top of them, producing context-dependent representations. This also rules out Option B, since the added parameters exist precisely to improve contextual understanding, and Option C, since attention explicitly models relationships between words in a sequence rather than treating them independently. Option D is correct: attention dynamically adjusts each word's representation by weighting the relevance of its surrounding words.
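The key property behind answer D can be demonstrated with a minimal NumPy sketch of scaled dot-product self-attention (the function and variable names here are illustrative, not part of the quiz material): the same static embedding, placed in two different contexts, yields two different context-adjusted representations.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over embeddings X of shape (seq_len, d).

    Simplified sketch: queries, keys, and values are all X itself
    (no learned projection matrices).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # pairwise relevance of every word to every other
    weights = softmax(scores, axis=-1)   # each row is a distribution over the context
    return weights @ X                   # context-adjusted embeddings

rng = np.random.default_rng(0)
word = rng.normal(size=8)                          # one static embedding, reused in two contexts
ctx1 = np.vstack([rng.normal(size=(2, 8)), word])  # context 1: two other (random) words + `word`
ctx2 = np.vstack([rng.normal(size=(2, 8)), word])  # context 2: different surrounding words
out1 = self_attention(ctx1)[-1]                    # attention output for the same word...
out2 = self_attention(ctx2)[-1]                    # ...differs once the context differs
```

Even though `word` enters both sequences with an identical fixed vector, its attention output depends on the weighted relevance of its neighbors, which is exactly the dynamic adjustment Option D describes.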
