Question

Learning AI Through Visualization 4 Module 2 Quiz

True/False

The simplex method can outperform gradient descent when the loss function has many local minima.

Options
A. True
B. False

Analysis
The question poses a claim about when the simplex method can outperform gradient descent. Option 1: True. Proponents might argue that the simplex method (e.g., the Nelder-Mead algorithm) is derivative-free and can explore the search space without relying on gradient information. In landscapes with many local minima, a gradient-based method can get trapped in a local basin, especially if gradients vanish or point toward suboptimal directions.
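As a quick way to explore the trade-off described above, here is a minimal sketch (not part of the original quiz or its official solution) that compares plain gradient descent with SciPy's Nelder-Mead implementation on an assumed Rastrigin-style 1-D loss with many local minima. The loss function, learning rate, and starting point are illustrative choices, and the outcome depends heavily on them, so the printed result should not be read as the quiz's answer.

```python
# A minimal sketch (illustrative assumptions, not the quiz's official solution):
# compare plain gradient descent with SciPy's derivative-free Nelder-Mead
# simplex method on a 1-D loss that has many local minima.
import numpy as np
from scipy.optimize import minimize

def loss(x):
    # Rastrigin-style curve: global minimum at x = 0, local minima near integers.
    x = np.atleast_1d(x)[0]
    return x**2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x))

def grad(x):
    # Analytic derivative of the loss above.
    return 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)

def gradient_descent(x0, lr=0.002, steps=1000):
    # Fixed-step gradient descent; the small learning rate keeps updates stable.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x0 = 3.2  # assumed starting point, several basins away from the global minimum
x_gd = gradient_descent(x0)
res_nm = minimize(loss, x0=[x0], method="Nelder-Mead")

print(f"gradient descent: x = {x_gd:.3f}, loss = {loss(x_gd):.3f}")
print(f"Nelder-Mead:      x = {res_nm.x[0]:.3f}, loss = {res_nm.fun:.3f}")
```

From this particular starting point both optimizers may settle in a nearby local basin; varying x0, the learning rate, or Nelder-Mead's initial simplex shows how strongly the comparison depends on the landscape and the starting conditions.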
