Question
Single-answer multiple choice question
Entropy:
Options
A. can be negative
B. is used to calculate information gain
C. is a measure of information gain
D. is a measure of correlation between numeric variables
Correct answer
B. is used to calculate information gain
Analysis
First, let's break down what entropy means in information theory and how it relates to information gain.
Option 1: 'can be negative' — Entropy is a measure of uncertainty and is defined to be nonnegative: H(X) = -Σ p(x) log p(x) ≥ 0 for any discrete random variable X, since every term -p(x) log p(x) is nonnegative. Therefore, this statement is incorrect.
Option 2: 'is used to calculate information gain' — Information gain is defined as the reduction in entropy obtained by splitting on an attribute: IG(S, A) = H(S) - Σ_v (|S_v|/|S|) H(S_v). Entropy is the quantity from which information gain is calculated, so this statement is correct.
Option 3: 'is a measure of information gain' — Entropy measures uncertainty (impurity), not information gain itself; information gain is a difference of entropies. Incorrect.
Option 4: 'is a measure of correlation between numeric variables' — Correlation between numeric variables is measured by statistics such as the Pearson correlation coefficient, not by entropy. Incorrect.
Hence the correct choice is B: entropy is used to calculate information gain.
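A minimal Python sketch (the helper names entropy and information_gain are illustrative, not part of the original question) showing both points: entropy is never negative, and information gain is computed as the parent entropy minus the weighted entropy of the child partitions.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy in bits of a list of class labels; always >= 0.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, feature_values):
    # Information gain of splitting `labels` by the parallel list `feature_values`:
    # parent entropy minus the weighted average entropy of the child partitions.
    n = len(labels)
    children = {}
    for f, y in zip(feature_values, labels):
        children.setdefault(f, []).append(y)
    weighted_child_entropy = sum(len(ys) / n * entropy(ys) for ys in children.values())
    return entropy(labels) - weighted_child_entropy

# A perfectly informative feature: the split removes all uncertainty,
# so the information gain equals the parent entropy.
labels = ['A', 'B', 'A', 'B']
feature = [0, 1, 0, 1]
print(entropy(labels))                    # 1.0 bit
print(information_gain(labels, feature))  # 1.0 bit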
Similar questions
Consider the following dataset with features X, Y and Z:
X | Y | Z
Jack | 0 | A
Jack | 1 | B
Amy | 0 | A
Amy | 1 | B
Sam | 0 | B
Sam | 1 | B
Let H denote the entropy function. Which of the following is true?
Consider the following dataset with features X, Y and Z:
X | Y | Z
0 | 0 | 0
0 | 1 | 1
1 | 0 | 0
1 | 1 | 1
2 | 0 | 1
2 | 1 | 1
Compute the entropy of [math: \frac{XY}{Z+1}]
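As a quick sketch of how the entropy in the last similar question could be computed (assuming entropy in bits, i.e. a base-2 logarithm), evaluate XY/(Z+1) row by row and take the entropy of the resulting value distribution:

import math
from collections import Counter

# Rows (X, Y, Z) of the dataset in the last similar question above.
rows = [(0, 0, 0), (0, 1, 1), (1, 0, 0), (1, 1, 1), (2, 0, 1), (2, 1, 1)]
values = [x * y / (z + 1) for x, y, z in rows]  # [0.0, 0.0, 0.0, 0.5, 0.0, 1.0]

n = len(values)
h = -sum((c / n) * math.log2(c / n) for c in Counter(values).values())
print(h)  # ≈ 1.2516 bits, from p(0) = 4/6, p(0.5) = 1/6, p(1) = 1/6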